CN112562033B - Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images - Google Patents

Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images

Info

Publication number: CN112562033B (application CN202011549518.2A)
Authority: CN (China)
Prior art keywords: data, pelvis, feature, conversion equation, individuals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number: CN202011549518.2A
Other languages: English (en)
Other versions: CN112562033A (zh)
Inventor
王飞翔
夏文涛
刘太昂
汪茂文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Academy Of Forensic Science
Original Assignee
Academy Of Forensic Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Academy Of Forensic Science
Priority to CN202011549518.2A
Publication of CN112562033A
Application granted
Publication of CN112562033B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/003: Reconstruction from projections, e.g. tomography
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques
    • G06F 18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411: Classification techniques relating to the classification model based on the proximity to a decision surface, e.g. support vector machines
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing


Abstract

The present invention discloses a conversion method for obtaining, from pelvic CT images, data for rapidly identifying injured individuals, comprising the following steps: 1) performing CT scans on a number of individuals; 2) collecting the pelvic CT images of these individuals; 3) using a convolutional neural network feature extractor to extract features from the pelvic CT images to obtain feature data, and using conversion equations to convert these feature data into converted comprehensive feature information data. The invention can extract features from pelvic CT images simply, quickly, and at low cost, and convert these features to obtain converted comprehensive feature information data, which are further used in a rapid identification model for individual injury.

Description

Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images
Technical Field
The present invention relates to the field of CT image recognition, and in particular to a conversion method for obtaining, from pelvic CT images, data for rapidly identifying injured individuals.
Background Art
At present, CT image interpretation relies mainly on professional CT radiologists. The correctness of manual interpretation depends to a large extent on the individual qualities of the reading physician, which are comprehensive: they include the physician's own professional skill level as well as the physician's mental state, so the interpretation results carry a degree of subjectivity. If the reading physician is in a poor mental state or emotionally distracted, the interpretation results may be affected and errors may occur; moreover, manual interpretation is time-consuming, labor-intensive, and costly. With the deepening of artificial intelligence research, and especially the rapid progress of deep learning in image recognition, more and more studies seek to use artificial intelligence to extract features from CT images and build rapid identification models.
Summary of the Invention
The object of the present invention is to provide a conversion method for obtaining, from pelvic CT images, data for rapidly identifying injured individuals, which can extract features from pelvic CT images simply, quickly, and at low cost, and convert these features to obtain converted comprehensive feature information data, which are further used in a rapid identification model for individual injury.
The technical solution adopted by the present invention to solve its technical problem is as follows:
A conversion method for obtaining, from pelvic CT images, data for rapidly identifying injured individuals, comprising the following steps:
1) performing CT scans on a number of individuals, including individuals with normal pelvises and individuals with injured pelvises; collecting the pelvic CT images of these individuals, with a reading physician labeling these pelvic CT images to mark normal and injured individuals;
2) collecting the pelvic CT images of these individuals;
3) using a convolutional neural network feature extractor to extract features from the pelvic CT images to obtain feature data, and using conversion equations to convert these feature data to obtain converted comprehensive feature information data.
The present invention can extract features from pelvic CT images simply, quickly, and at low cost, and convert these features to obtain converted comprehensive feature information data. These comprehensive feature information data serve as the data source fed into a rapid identification model for individual injury, and the model's output is used to determine whether an individual is injured. The comprehensive feature information data obtained by the present invention are only an intermediate result and cannot directly determine whether an individual is injured; only after they are further substituted as variables into the rapid identification model and further analyzed and computed by the model can a basis for judging whether an individual is injured be obtained. The core of the present invention is how to extract feature data from CT image information and then develop a conversion model that removes redundant information from the feature data, yielding further-processed data that serve as the data source for the final injury identification model; after further computation and transformation by that model, a basis for rapidly judging whether an individual is injured can be obtained.
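The overall flow described above (CNN feature extraction, linear conversion, then a downstream classifier) can be sketched as follows. This is a minimal illustration under stated assumptions: the feature extractor is a random stand-in for the patent's trained CNN (whose architecture is not specified), and the conversion weights here are zero placeholders, not the actual coefficients given below.

```python
import numpy as np

def extract_features(ct_image: np.ndarray) -> np.ndarray:
    """Stand-in for the trained CNN feature extractor: maps a CT image
    to a 29-dimensional feature vector (x1..x29)."""
    rng = np.random.default_rng(0)               # fixed weights, for illustration
    w = rng.standard_normal((29, ct_image.size))
    return w @ ct_image.ravel()

def convert(x: np.ndarray, W: np.ndarray, b: np.ndarray) -> np.ndarray:
    """The eight conversion equations in matrix form: Y = W @ x + b."""
    return W @ x + b

# Placeholder conversion coefficients with the shapes used in the patent:
# 29 input features mapped to 8 comprehensive feature values.
W = np.zeros((8, 29))
b = np.zeros(8)

image = np.ones((64, 64))        # dummy pelvic CT slice
x = extract_features(image)      # intermediate features x1..x29
Y = convert(x, W, b)             # comprehensive feature data Y1..Y8
print(x.shape, Y.shape)          # (29,) (8,)
```

In practice the 8-vector Y would then be fed to the injury classifier described later in the text.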
The feature data comprise 29 features, x1-x29; the comprehensive feature information data comprise 8 values, Y1-Y8.
Preferably, the conversion equation for Y1 is:
Y1=-0.002608[x1]-0.003528[x2]-0.01981[x3]+0.003867[x4]-0.03497[x5]-0.02528[x6]-0.005055[x7]-0.06280[x8]-0.001421[x9]-0.0009575[x10]-0.07293[x11]-0.05886[x12]-0.04936[x13]-0.06122[x14]-0.03372[x15]-0.07198[x16]-0.02127[x17]-0.05554[x18]-0.03507[x19]-0.07764[x20]-0.06150[x21]-0.04577[x22]-0.07286[x23]-0.03010[x24]+0.01372[x25]-0.07084[x26]-0.06171[x27]-0.07209[x28]-0.04400[x29]+15.087。
Preferably, the conversion equation for Y2 is:
Y2=+0.001791[x1]-0.03705[x2]+0.009386[x3]+0.09622[x4]+0.008747[x5]+0.01043[x6]+0.1009[x7]+0.01433[x8]+0.1572[x9]+0.02748[x10]+0.03883[x11]+0.03646[x12]-0.01670[x13]+0.01962[x14]+0.04708[x15]-0.06334[x16]+0.1111[x17]-0.03009[x18]+0.06032[x19]-0.05217[x20]-0.05729[x21]-0.006463[x22]-0.01770[x23]+0.01615[x24]+0.1990[x25]-0.02558[x26]-0.02540[x27]-0.04620[x28]-0.04119[x29]-4.735。
Preferably, the conversion equation for Y3 is:
Y3=-0.1135[x1]-0.2012[x2]-0.003212[x3]+0.1380[x4]+0.03829[x5]-0.1143[x6]-0.01008[x7]-0.07625[x8]+0.09322[x9]-0.02301[x10]+0.06214[x11]+0.02814[x12]+0.02732[x13]-0.04548[x14]+0.04238[x15]-0.01092[x16]-0.01883[x17]+0.02114[x18]+0.05014[x19]-0.1207[x20]-0.01414[x21]-0.03235[x22]-0.05944[x23]+0.02177[x24]+0.09775[x25]+0.0008945[x26]-0.01516[x27]+0.04469[x28]+0.01955[x29]+3.522。
Preferably, the conversion equation for Y4 is:
Y4=+0.1375[x1]-0.04963[x2]+0.04133[x3]+0.1679[x4]-5.722E-06[x5]-0.1303[x6]+0.06474[x7]-0.09825[x8]+0.1322[x9]-0.02689[x10]+0.01115[x11]+0.06599[x12]+0.08598[x13]-0.1722[x14]+0.03749[x15]+0.04015[x16]-0.09560[x17]+0.06715[x18]+0.03999[x19]-0.08566[x20]+0.009685[x21]-0.07393[x22]-0.03277[x23]+0.05434[x24]+0.05768[x25]+0.002512[x26]-0.06272[x27]+0.007202[x28]-0.05132[x29]-3.472。
Preferably, the conversion equation for Y5 is:
Y5=-0.004278[x1]-0.03228[x2]-0.03842[x3]-0.04562[x4]+0.002254[x5]+0.03222[x6]+0.07669[x7]-0.1307[x8]+0.08191[x9]-0.004142[x10]+0.06341[x11]+0.1195[x12]+0.03697[x13]-0.1907[x14]-0.07101[x15]-0.02194[x16]-0.07558[x17]+0.04869[x18]+0.006796[x19]-0.04322[x20]+0.06071[x21]+0.01823[x22]-0.03642[x23]+0.05275[x24]+0.1535[x25]-0.02061[x26]-0.009553[x27]-0.003576[x28]-0.03984[x29]+0.2459。
Preferably, the conversion equation for Y6 is:
Y6=+0.03842[x1]+0.05029[x2]-0.02583[x3]+0.02608[x4]+0.01359[x5]+0.08693[x6]+0.07072[x7]-0.1622[x8]-0.01445[x9]-0.06404[x10]+0.01896[x11]+0.1473[x12]+0.05971[x13]-0.04348[x14]-0.07814[x15]-0.05191[x16]+0.01407[x17]+0.05281[x18]-0.02281[x19]-0.1182[x20]-0.003314[x21]+0.05001[x22]+0.04439[x23]-0.07144[x24]-0.001820[x25]+0.06537[x26]-0.05198[x27]+0.02168[x28]-0.03421[x29]-0.5222。
Preferably, the conversion equation for Y7 is:
Y7=-0.02849[x1]+0.1337[x2]-0.03743[x3]+0.07537[x4]-0.01519[x5]-0.03177[x6]+0.08396[x7]-0.1775[x8]+0.004585[x9]+0.03242[x10]+0.02601[x11]-0.07063[x12]+0.08590[x13]+0.01693[x14]-0.02172[x15]+0.003717[x16]+0.07345[x17]+0.05540[x18]-0.06135[x19]-0.05733[x20]+0.001809[x21]+0.04623[x22]+0.07425[x23]-0.008544[x24]+0.03472[x25]-0.01514[x26]+0.07609[x27]-0.02744[x28]+0.04169[x29]-2.856。
Preferably, the conversion equation for Y8 is:
Y8=-0.04109[x1]+0.04229[x2]-0.03447[x3]-0.05996[x4]+0.004215[x5]-0.01111[x6]+0.04588[x7]-0.07993[x8]+0.1370[x9]-0.01799[x10]+0.01631[x11]-0.08259[x12]+0.06593[x13]+0.1146[x14]-0.09861[x15]+0.02502[x16]+0.1166[x17]+0.1020[x18]-0.02671[x19]-0.07814[x20]+0.03331[x21]-0.04498[x22]+0.05445[x23]-0.05827[x24]-0.04238[x25]-0.05418[x26]-0.05118[x27]-0.07210[x28]+0.04190[x29]+1.110。
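Each of the eight equations above is an affine function of the 29 features. As a minimal sketch (NumPy assumed), the snippet below transcribes the Y1 coefficients and intercept from the equation above and evaluates it:

```python
import numpy as np

# Coefficients of the Y1 conversion equation (order x1 .. x29) and the
# intercept 15.087, transcribed from the equation above.
w1 = np.array([
    -0.002608, -0.003528, -0.01981,   0.003867, -0.03497,
    -0.02528,  -0.005055, -0.06280,  -0.001421, -0.0009575,
    -0.07293,  -0.05886,  -0.04936,  -0.06122,  -0.03372,
    -0.07198,  -0.02127,  -0.05554,  -0.03507,  -0.07764,
    -0.06150,  -0.04577,  -0.07286,  -0.03010,   0.01372,
    -0.07084,  -0.06171,  -0.07209,  -0.04400,
])
b1 = 15.087

def convert_y1(x: np.ndarray) -> float:
    """Evaluate Y1 = w1 . x + b1 for a 29-element feature vector."""
    assert x.shape == (29,), "expects the 29 features x1..x29"
    return float(w1 @ x + b1)

# With all features at zero, Y1 reduces to the intercept.
print(convert_y1(np.zeros(29)))   # 15.087
```

Y2 through Y8 follow the same pattern, so in practice all eight coefficient rows would be stacked into an 8x29 matrix and applied in a single matrix-vector product.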
The beneficial effects of the present invention are:
1. Simple and rapid data collection: scanning an individual with a CT machine to obtain pelvic CT image data takes about 10 minutes, while feature extraction and data conversion of the pelvic CT image data take only about 2 seconds; the process is convenient and fast, and can be completed by a single person.
2. Low cost: the converted data based on pelvic CT images serve as an intermediate result and are fed as the data source into the rapid identification model for individual injury, enabling rapid identification of injured individuals; compared with manual interpretation by physicians, this requires fewer people and less time, reducing cost.
Brief Description of the Drawings
Figure 1 is a pelvic CT image of a normal individual.
Figure 2 is a pelvic CT image of an injured individual.
Detailed Description
The technical solution of the present invention is further described below by means of specific embodiments.
In the present invention, unless otherwise specified, the raw materials and equipment used are commercially available or commonly used in the art. Unless otherwise specified, the methods in the following embodiments are conventional methods in the art.
Embodiment:
A conversion method for obtaining, from pelvic CT images, data for rapidly identifying injured individuals, comprising the following steps:
1) performing CT scans on a number of individuals;
2) collecting the pelvic CT images of these individuals;
3) using a convolutional neural network feature extractor to extract features from the pelvic CT images to obtain feature data, and using conversion equations to convert these feature data to obtain converted comprehensive feature information data; the feature data comprise 29 features, x1-x29; the comprehensive feature information data comprise 8 values, Y1-Y8.
The conversion equation for Y1 is:
Y1=-0.002608[x1]-0.003528[x2]-0.01981[x3]+0.003867[x4]-0.03497[x5]-0.02528[x6]-0.005055[x7]-0.06280[x8]-0.001421[x9]-0.0009575[x10]-0.07293[x11]-0.05886[x12]-0.04936[x13]-0.06122[x14]-0.03372[x15]-0.07198[x16]-0.02127[x17]-0.05554[x18]-0.03507[x19]-0.07764[x20]-0.06150[x21]-0.04577[x22]-0.07286[x23]-0.03010[x24]+0.01372[x25]-0.07084[x26]-0.06171[x27]-0.07209[x28]-0.04400[x29]+15.087。
The conversion equation for Y2 is:
Y2=+0.001791[x1]-0.03705[x2]+0.009386[x3]+0.09622[x4]+0.008747[x5]+0.01043[x6]+0.1009[x7]+0.01433[x8]+0.1572[x9]+0.02748[x10]+0.03883[x11]+0.03646[x12]-0.01670[x13]+0.01962[x14]+0.04708[x15]-0.06334[x16]+0.1111[x17]-0.03009[x18]+0.06032[x19]-0.05217[x20]-0.05729[x21]-0.006463[x22]-0.01770[x23]+0.01615[x24]+0.1990[x25]-0.02558[x26]-0.02540[x27]-0.04620[x28]-0.04119[x29]-4.735。
The conversion equation for Y3 is:
Y3=-0.1135[x1]-0.2012[x2]-0.003212[x3]+0.1380[x4]+0.03829[x5]-0.1143[x6]-0.01008[x7]-0.07625[x8]+0.09322[x9]-0.02301[x10]+0.06214[x11]+0.02814[x12]+0.02732[x13]-0.04548[x14]+0.04238[x15]-0.01092[x16]-0.01883[x17]+0.02114[x18]+0.05014[x19]-0.1207[x20]-0.01414[x21]-0.03235[x22]-0.05944[x23]+0.02177[x24]+0.09775[x25]+0.0008945[x26]-0.01516[x27]+0.04469[x28]+0.01955[x29]+3.522。
The conversion equation for Y4 is:
Y4=+0.1375[x1]-0.04963[x2]+0.04133[x3]+0.1679[x4]-5.722E-06[x5]-0.1303[x6]+0.06474[x7]-0.09825[x8]+0.1322[x9]-0.02689[x10]+0.01115[x11]+0.06599[x12]+0.08598[x13]-0.1722[x14]+0.03749[x15]+0.04015[x16]-0.09560[x17]+0.06715[x18]+0.03999[x19]-0.08566[x20]+0.009685[x21]-0.07393[x22]-0.03277[x23]+0.05434[x24]+0.05768[x25]+0.002512[x26]-0.06272[x27]+0.007202[x28]-0.05132[x29]-3.472。
The conversion equation for Y5 is:
Y5=-0.004278[x1]-0.03228[x2]-0.03842[x3]-0.04562[x4]+0.002254[x5]+0.03222[x6]+0.07669[x7]-0.1307[x8]+0.08191[x9]-0.004142[x10]+0.06341[x11]+0.1195[x12]+0.03697[x13]-0.1907[x14]-0.07101[x15]-0.02194[x16]-0.07558[x17]+0.04869[x18]+0.006796[x19]-0.04322[x20]+0.06071[x21]+0.01823[x22]-0.03642[x23]+0.05275[x24]+0.1535[x25]-0.02061[x26]-0.009553[x27]-0.003576[x28]-0.03984[x29]+0.2459。
The conversion equation for Y6 is:
Y6=+0.03842[x1]+0.05029[x2]-0.02583[x3]+0.02608[x4]+0.01359[x5]+0.08693[x6]+0.07072[x7]-0.1622[x8]-0.01445[x9]-0.06404[x10]+0.01896[x11]+0.1473[x12]+0.05971[x13]-0.04348[x14]-0.07814[x15]-0.05191[x16]+0.01407[x17]+0.05281[x18]-0.02281[x19]-0.1182[x20]-0.003314[x21]+0.05001[x22]+0.04439[x23]-0.07144[x24]-0.001820[x25]+0.06537[x26]-0.05198[x27]+0.02168[x28]-0.03421[x29]-0.5222。
The conversion equation for Y7 is:
Y7=-0.02849[x1]+0.1337[x2]-0.03743[x3]+0.07537[x4]-0.01519[x5]-0.03177[x6]+0.08396[x7]-0.1775[x8]+0.004585[x9]+0.03242[x10]+0.02601[x11]-0.07063[x12]+0.08590[x13]+0.01693[x14]-0.02172[x15]+0.003717[x16]+0.07345[x17]+0.05540[x18]-0.06135[x19]-0.05733[x20]+0.001809[x21]+0.04623[x22]+0.07425[x23]-0.008544[x24]+0.03472[x25]-0.01514[x26]+0.07609[x27]-0.02744[x28]+0.04169[x29]-2.856。
The conversion equation for Y8 is:
Y8=-0.04109[x1]+0.04229[x2]-0.03447[x3]-0.05996[x4]+0.004215[x5]-0.01111[x6]+0.04588[x7]-0.07993[x8]+0.1370[x9]-0.01799[x10]+0.01631[x11]-0.08259[x12]+0.06593[x13]+0.1146[x14]-0.09861[x15]+0.02502[x16]+0.1166[x17]+0.1020[x18]-0.02671[x19]-0.07814[x20]+0.03331[x21]-0.04498[x22]+0.05445[x23]-0.05827[x24]-0.04238[x25]-0.05418[x26]-0.05118[x27]-0.07210[x28]+0.04190[x29]+1.110。
Specific example:
A method for rapidly identifying injured individuals based on the converted pelvic-CT-image data of the present invention comprises the following steps:
(1) CT scans were performed on 116 individuals to obtain 116 pelvic CT images, as shown in Figures 1 and 2. A reading physician labeled these 116 pelvic CT images, marking normal and injured individuals;
(2) a convolutional neural network feature extractor was used to extract features from the pelvic CT images to obtain feature data; the results are shown in Table 1;
Table 1. Feature data
Feature parameter  Normal  Normal  Normal  Normal  Injured  Injured  Injured  Injured
x1 26.71 32.19 22.84 22.72 23.31 23.31 23.96 23.31
x2 21.07 15.46 17.28 16.26 9.87 9.87 10.12 9.87
x3 28.59 24.69 7.60 7.70 10.06 7.71 18.76 7.75
x4 20.21 12.62 9.56 6.83 11.75 4.08 11.04 5.85
x5 19.30 15.29 9.43 11.20 13.63 14.30 5.06 12.13
x6 14.15 8.15 9.56 10.98 11.75 10.14 7.93 10.40
x7 16.75 12.43 6.93 11.55 11.66 7.22 4.57 9.08
x8 17.48 13.35 7.89 11.67 9.88 8.63 7.22 8.86
x9 14.91 13.35 10.06 8.48 12.93 15.08 10.70 11.62
x10 13.11 10.87 14.69 11.06 12.69 8.15 7.05 10.59
x11 14.85 12.92 12.90 11.04 5.87 12.41 6.51 10.31
x12 17.18 15.57 12.94 16.66 10.49 12.03 6.10 9.47
x13 13.66 15.30 13.84 12.32 9.47 8.43 7.37 7.78
x14 14.00 10.52 11.92 8.55 8.92 8.39 4.79 9.22
x15 15.02 9.94 15.01 10.06 8.44 7.32 5.28 12.39
x16 20.31 15.80 9.21 13.18 10.44 10.76 6.07 7.90
x17 12.81 7.95 8.83 10.87 8.85 15.39 9.65 12.36
x18 26.28 14.77 8.35 13.97 9.68 10.55 10.35 7.42
x19 14.76 17.19 10.85 10.07 12.47 11.05 7.07 13.02
x20 19.89 14.47 11.66 14.55 11.30 7.61 4.21 12.10
x21 23.63 18.78 7.41 18.06 8.89 10.92 8.80 9.95
x22 12.86 13.52 8.76 14.19 7.46 7.78 4.73 13.26
x23 18.32 15.74 15.70 12.11 8.53 10.98 3.42 15.14
x24 12.69 12.84 19.73 10.56 11.17 10.71 6.71 18.17
x25 12.54 6.19 6.89 9.25 8.15 12.45 4.01 13.15
x26 23.81 20.92 10.38 18.16 10.48 13.07 4.48 11.00
x27 17.49 9.25 18.71 11.11 9.54 10.82 4.93 7.39
x28 20.23 13.88 14.27 9.39 11.42 12.37 6.27 9.15
x29 19.15 10.68 11.54 7.82 9.28 10.06 6.07 11.15
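The patent does not specify the architecture of the convolutional neural network feature extractor used in step (2). Purely as an illustration of how a convolutional extractor turns an image into a fixed-length vector like x1-x29, here is a toy single-layer sketch (kernels are random, not trained; this is not the patent's actual extractor):

```python
import numpy as np

def conv2d_valid(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive 'valid' 2-D cross-correlation, enough for illustration."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def toy_extractor(img: np.ndarray, kernels: np.ndarray) -> np.ndarray:
    """One convolution layer + ReLU + global average pooling:
    each kernel contributes one scalar feature."""
    feats = []
    for k in kernels:
        fmap = np.maximum(conv2d_valid(img, k), 0.0)  # ReLU
        feats.append(fmap.mean())                     # global average pool
    return np.array(feats)

rng = np.random.default_rng(42)
kernels = rng.standard_normal((29, 3, 3))  # 29 kernels -> 29 features x1..x29
img = rng.standard_normal((32, 32))        # dummy CT slice
x = toy_extractor(img, kernels)
print(x.shape)   # (29,)
```

A real implementation would use a trained deep network and full-resolution CT slices; the point here is only the image-to-vector mapping.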
(3) Since the 29 extracted features contain excessive redundant information, these 29 feature data were substituted into the following mapping conversion equations, yielding the mapped and converted data shown in Table 2.
Y1=-0.002608[x1]-0.003528[x2]-0.01981[x3]+0.003867[x4]-0.03497[x5]-0.02528[x6]-0.005055[x7]-0.06280[x8]-0.001421[x9]-0.0009575[x10]-0.07293[x11]-0.05886[x12]-0.04936[x13]-0.06122[x14]-0.03372[x15]-0.07198[x16]-0.02127[x17]-0.05554[x18]-0.03507[x19]-0.07764[x20]-0.06150[x21]-0.04577[x22]-0.07286[x23]-0.03010[x24]+0.01372[x25]-0.07084[x26]-0.06171[x27]-0.07209[x28]-0.04400[x29]+15.087。
Y2=+0.001791[x1]-0.03705[x2]+0.009386[x3]+0.09622[x4]+0.008747[x5]+0.01043[x6]+0.1009[x7]+0.01433[x8]+0.1572[x9]+0.02748[x10]+0.03883[x11]+0.03646[x12]-0.01670[x13]+0.01962[x14]+0.04708[x15]-0.06334[x16]+0.1111[x17]-0.03009[x18]+0.06032[x19]-0.05217[x20]-0.05729[x21]-0.006463[x22]-0.01770[x23]+0.01615[x24]+0.1990[x25]-0.02558[x26]-0.02540[x27]-0.04620[x28]-0.04119[x29]-4.735。
Y3=-0.1135[x1]-0.2012[x2]-0.003212[x3]+0.1380[x4]+0.03829[x5]-0.1143[x6]-0.01008[x7]-0.07625[x8]+0.09322[x9]-0.02301[x10]+0.06214[x11]+0.02814[x12]+0.02732[x13]-0.04548[x14]+0.04238[x15]-0.01092[x16]-0.01883[x17]+0.02114[x18]+0.05014[x19]-0.1207[x20]-0.01414[x21]-0.03235[x22]-0.05944[x23]+0.02177[x24]+0.09775[x25]+0.0008945[x26]-0.01516[x27]+0.04469[x28]+0.01955[x29]+3.522。
Y4=+0.1375[x1]-0.04963[x2]+0.04133[x3]+0.1679[x4]-5.722E-06[x5]-0.1303[x6]+0.06474[x7]-0.09825[x8]+0.1322[x9]-0.02689[x10]+0.01115[x11]+0.06599[x12]+0.08598[x13]-0.1722[x14]+0.03749[x15]+0.04015[x16]-0.09560[x17]+0.06715[x18]+0.03999[x19]-0.08566[x20]+0.009685[x21]-0.07393[x22]-0.03277[x23]+0.05434[x24]+0.05768[x25]+0.002512[x26]-0.06272[x27]+0.007202[x28]-0.05132[x29]-3.472。
Y5=-0.004278[x1]-0.03228[x2]-0.03842[x3]-0.04562[x4]+0.002254[x5]+0.03222[x6]+0.07669[x7]-0.1307[x8]+0.08191[x9]-0.004142[x10]+0.06341[x11]+0.1195[x12]+0.03697[x13]-0.1907[x14]-0.07101[x15]-0.02194[x16]-0.07558[x17]+0.04869[x18]+0.006796[x19]-0.04322[x20]+0.06071[x21]+0.01823[x22]-0.03642[x23]+0.05275[x24]+0.1535[x25]-0.02061[x26]-0.009553[x27]-0.003576[x28]-0.03984[x29]+0.2459。
Y6=+0.03842[x1]+0.05029[x2]-0.02583[x3]+0.02608[x4]+0.01359[x5]+0.08693[x6]+0.07072[x7]-0.1622[x8]-0.01445[x9]-0.06404[x10]+0.01896[x11]+0.1473[x12]+0.05971[x13]-0.04348[x14]-0.07814[x15]-0.05191[x16]+0.01407[x17]+0.05281[x18]-0.02281[x19]-0.1182[x20]-0.003314[x21]+0.05001[x22]+0.04439[x23]-0.07144[x24]-0.001820[x25]+0.06537[x26]-0.05198[x27]+0.02168[x28]-0.03421[x29]-0.5222。
Y7=-0.02849[x1]+0.1337[x2]-0.03743[x3]+0.07537[x4]-0.01519[x5]-0.03177[x6]+0.08396[x7]-0.1775[x8]+0.004585[x9]+0.03242[x10]+0.02601[x11]-0.07063[x12]+0.08590[x13]+0.01693[x14]-0.02172[x15]+0.003717[x16]+0.07345[x17]+0.05540[x18]-0.06135[x19]-0.05733[x20]+0.001809[x21]+0.04623[x22]+0.07425[x23]-0.008544[x24]+0.03472[x25]-0.01514[x26]+0.07609[x27]-0.02744[x28]+0.04169[x29]-2.856。
Y8=-0.04109[x1]+0.04229[x2]-0.03447[x3]-0.05996[x4]+0.004215[x5]-0.01111[x6]+0.04588[x7]-0.07993[x8]+0.1370[x9]-0.01799[x10]+0.01631[x11]-0.08259[x12]+0.06593[x13]+0.1146[x14]-0.09861[x15]+0.02502[x16]+0.1166[x17]+0.1020[x18]-0.02671[x19]-0.07814[x20]+0.03331[x21]-0.04498[x22]+0.05445[x23]-0.05827[x24]-0.04238[x25]-0.05418[x26]-0.05118[x27]-0.07210[x28]+0.04190[x29]+1.110。
Table 2. Converted data
Converted parameter  Normal  Normal  Normal  Normal  Injured  Injured  Injured  Injured
Y1 -5.83 -1.35 1.57 0.84 3.92 3.13 7.83 3.16
Y2 1.02 -0.39 -0.43 -0.95 1.00 1.54 -0.45 1.77
Y3 -1.44 -1.41 -1.00 -2.36 0.48 1.01 0.74 0.01
Y4 1.95 3.34 -0.95 -1.07 0.72 -0.34 1.80 -1.01
Y5 -0.93 -0.12 -0.67 1.28 0.26 1.67 -0.19 0.83
Y6 0.69 1.32 -0.58 1.81 0.02 1.13 0.40 -0.50
Y7 1.93 -0.61 1.81 0.40 -0.75 -0.25 -1.08 -0.30
Y8 -0.52 -0.95 -1.14 -0.39 -0.22 1.55 1.09 -0.39
(4) With the converted comprehensive feature information data as independent variables and individual normal/injured status as the target variable, a rapid identification model for individual injury was built using a support vector machine algorithm. In the SVM modeling, a radial basis function (RBF) kernel was selected, with the penalty factor set to 30. The modeling accuracy was 98.28%.
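Step (4) can be reproduced with any standard SVM implementation. Below is a minimal sketch using scikit-learn (assumed available), with an RBF kernel and penalty factor C = 30 as stated above; the toy synthetic data stand in for the real Table 2 values:

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-in data: 8 converted features (Y1..Y8) per individual;
# label 0 = normal, 1 = injured.  Real training would use the Table 2 data.
rng = np.random.default_rng(0)
Y_normal  = rng.normal(loc=-1.0, scale=0.5, size=(58, 8))
Y_injured = rng.normal(loc=+1.0, scale=0.5, size=(58, 8))
X = np.vstack([Y_normal, Y_injured])
y = np.array([0] * 58 + [1] * 58)

# RBF kernel with penalty factor C = 30, as described in the text.
model = SVC(kernel="rbf", C=30.0)
model.fit(X, y)
print(model.score(X, y))   # training accuracy on the toy data
```

With real data, the 98.28% figure quoted above would correspond to this training-set score, while the 86.67% figure in step (5) is the score on held-out individuals.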
(5) Based on the SVM discrimination model for whether an individual is injured and the converted data of the individuals to be tested, whether those individuals are injured can be rapidly predicted. The prediction accuracy was 86.67%, as shown in Table 3.
Table 3. Prediction results
Modeling results of the SVM prediction model for whether the 116 individuals are injured:
First, CT scans were performed on a number of individuals, including those with normal and with injured pelvises. Their pelvic CT images were collected, and a reading physician labeled them as normal or injured. Next, a convolutional neural network feature extractor extracted features from the pelvic CT images to obtain feature data, which were converted with the conversion equations into converted comprehensive feature information data. Finally, with the converted comprehensive feature information data as independent variables and individual normal/injured status as the target variable, a rapid identification model for individual injury was built with a support vector machine algorithm. The modeling accuracy was 98.28%.
The prediction results for the 30 individuals to be tested are shown in Table 3. New individuals were CT-scanned to obtain pelvic CT images to be tested; the convolutional neural network feature extractor extracted feature data from these images, the conversion equations converted them, and the converted data were then substituted into the SVM discrimination model to determine whether each individual is injured. The prediction accuracy was 86.67%, as shown in Table 3.
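Assuming the reported 86.67% is the plain fraction of correct predictions among the 30 test individuals, it corresponds to 26 correct out of 30:

```python
# 26 of 30 held-out individuals classified correctly gives the reported figure.
correct, total = 26, 30
accuracy = correct / total
print(f"{accuracy:.2%}")   # 86.67%
```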
The embodiment described above is only a preferred solution of the present invention and does not limit the present invention in any form; other variants and modifications are possible without departing from the technical solution recorded in the claims.

Claims (1)

1. A conversion method for obtaining, from pelvic CT images, data for rapidly identifying injured individuals, characterized in that it comprises the following steps:
1) performing CT scans on a number of individuals;
2) collecting the pelvic CT images of these individuals;
3) using a convolutional neural network feature extractor to extract features from the pelvic CT images to obtain feature data, and using conversion equations to convert these feature data, removing redundant information from the feature data, to obtain converted comprehensive feature information data;
the feature data comprise 29 features, x1-x29; the comprehensive feature information data comprise 8 values, Y1-Y8;
The conversion equation for Y1 is:
Y1=-0.002608[x1]-0.003528[x2]-0.01981[x3]+0.003867[x4]-0.03497[x5]-0.02528[x6]-0.005055[x7]-0.06280[x8]-0.001421[x9]-0.0009575[x10]-0.07293[x11]-0.05886[x12]-0.04936[x13]-0.06122[x14]-0.03372[x15]-0.07198[x16]-0.02127[x17]-0.05554[x18]-0.03507[x19]-0.07764[x20]-0.06150[x21]-0.04577[x22]-0.07286[x23]-0.03010[x24]+0.01372[x25]-0.07084[x26]-0.06171[x27]-0.07209[x28]-0.04400[x29]+15.087;
The conversion equation for Y2 is:
Y2=+0.001791[x1]-0.03705[x2]+0.009386[x3]+0.09622[x4]+0.008747[x5]+0.01043[x6]+0.1009[x7]+0.01433[x8]+0.1572[x9]+0.02748[x10]+0.03883[x11]+0.03646[x12]-0.01670[x13]+0.01962[x14]+0.04708[x15]-0.06334[x16]+0.1111[x17]-0.03009[x18]+0.06032[x19]-0.05217[x20]-0.05729[x21]-0.006463[x22]-0.01770[x23]+0.01615[x24]+0.1990[x25]-0.02558[x26]-0.02540[x27]-0.04620[x28]-0.04119[x29]-4.735;
The conversion equation for Y3 is:
Y3=-0.1135[x1]-0.2012[x2]-0.003212[x3]+0.1380[x4]+0.03829[x5]-0.1143[x6]-0.01008[x7]-0.07625[x8]+0.09322[x9]-0.02301[x10]+0.06214[x11]+0.02814[x12]+0.02732[x13]-0.04548[x14]+0.04238[x15]-0.01092[x16]-0.01883[x17]+0.02114[x18]+0.05014[x19]-0.1207[x20]-0.01414[x21]-0.03235[x22]-0.05944[x23]+0.02177[x24]+0.09775[x25]+0.0008945[x26]-0.01516[x27]+0.04469[x28]+0.01955[x29]+3.522;
The conversion equation for Y4 is:
Y4=+0.1375[x1]-0.04963[x2]+0.04133[x3]+0.1679[x4]-5.722E-06[x5]-0.1303[x6]+0.06474[x7]-0.09825[x8]+0.1322[x9]-0.02689[x10]+0.01115[x11]+0.06599[x12]+0.08598[x13]-0.1722[x14]+0.03749[x15]+0.04015[x16]-0.09560[x17]+0.06715[x18]+0.03999[x19]-0.08566[x20]+0.009685[x21]-0.07393[x22]-0.03277[x23]+0.05434[x24]+0.05768[x25]+0.002512[x26]-0.06272[x27]+0.007202[x28]-0.05132[x29]-3.472;
The conversion equation for Y5 is:
Y5=-0.004278[x1]-0.03228[x2]-0.03842[x3]-0.04562[x4]+0.002254[x5]+0.03222[x6]+0.07669[x7]-0.1307[x8]+0.08191[x9]-0.004142[x10]+0.06341[x11]+0.1195[x12]+0.03697[x13]-0.1907[x14]-0.07101[x15]-0.02194[x16]-0.07558[x17]+0.04869[x18]+0.006796[x19]-0.04322[x20]+0.06071[x21]+0.01823[x22]-0.03642[x23]+0.05275[x24]+0.1535[x25]-0.02061[x26]-0.009553[x27]-0.003576[x28]-0.03984[x29]+0.2459;
The conversion equation for Y6 is:
Y6=+0.03842[x1]+0.05029[x2]-0.02583[x3]+0.02608[x4]+0.01359[x5]+0.08693[x6]+0.07072[x7]-0.1622[x8]-0.01445[x9]-0.06404[x10]+0.01896[x11]+0.1473[x12]+0.05971[x13]-0.04348[x14]-0.07814[x15]-0.05191[x16]+0.01407[x17]+0.05281[x18]-0.02281[x19]-0.1182[x20]-0.003314[x21]+0.05001[x22]+0.04439[x23]-0.07144[x24]-0.001820[x25]+0.06537[x26]-0.05198[x27]+0.02168[x28]-0.03421[x29]-0.5222;
The conversion equation for Y7 is:
Y7=-0.02849[x1]+0.1337[x2]-0.03743[x3]+0.07537[x4]-0.01519[x5]-0.03177[x6]+0.08396[x7]-0.1775[x8]+0.004585[x9]+0.03242[x10]+0.02601[x11]-0.07063[x12]+0.08590[x13]+0.01693[x14]-0.02172[x15]+0.003717[x16]+0.07345[x17]+0.05540[x18]-0.06135[x19]-0.05733[x20]+0.001809[x21]+0.04623[x22]+0.07425[x23]-0.008544[x24]+0.03472[x25]-0.01514[x26]+0.07609[x27]-0.02744[x28]+0.04169[x29]-2.856;
The conversion equation for Y8 is:
Y8=-0.04109[x1]+0.04229[x2]-0.03447[x3]-0.05996[x4]+0.004215[x5]-0.01111[x6]+0.04588[x7]-0.07993[x8]+0.1370[x9]-0.01799[x10]+0.01631[x11]-0.08259[x12]+0.06593[x13]+0.1146[x14]-0.09861[x15]+0.02502[x16]+0.1166[x17]+0.1020[x18]-0.02671[x19]-0.07814[x20]+0.03331[x21]-0.04498[x22]+0.05445[x23]-0.05827[x24]-0.04238[x25]-0.05418[x26]-0.05118[x27]-0.07210[x28]+0.04190[x29]+1.110。
CN202011549518.2A 2020-12-24 2020-12-24 Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images Active CN112562033B (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011549518.2A CN112562033B (zh) 2020-12-24 2020-12-24 Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011549518.2A CN112562033B (zh) 2020-12-24 2020-12-24 Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images

Publications (2)

Publication Number Publication Date
CN112562033A CN112562033A (zh) 2021-03-26
CN112562033B (zh) 2024-01-19

Family

ID=75033309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011549518.2A Active CN112562033B (zh) 2020-12-24 2020-12-24 Conversion method for obtaining data for rapid identification of injured individuals from pelvic CT images

Country Status (1)

Country Link
CN (1) CN112562033B (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107849599A (zh) * 2015-06-30 2018-03-27 优比欧迈公司 Method and system for diagnostic testing
CN109410337A (zh) * 2017-08-15 2019-03-01 北京蜂鸟互动科技有限公司 Implementation method and system for a VR-model-based artificial-intelligence medical system
CN110070527A (zh) * 2019-04-18 2019-07-30 成都雷熵科技有限公司 Lesion detection method based on a regional fully connected neural network
CN110907066A (zh) * 2019-11-30 2020-03-24 华能如东八仙角海上风力发电有限责任公司 Deep-learning-based temperature condition monitoring method for wind turbine gearbox bearings
CN111145289A (zh) * 2019-12-30 2020-05-12 北京爱康宜诚医疗器材有限公司 Method and device for extracting three-dimensional pelvic data
WO2020172558A1 (en) * 2019-02-21 2020-08-27 The Trustees Of Dartmouth College System and method for automatic detection of vertebral fractures on imaging scans using deep networks

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10420515B2 (en) * 2015-06-15 2019-09-24 Vital Labs, Inc. Method and system for acquiring data for assessment of cardiovascular disease
US20200193552A1 (en) * 2018-12-18 2020-06-18 Slyce Acquisition Inc. Sparse learning for computer vision


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Convolutional neural network model for CT image recognition; Guan Shu et al.; Journal of Computer-Aided Design & Computer Graphics; 2018-08-31; Vol. 30, No. 08; sections 1-4, pp. 1530-1535 *

Also Published As

Publication number Publication date
CN112562033A (zh) 2021-03-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant