WO2020054738A1 - Estimation device, estimation system, and estimation program - Google Patents
Estimation device, estimation system, and estimation program
- Publication number
- WO2020054738A1 (international application PCT/JP2019/035594)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- estimation
- input
- image
- bone
- learning
- Prior art date
Classifications
- G06T 7/0012 — Biomedical image inspection
- A61B 6/505 — Clinical applications involving diagnosis of bone
- A61B 6/5211 — Devices using data or image processing specially adapted for radiation diagnosis, involving processing of medical diagnostic data
- G06N 3/0464 — Convolutional networks [CNN, ConvNet]
- G06N 3/08 — Learning methods (neural networks)
- G06T 7/11 — Region-based segmentation
- G16H 50/20 — ICT specially adapted for computer-aided medical diagnosis, e.g. based on medical expert systems
- A61B 6/482 — Diagnostic techniques involving multiple energy imaging
- G06T 2207/10116 — X-ray image (image acquisition modality)
- G06T 2207/20081 — Training; Learning
- G06T 2207/20084 — Artificial neural networks [ANN]
- G06T 2207/30008 — Bone (biomedical image processing)
- G06T 2207/30096 — Tumor; Lesion
Definitions
- This disclosure relates to estimating bone density.
- Patent Document 1 describes a technique for determining osteoporosis.
- Patent Document 2 discloses a technique for estimating bone strength.
- The estimation apparatus includes: an input unit to which input information including an image of a bone is input; an approximator capable of estimating, based on the input information input to the input unit, an estimation result related to the bone density of the bone; and an output unit that outputs the estimation result estimated by the approximator. The approximator obtains the estimation result related to the bone density of the bone from the input information.
- The estimation system includes an input unit to which input information including an image of a bone is input, and an approximator having learned parameters for obtaining an estimation result related to the bone density of the bone from the input information. When input information is input to the input unit, the input information is processed by the approximator, which estimates the result related to the bone density of the bone.
- The estimation program causes a computer device to function as a neural network that performs a calculation, based on learned parameters for obtaining an estimation result related to bone density from input information including an image in which a bone is captured, and outputs an estimated value of the bone density of the bone shown in the image.
- The drawings include: a diagram illustrating an example configuration of the computer device (estimation device); a diagram for explaining the operation of the estimation device; a diagram showing an example configuration of the neural network; and a diagram showing an example of how the learning image data and the reference bone density are associated with each other.
- FIG. 1 is a block diagram showing an example of a configuration of a computer device 1 according to the first embodiment.
- the computer device 1 functions as an estimating device that estimates bone density.
- the computer device 1 may be referred to as an “estimating device 1”.
- the estimation device 1 includes, for example, a control unit 10, a storage unit 20, a communication unit 30, a display unit 40, and an input unit 50.
- the control unit 10, the storage unit 20, the communication unit 30, the display unit 40, and the input unit 50 are electrically connected to each other by, for example, a bus 60.
- the control unit 10 can control the operation of the estimating apparatus 1 by controlling other components of the estimating apparatus 1.
- the control unit 10 can be said to be a control device or a control circuit.
- the controller 10 includes at least one processor to provide control and processing power to perform various functions, as described in further detail below.
- The at least one processor may be implemented as a single integrated circuit (IC), or as a plurality of communicatively connected integrated circuits (ICs) and/or discrete circuits, and may be implemented according to various known techniques.
- a processor includes one or more circuits or units configured to perform one or more data calculation procedures or processes, for example, by executing instructions stored in an associated memory.
- the processor may be firmware (eg, a discrete logic component) configured to perform one or more data calculation procedures or processes.
- The processor may include one or more processors, controllers, microprocessors, microcontrollers, application-specific integrated circuits (ASICs), digital signal processors, programmable logic devices, field-programmable gate arrays, any combination of these devices or configurations, or a combination of other known devices and configurations, and may perform the functions described below.
- the control unit 10 includes, for example, a CPU (Central Processing Unit).
- The storage unit 20 includes a non-transitory recording medium readable by the CPU of the control unit 10, such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- the storage unit 20 stores a control program 100 for controlling the estimation device 1.
- Various functions of the control unit 10 are realized by the CPU of the control unit 10 executing the control program 100 in the storage unit 20. It can be said that the control program 100 is a bone density estimation program for causing the computer device 1 to function as an estimation device.
- The control unit 10 executes the control program 100 in the storage unit 20, whereby an approximator 280 capable of outputting the estimated bone density value 300 is formed in the control unit 10.
- the approximator 280 includes the neural network 200, for example.
- the control program 100 can be said to be a program for causing the computer device 1 to function as the neural network 200.
- the estimated value of the bone density may be referred to as “bone density estimated value”.
- a configuration example of the neural network 200 will be described later in detail.
- The storage unit 20 stores, besides the control program 100, learned parameters 110 related to the neural network 200, estimation data 120 (hereinafter also referred to as "input information"), learning data 130, and teacher data 140.
- the learning data 130 and the teacher data 140 are data used when learning the neural network 200.
- the learned parameter 110 and the estimation data 120 are data used when the learned neural network 200 estimates the bone density.
- the learning data 130 is data input to the input layer 210 of the neural network 200 when learning the neural network 200.
- The learning data 130 is also called training data.
- the teacher data 140 is data indicating a correct value of the bone density.
- the teacher data 140 is compared with output data output from the output layer 230 of the neural network 200 when learning the neural network 200.
- the learning data 130 and the teacher data 140 may be collectively referred to as supervised learning data.
- the estimation data 120 is data that is input to the input layer 210 when the learned neural network 200 estimates bone density.
- the learned parameter 110 is a learned parameter in the neural network 200.
- the learned parameter 110 can be said to be a parameter adjusted by learning of the neural network 200.
- the learned parameter 110 includes a weighting coefficient indicating the weight of the connection between the artificial neurons.
- The learned neural network 200 performs an operation based on the learned parameters 110 on the estimation data 120 input to the input layer 210, and outputs a bone density estimated value 300 from the output layer 230.
- The data input to the input layer 210 may be input via the input unit 50 or directly. When data is input directly, the input layer 210 can be regarded as part or all of the input unit 50.
- the bone density estimation value 300 may be referred to as an estimation result 300.
- the communication unit 30 is connected to a communication network including the Internet or the like by wire or wirelessly.
- the communication unit 30 can communicate with other devices such as a cloud server and a web server via a communication network.
- the communication unit 30 can input the information received from the communication network to the control unit 10.
- the communication unit 30 can output the information received from the control unit 10 to a communication network.
- the display unit 40 is, for example, a liquid crystal display or an organic EL display.
- the display unit 40 can display various information such as characters, symbols, and graphics under the control of the control unit 10.
- the input unit 50 can receive an input from the user to the estimation device 1.
- the input unit 50 includes, for example, a keyboard and a mouse.
- the input unit 50 may include a touch panel capable of detecting a user operation on the display surface of the display unit 40.
- the configuration of the estimation device 1 is not limited to the above example.
- the control unit 10 may include a plurality of CPUs. Further, the control unit 10 may include at least one DSP. Further, all functions of the control unit 10 or some functions of the control unit 10 may be realized by a hardware circuit that does not require software for realizing the functions.
- the storage unit 20 may include a non-transitory computer-readable recording medium other than the ROM and the RAM.
- the storage unit 20 may include, for example, a small hard disk drive and a solid state drive (SSD).
- the storage unit 20 may include a memory such as a USB (Universal Serial Bus) memory that is detachable from the estimation device 1.
- FIG. 3 is a diagram illustrating an example of the configuration of the neural network 200.
- the neural network 200 is, for example, a convolutional neural network (CNN).
- the neural network 200 includes, for example, an input layer 210, a hidden layer 220, and an output layer 230.
- the hidden layer 220 is also called an intermediate layer.
- the hidden layer 220 includes, for example, a plurality of convolutional layers 240, a plurality of pooling layers 250, and a fully connected layer 260.
- a fully connected layer 260 exists before the output layer 230.
- the convolutional layers 240 and the pooling layers 250 are alternately arranged between the input layer 210 and the fully connected layer 260.
- the configuration of the neural network 200 is not limited to the example of FIG.
- the neural network 200 may include one convolutional layer 240 and one pooling layer 250 between the input layer 210 and the fully connected layer 260.
- the neural network 200 may be a neural network other than the convolutional neural network.
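To make the described layout concrete, the following is a minimal sketch in Python/PyTorch of a network with an input layer, alternating convolutional and pooling layers, and a fully connected layer immediately before a single regression output. The framework, layer counts, channel sizes, and the 256 × 256 input are illustrative assumptions; the patent does not specify them.

```python
import torch
import torch.nn as nn

class BoneDensityCNN(nn.Module):
    """Illustrative CNN following the described layout: alternating
    convolutional (240) and pooling (250) layers, then a fully
    connected layer (260) that regresses one bone density value."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolutional layer
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling layer
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.fc = nn.Linear(32 * 64 * 64, 1)             # fully connected layer

    def forward(self, x):            # x: (batch, 1, 256, 256) grayscale X-ray
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.fc(x)            # estimated bone density (output layer)
```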
- the estimation data 120 includes image data of a simple X-ray image showing a bone whose bone density is to be estimated.
- the estimation target of the bone density is, for example, a person. Therefore, it can be said that the estimation data 120 includes image data of a simple X-ray image showing a human bone.
- the learning data 130 includes image data of a plurality of simple X-ray images showing human bones.
- a simple X-ray image is a two-dimensional image, and is also called a general X-ray image or an X-ray image.
- the estimation target of the bone density may be other than a person.
- the target for estimating the bone density may be an animal such as a dog, cat or horse.
- The target bones are mainly cortical bone and cancellous bone derived from a living organism, but may also include artificial bone containing calcium phosphate as a main component, or regenerated bone produced artificially by regenerative medicine or the like.
- the image data included in the estimation data 120 may be referred to as “estimation image data”.
- the simple X-ray image indicated by the image data included in the estimation data 120 may be referred to as “simple X-ray image for estimation”.
- the image data included in the learning data 130 may be referred to as “learning image data”.
- the simple X-ray image indicated by the image data included in the learning data 130 may be referred to as “simple X-ray image for learning”.
- the learning data 130 includes a plurality of learning X-ray image data respectively indicating a plurality of simple X-ray images for learning.
- Examples of the imaging site of the estimation simple X-ray image include the head, neck, chest, waist, hip joint, knee joint, ankle joint, foot, toe, shoulder joint, elbow joint, wrist joint, hand, finger, and temporomandibular joint.
- That is, the estimation data 120 includes image data of a simple X-ray image obtained by irradiating one of these fifteen sites (head, neck, chest, waist, hip joint, knee joint, ankle joint, foot, toe, shoulder joint, elbow joint, wrist joint, hand, finger, or temporomandibular joint) with X-rays.
- Simple X-ray images obtained by irradiating the chest with X-rays include a simple X-ray image showing a lung and a simple X-ray image showing a thoracic vertebra.
- the type of the imaging part of the estimation simple X-ray image is not limited to this.
- the simple X-ray image for estimation may be a front image in which the target part is seen from the front, or a side image in which the target part is seen from the side.
- The imaging regions of the plurality of learning simple X-ray images indicated by the plurality of learning image data included in the learning data 130 include at least one of the head, neck, chest, waist, hip joint, knee joint, ankle joint, foot, toe, shoulder joint, elbow joint, wrist joint, hand, finger, and temporomandibular joint.
- In other words, the learning data 130 may include, for each of these fifteen sites, image data of a simple X-ray image obtained by irradiating that site with X-rays.
- the learning data 130 may include some types of image data among the 15 types of image data, or may include all types of image data. Note that the type of the imaging part of the learning simple X-ray image is not limited to this. Further, the plurality of simple X-ray images for learning may include a front image or a side image. Further, the plurality of simple X-ray images for learning may include both a front image and a side image of the same imaging region.
- The teacher data 140 includes, for each of the plurality of learning image data included in the learning data 130, a measured value of the bone density of the person whose bone appears in the learning simple X-ray image indicated by that learning image data.
- The plurality of measured bone density values included in the teacher data 140 include, for example, at least one of: bone density measured by irradiating the lumbar spine with X-rays, bone density measured by irradiating the proximal femur with X-rays, bone density measured by irradiating the radius with X-rays, bone density measured by irradiating the metacarpal bone with X-rays, bone density measured by applying ultrasonic waves to the arm, and bone density measured by applying ultrasonic waves to the heel.
- the measured value of the bone density included in the teacher data 140 may be referred to as “reference bone density”.
- a DEXA (dual-energy X-ray absorptiometry) method is known as a method for measuring bone density.
- In a DEXA apparatus, which measures bone density using the DEXA method, when the bone density of the lumbar vertebrae is measured, the lumbar vertebrae are irradiated with X-rays (specifically, two types of X-rays) from the front.
- Likewise, when the bone density of the proximal femur is measured, the DEXA apparatus irradiates the proximal femur with X-rays from the front.
- the teacher data 140 may include the bone density of the lumbar vertebra measured by the DEXA device, or may include the bone density of the proximal femur measured by the DEXA device.
- the teacher data 140 may include a bone density measured by irradiating the target site with X-rays from the side surface.
- the teacher data 140 may include the bone density measured by irradiating the lumbar spine with X-rays from the side.
- As another method for measuring bone density, an ultrasonic method is known.
- In the ultrasonic method, ultrasonic waves are applied to the arm to measure the bone density of the arm, or to the heel to measure the bone density of the heel.
- the teacher data 140 may include the bone density measured by the ultrasonic method.
- The plurality of learning simple X-ray images indicated by the plurality of learning image data included in the learning data 130 show the bones of a plurality of different persons. As shown in FIG. 4, the storage unit 20 associates each learning image data included in the learning data 130 with the reference bone density of the person whose bone appears in the learning simple X-ray image indicated by that learning image data. It can be said that each of the plurality of learning simple X-ray images used in learning the neural network 200 is associated with the reference bone density of the person whose bone it shows.
- The reference bone density associated with learning image data is the bone density measured, at substantially the same time as when the corresponding learning simple X-ray image was captured, for the same person whose bone appears in that image.
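As an illustration of this association, the sketch below pairs each learning image file with the reference bone density measured for the same person. The CSV layout and column names are hypothetical; the patent only states that each learning image is associated with a measured value.

```python
import csv
import numpy as np
from PIL import Image

def load_training_pairs(csv_path, image_dir):
    """Pair each learning X-ray image with the reference bone density
    of the person shown in it. Assumes a hypothetical CSV with
    columns: filename, reference_bmd."""
    pairs = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            img = Image.open(f"{image_dir}/{row['filename']}").convert("L")
            pairs.append((np.asarray(img, dtype=np.float32),
                          float(row["reference_bmd"])))
    return pairs
```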
- The site shown in the learning simple X-ray image may or may not include the site at which the corresponding reference bone density was measured.
- As an example of the former, learning image data indicating a learning simple X-ray image showing the lumbar region may be associated with the reference bone density of the lumbar spine.
- the learning image data showing the hip joint is associated with the reference bone density of the proximal part of the femur.
- As an example of the latter, learning image data in which the chest is captured may be associated with the reference bone density of the lumbar spine.
- a case may be considered in which learning image data of a knee joint is associated with the reference bone density of the heel.
- The direction from which the site shown in the learning simple X-ray image is viewed may be the same as, or different from, the direction of X-ray irradiation of the target site in the measurement of the corresponding reference bone density.
- For example, learning image data indicating a simple X-ray image showing the chest from the front (hereinafter sometimes referred to as a "chest front simple X-ray image") may be associated with a reference bone density measured by irradiating the lumbar vertebrae with X-rays from the front. A simple X-ray image showing the waist from the front is hereinafter sometimes referred to as a "lumbar front simple X-ray image".
- It is also conceivable that learning image data showing a simple X-ray image in which the lumbar region is seen from the side (hereinafter sometimes referred to as a "lumbar side simple X-ray image") is associated with a reference bone density measured by irradiating the lumbar vertebrae with X-rays from the front.
- Similarly, learning image data showing a simple X-ray image of the knee joint from the side (hereinafter sometimes referred to as a "knee side simple X-ray image") may be associated with a reference bone density measured by irradiating the proximal femur with X-rays from the front.
- The plurality of learning simple X-ray images indicated by the plurality of learning image data included in the learning data 130 may include a simple X-ray image showing a site of the same type as the estimation simple X-ray image, or a simple X-ray image showing a site of a different type.
- For example, when the estimation simple X-ray image is a chest front simple X-ray image, the plurality of learning simple X-ray images may include chest front simple X-ray images.
- When the estimation simple X-ray image is a simple X-ray image showing the knee joint from the front (hereinafter sometimes referred to as a "knee front simple X-ray image"), the plurality of learning simple X-ray images may include knee side simple X-ray images.
- When the estimation simple X-ray image is a lumbar front simple X-ray image, the plurality of learning simple X-ray images may include chest front simple X-ray images; when it is a lumbar side simple X-ray image, they may include knee front simple X-ray images.
- The plurality of learning simple X-ray images may include simple X-ray images showing a site from the same direction as the estimation simple X-ray image, or from a different direction.
- For example, when the estimation simple X-ray image is a lumbar front simple X-ray image, the plurality of learning simple X-ray images may include lumbar front simple X-ray images; when it is a knee front simple X-ray image, they may include knee front simple X-ray images.
- Conversely, when the estimation simple X-ray image is a knee side simple X-ray image, the plurality of learning simple X-ray images may include knee front simple X-ray images; when it is a lumbar side simple X-ray image, they may include chest front simple X-ray images.
- The teacher data 140 may include a reference bone density measured from a site (bone) included in the site shown in the estimation simple X-ray image, or a reference bone density measured from a site (bone) not included in it.
- For example, when the lumbar spine appears in the estimation simple X-ray image, the teacher data 140 may include the reference bone density of the lumbar spine; alternatively, when the estimation simple X-ray image is a chest front simple X-ray image, the teacher data 140 may include the reference bone density of the metacarpal bone.
- The teacher data 140 may include a reference bone density measured by irradiating the target site with X-rays from the same direction as that from which the site in the estimation simple X-ray image is viewed, or from a different direction.
- For example, when the estimation simple X-ray image is a lumbar front simple X-ray image, the teacher data 140 may include a reference bone density measured by irradiating the lumbar vertebrae with X-rays from the front.
- Alternatively, the teacher data 140 may include a reference bone density measured by irradiating the proximal femur with X-rays from the front.
- In the present embodiment, image data obtained by reducing the resolution and the number of gradations of the grayscale image data of a simple X-ray image obtained by a simple X-ray imaging apparatus is used as the learning image data and the estimation image data.
- For example, suppose the image data obtained by the simple X-ray imaging apparatus consists of more than (1024 × 640) pixels, each represented by 16 bits.
- In this case, image data in which the number of pixels is reduced to, for example, (256 × 256), (1024 × 512), or (1024 × 640) and the number of bits per pixel is reduced to 8 is used as the learning image data and the estimation image data.
- In other words, each learning simple X-ray image and each estimation simple X-ray image consists of (256 × 256), (1024 × 512), or (1024 × 640) pixels, and each pixel value is represented by 8 bits.
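A minimal sketch of this preprocessing, assuming NumPy and Pillow; the linear rescaling from 16-bit to 8-bit values and the bilinear resize are straightforward choices, not mandated by the description.

```python
import numpy as np
from PIL import Image

def preprocess(raw, size=(256, 256)):
    """Downsample a 16-bit grayscale radiograph and reduce it to
    8 bits (256 gray levels). `raw` is a 2-D uint16 array from the
    simple X-ray imaging apparatus."""
    img8 = (raw.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
    return np.asarray(Image.fromarray(img8).resize(size, Image.BILINEAR))
```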
- The learning image data and the estimation image data may be generated by the control unit 10 of the estimation device 1 from image data obtained by a simple X-ray imaging apparatus, or may be generated from such image data by a device other than the estimation device 1.
- the image data obtained by the simple X-ray imaging apparatus may be received by the communication unit 30 through a communication network, or may be stored in a removable memory included in the storage unit 20.
- The communication unit 30 may receive the learning image data and the estimation image data from another device through the communication network, and the control unit 10 may store the received data in the storage unit 20.
- learning image data and estimation image data generated by another device may be stored in a removable memory included in the storage unit 20.
- the communication unit 30 may receive the teacher data 140 via the communication network, and the control unit 10 may store the teacher data 140 received by the communication unit 30 in the storage unit 20.
- The teacher data 140 may also be stored in a removable memory included in the storage unit 20. Note that the number of pixel data and the number of bits per pixel of the learning image data and the estimation image data are not limited to the above.
- FIG. 5 is a diagram for describing an example of learning of the neural network 200.
- In learning the neural network 200, the control unit 10 inputs the learning data 130 to the input layer 210 of the neural network 200, as shown in FIG. 5. Then, the control unit 10 adjusts the variable parameter 110a in the neural network 200 so that the error of the output data 400, output from the output layer 230, with respect to the teacher data 140 is reduced. More specifically, the control unit 10 inputs each learning image data in the storage unit 20 to the input layer 210.
- the control unit 10 inputs a plurality of pixel data constituting the learning image data to a plurality of artificial neurons constituting the input layer 210, respectively.
- Specifically, the control unit 10 adjusts the parameter 110a so as to reduce the error of the output data 400, output from the output layer 230 when the learning image data is input to the input layer 210, with respect to the reference bone density corresponding to that learning image data.
- As a method for adjusting the parameter 110a, for example, the error backpropagation method is adopted.
- the adjusted parameter 110a becomes the learned parameter 110 and is stored in the storage unit 20.
- the parameters 110a include, for example, parameters used in the hidden layer 220.
- the parameters 110a include a filter coefficient used in the convolutional layer 240 and a weighting coefficient used in the fully connected layer 260.
- The method of adjusting the parameter 110a (in other words, the method of learning the parameter 110a) is not limited to this.
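The adjustment step can be sketched as an ordinary regression training loop. The patent names only the error backpropagation method, so the mean-squared-error loss and the Adam optimizer below are assumptions; `loader` is assumed to yield batches of preprocessed images and their reference bone densities.

```python
import torch
import torch.nn as nn

def train(model, loader, epochs=10, lr=1e-4):
    """Adjust the variable parameters 110a (filter coefficients of the
    convolutional layers, weighting coefficients of the fully connected
    layer) so the output error against the reference bone density shrinks."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for images, ref_bmd in loader:    # (batch, 1, H, W), (batch, 1)
            opt.zero_grad()
            loss = loss_fn(model(images), ref_bmd)
            loss.backward()               # error backpropagation
            opt.step()
    return model.state_dict()             # the "learned parameters 110"
```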
- In this way, the storage unit 20 stores the learned parameters 110 in which the relationship between the learning data 130, including the image data of the plurality of learning simple X-ray images, and the measured bone density values serving as the teacher data 140 has been learned using the neural network 200.
- the estimation device 1 learns the neural network 200, but another device may learn the neural network 200.
- In that case, the storage unit 20 of the estimation device 1 stores the learned parameters 110 generated by the other device, and need not store the learning data 130 and the teacher data 140.
- the communication unit 30 may receive the learned parameter 110 generated by another device via a communication network, and the control unit 10 may store the learned parameter 110 received by the communication unit 30 in the storage unit 20.
- the learned parameter 110 generated by another device may be stored in the removable memory included in the storage unit 20.
- That is, the learned neural network 200 includes the parameters 110a learned by inputting the image data of the plurality of learning simple X-ray images to the input layer 210 as the learning data 130 and using the reference bone densities as the teacher data 140.
- The neural network 200 performs an operation based on the learned parameters 110a on the estimation data 120 input to the input layer 210, and outputs the bone density estimated value 300 from the output layer 230.
- When the estimation image data serving as the estimation data 120 is input to the input layer 210, the plurality of pixel data constituting the estimation image data are respectively input to the plurality of artificial neurons constituting the input layer 210.
- the convolutional layer 240 performs an operation using the filter coefficient included in the learned parameter 110a
- the fully connected layer 260 performs an operation using the weighting coefficient included in the learned parameter 110a.
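Inference with the learned parameters might then look like the following sketch, reusing the BoneDensityCNN class from the earlier example; normalizing the 8-bit input to [0, 1] is an assumed detail.

```python
import torch

@torch.no_grad()
def estimate_bone_density(model, learned_params, image8):
    """Run the learned network on one preprocessed 8-bit X-ray image
    and return the bone density estimated value 300."""
    model.load_state_dict(learned_params)     # learned parameters 110
    model.eval()
    x = torch.from_numpy(image8).float().unsqueeze(0).unsqueeze(0) / 255.0
    return model(x).item()
```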
- When estimation image data indicating a chest front simple X-ray image is input to the input layer 210, the estimated bone density value 300 of the person having the chest bones shown in that image is output from the output layer 230.
- When estimation image data indicating a lumbar front simple X-ray image is input to the input layer 210, the estimated bone density value 300 of the person having the lumbar vertebrae shown in that image is output from the output layer 230.
- When estimation image data indicating a lumbar side simple X-ray image is input to the input layer 210, the estimated bone density value 300 of the person having the lumbar vertebrae shown in that image is output from the output layer 230.
- When estimation image data indicating a knee front simple X-ray image is input to the input layer 210, the estimated bone density value 300 of the person having the knee joint bones shown in that image is output from the output layer 230.
- When estimation image data indicating a knee side simple X-ray image is input to the input layer 210, the estimated bone density value 300 of the person having the knee joint bones shown in that image is output from the output layer 230.
- The estimated value 300 output from the output layer 230 may be expressed as at least one of: bone mineral density per unit area (g/cm²), bone mineral density per unit volume (g/cm³), YAM, T-score, and Z-score. YAM is an abbreviation for "Young Adult Mean" and is sometimes referred to as the young adult mean percentage.
- For example, the output layer 230 may output an estimated value 300 expressed as bone mineral density per unit area (g/cm²) together with an estimated value 300 expressed as YAM, or may output estimated values 300 expressed as YAM, T-score, and Z-score.
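These representations are related by standard definitions: YAM (%) expresses the BMD as a percentage of the young adult mean, the T-score is the deviation from the young adult mean in young-adult standard deviations, and the Z-score is the deviation from the age-matched mean in age-matched standard deviations. A small helper illustrating the conversions; the reference means and standard deviations are population- and site-specific inputs, not values given by the patent:

```python
def to_scores(bmd, yam_mean, yam_sd, age_mean, age_sd):
    """Convert an areal BMD estimate (g/cm^2) into the other
    representations named in the description."""
    return {
        "yam_percent": 100.0 * bmd / yam_mean,   # % of young adult mean
        "t_score": (bmd - yam_mean) / yam_sd,    # vs. young adult reference
        "z_score": (bmd - age_mean) / age_sd,    # vs. age-matched reference
    }
```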
- the storage unit 20 may store a plurality of estimation data 120.
- The plurality of estimation simple X-ray images indicated by the plurality of estimation data 120 in the storage unit 20 may include simple X-ray images showing sites of the same type or of different types, and may include simple X-ray images in which the sites are viewed from the same direction or from different directions.
- The control unit 10 inputs each of the plurality of estimation data 120 in the storage unit 20 to the input layer 210 of the neural network 200, and the bone density estimated value 300 corresponding to each estimation data 120 is output from the output layer 230.
- the learning of the neural network 200 and the estimation of the bone density in the neural network 200 are performed using the image data of the simple X-ray image.
- Image data of simple X-ray images (in other words, X-ray images) can be obtained easily, because such images are taken in various medical examinations at many hospitals. The bone density can therefore be estimated easily without using an expensive apparatus such as a DEXA apparatus.
- By using image data of a simple X-ray image taken for a medical examination or the like as the estimation image data, the bone density can be estimated easily on the occasion of that examination. Using the estimation device 1 can thus improve the service offered to hospital users.
- In a front simple X-ray image of the chest or the like, bones may not be captured clearly due to the influence of organs; even so, front simple X-ray images are routinely taken at many hospitals.
- the chest front simple X-ray image is often taken during a medical examination or the like, and can be said to be a particularly easily available simple X-ray image. By using the chest front simple X-ray image as the estimation simple X-ray image or the learning simple X-ray image, the bone density can be more easily estimated.
- Even when the plurality of learning simple X-ray images show sites of types different from the site shown in the estimation simple X-ray image, the bone density can be estimated from the estimation image data. Therefore, the convenience of the estimation device 1 (in other words, the computer device 1) can be improved.
- Even when the learning simple X-ray images show sites viewed from directions different from that of the estimation simple X-ray image, the bone density can be estimated from the estimation image data. Therefore, the convenience of the estimation device 1 can be improved.
- Even when the site at which the reference bone density was measured differs from the site shown in the learning simple X-ray image, the neural network 200 can estimate the bone density based on the learned parameters 110. Therefore, the convenience of the estimation device 1 can be improved.
- Even when the direction of the site shown in the learning simple X-ray image differs from the direction of X-ray irradiation of the target site in the measurement of the corresponding reference bone density, the neural network 200 can estimate the bone density based on the learned parameters 110. Therefore, the convenience of the estimation device 1 can be improved.
- Similarly, even when the teacher data 140 includes a reference bone density measured by irradiating the target site with X-rays from a direction different from that of the site shown in the estimation simple X-ray image, the bone density can be estimated using the image data of the estimation simple X-ray image. Therefore, the convenience of the estimation device 1 can be improved.
- the bone density estimation value 300 obtained by the estimation device 1 may be displayed on the display unit 40. Further, the bone density estimation value 300 obtained by the estimation device 1 may be used by another device.
- FIG. 6 is a diagram illustrating an example of a bone density estimation system 600 including the estimation device 1 and a processing device 500 that performs a process using the bone density estimation value 300 obtained by the estimation device 1.
- the estimation device 1 and the processing device 500 can communicate with each other via the communication network 700.
- the communication network 700 includes, for example, at least one of a wireless network and a wired network.
- the communication network 700 includes, for example, a wireless LAN (Local Area Network) and the Internet.
- the communication network 700 is connected to the communication unit 30.
- the control unit 10 causes the communication unit 30 to transmit the bone density estimation value 300 to the processing device 500.
- the processing device 500 performs a process using the bone density estimated value 300 received from the estimation device 1 through the communication network 700.
- the processing device 500 is a display device such as a liquid crystal display device, and displays the estimated bone density value 300.
- the processing device 500 may display the estimated bone density 300 in a table or a graph.
- the processing apparatus 500 may display the bone density estimated value 300 obtained by the plurality of estimating apparatuses 1.
- the configuration of the processing device 500 may be the same as the configuration of the estimation device 1 shown in FIG. 1, or may be different from the configuration of the estimation device 1.
- the processing performed by the processing device 500 using the estimated bone density value 300 is not limited to the above example. Further, the processing device 500 may directly communicate with the estimation device 1 wirelessly or by wire without passing through the communication network 700.
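The patent does not fix a transfer protocol between the estimation device 1 and the processing device 500; as one hypothetical realization, the communication unit 30 could send the estimate as JSON over HTTP, with the URL of the processing device supplied by the caller:

```python
import json
import urllib.request

def send_estimate(url, estimate):
    """Transmit the bone density estimated value 300 to the processing
    device over the communication network (protocol is an assumption)."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"bone_density_estimate": estimate}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status    # HTTP status reported by the processing device
```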
- the learning data 130 includes, for each learning image data, information on the health condition of a person having a bone that appears in the learning simple X-ray image indicated by the learning image data.
- the learning data 130 includes, for each learning image data, information on the health condition of the subject (subject) of the learning simple X-ray image indicated by the learning image data.
- Hereinafter, the information on the health condition of the subject of a learning simple X-ray image is referred to as "learning health-related information", and the learning health-related information on the subject of the learning simple X-ray image indicated by certain learning image data is referred to as the learning health-related information corresponding to that learning image data.
- the health-related information for learning includes, for example, at least one of age information, gender information, height information, weight information, drinking habit information, smoking habit information, and fracture history information.
- The learning health-related information is compiled into a database for each person and is generated as, for example, a CSV (Comma-Separated Values) format file or a text format file.
- Each of the age information, the height information, and the weight information is represented as, for example, numerical data of a plurality of bits.
- In the gender information, for example, "male" or "female" is represented by 1-bit data; in the drinking habit information, "has a drinking habit" or "no drinking habit" is represented by 1-bit data; in the smoking habit information, "has a smoking habit" or "no smoking habit" is represented by 1-bit data; and in the fracture history information, "has a fracture history" or "no fracture history" is represented by 1-bit data.
- The learning health-related information may also include the body fat percentage or the subcutaneous fat percentage of the subject.
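One plausible way to encode such a record into a fixed-length numeric vector for the input layer is sketched below; all field names and value encodings are hypothetical, the patent only fixing that the habit and history items are 1-bit.

```python
import numpy as np

def encode_health_info(record):
    """Encode one person's health-related information as a numeric
    vector. `record` is a dict parsed from the per-person CSV file."""
    return np.array([
        float(record["age"]),
        float(record["height_cm"]),
        float(record["weight_kg"]),
        1.0 if record["sex"] == "male" else 0.0,              # 1-bit gender
        1.0 if record["drinks"] == "yes" else 0.0,            # drinking habit
        1.0 if record["smokes"] == "yes" else 0.0,            # smoking habit
        1.0 if record["fracture_history"] == "yes" else 0.0,  # fracture history
    ], dtype=np.float32)
```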
- In the second embodiment, the reference bone density (see FIG. 4) corresponding to learning image data is associated not only with that learning image data but also with the corresponding learning health-related information. In other words, the learning image data indicating a learning simple X-ray image in which a certain person's bone is captured, together with the information on that person's health condition (the learning health-related information), is associated with the measured bone density value (reference bone density) of that person.
- the learning image data and the corresponding learning health-related information are simultaneously input to the input layer 210.
- Specifically, the learning image data is input to some of the plurality of artificial neurons forming the input layer 210, and the learning health-related information is input to the others. When the learning image data and the corresponding learning health-related information are input to the input layer 210, the output data 400 output from the output layer 230 is compared with the reference bone density corresponding to that learning image data and learning health-related information.
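A sketch of this two-part input, extending the earlier BoneDensityCNN so that the image and the health-related vector are processed together. Fusing the vector by concatenation just before the fully connected layer is an assumption; the patent says only that some artificial neurons of the input layer receive the image and others receive the health-related information.

```python
import torch
import torch.nn as nn

class BoneDensityCNNWithHealth(nn.Module):
    """Second-embodiment variant: part of the input receives the image,
    another part the health-related vector; both feed one regression."""
    def __init__(self, n_health=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.fc = nn.Linear(32 * 64 * 64 + n_health, 1)

    def forward(self, image, health):   # image: (batch, 1, 256, 256)
        x = torch.flatten(self.features(image), 1)
        return self.fc(torch.cat([x, health], dim=1))
```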
- the estimation data 120 includes the estimation image data and information on the health condition of a person having a bone that appears in the estimation simple X-ray image indicated by the estimation image data.
- the estimation data 120 includes estimation image data and information on the health state of the subject of the estimation simple X-ray image indicated by the estimation image data.
- the information on the health condition of the subject of the simple X-ray image for estimation may be referred to as "health-related information for estimation" (hereinafter, in other embodiments, also referred to as "individual data").
- information on the health state of the subject in the simple X-ray image for estimation indicated by the image data for estimation may be referred to as health-related information for estimation corresponding to the image data for estimation.
- the health-related information for estimation includes, for example, at least one of age information, gender information, height information, weight information, drinking habit information, smoking habit information, and fracture history information.
- that is, the estimation health-related information includes the same types of information as the learning health-related information.
- the estimation health-related information may include the body fat percentage or the subcutaneous fat percentage of the subject, similarly to the learning health-related information.
- the image data for estimation and the health-related information for estimation corresponding thereto are simultaneously input to the input layer 210.
- estimation image data is input to a part of the plurality of artificial neurons forming the input layer 210
- estimation-related health-related information is input to other parts of the plurality of artificial neurons.
- the output layer 230 thereby outputs an estimated value of the bone density of the subject.
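Purely as an illustration of this mixed input (the patent does not prescribe any implementation; all names and scalings below are hypothetical), pixel data and health-related data can be concatenated into one vector for the input layer:

```python
import numpy as np

def build_input_vector(image: np.ndarray, age: float, sex_male: int,
                       height_cm: float, weight_kg: float) -> np.ndarray:
    """Concatenate flattened pixel data with health-related features.

    Hypothetical sketch: part of the input layer receives the pixel data,
    the remaining neurons receive the health-related data.
    """
    pixels = image.astype(np.float32).ravel() / 255.0   # normalize 8-bit pixels
    health = np.array([age / 100.0, sex_male,            # crude illustrative scaling
                       height_cm / 200.0, weight_kg / 100.0], dtype=np.float32)
    return np.concatenate([pixels, health])

x = build_input_vector(np.zeros((256, 256), dtype=np.uint8),
                       age=67, sex_male=0, height_cm=155, weight_kg=52)
print(x.shape)  # (256*256 + 4,) -> one vector for the input layer
```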
- the learning data 130 includes image data of N (N ≥ 2) learning simple X-ray images in which parts of the same person are captured and the directions of the captured parts differ from each other.
- N learning simple X-ray images may be collectively referred to as a “learning simple X-ray image set”.
- the simple X-ray image set for learning includes, for example, a front image and a side image of the same person.
- the learning simple X-ray image set includes, for example, a chest front simple X-ray image and a waist side simple X-ray image of a certain person.
- the image sizes of the front image and the side image included in the learning simple X-ray image set may be different from each other.
- the width of the image size of the side image may be smaller than the width of the image size of the front image.
- the image data of each learning simple X-ray image of the learning simple X-ray image set may be collectively referred to as a “learning image data set”.
- the learning data 130 includes a learning image data set for a plurality of persons different from each other. Accordingly, the learning data 130 includes a plurality of learning image data sets. Then, one reference bone density is associated with one learning image data set. In other words, a measurement value (reference bone density) of the bone density of the certain person is associated with the learning image data set of the certain person.
- each learning image data set is input to the input layer 210.
- N pieces of learning image data constituting the one learning image data set are simultaneously input to the input layer 210.
- the learning image data set includes first learning image data (for example, image data of a chest front simple X-ray image) and second learning image data (for example, image data of a waist side simple X-ray image).
- the output data 400 output from the output layer 230 is compared with the reference bone density corresponding to the learning image data set.
- the estimation data 120 includes image data of N simple X-ray images for estimation in which parts of the same person are captured and the directions of the captured parts differ from each other.
- the N simple X-ray images for estimation may be collectively referred to as a “set of simple X-ray images for estimation”.
- the simple X-ray image set for estimation includes, for example, a front image and a side image of the same person.
- the estimation simple X-ray image set includes, for example, a waist front simple X-ray image and a knee side simple X-ray image of a certain person.
- the image sizes of the front image and the side image included in the estimation simple X-ray image set may be different from each other.
- the width of the image size of the side image may be smaller than the width of the image size of the front image.
- the image data of each simple X-ray image for estimation of the simple X-ray image set for estimation may be collectively referred to as an “image data set for estimation”.
- when the estimation data 120 is used to estimate the bone density, the N pieces of estimation image data constituting the estimation image data set are simultaneously input to the input layer 210.
- the estimation image data set includes first estimation image data and second estimation image data.
- the first estimation image data is input to a part of the plurality of artificial neurons constituting the input layer 210
- the second estimation image data is input to the other part of the plurality of artificial neurons.
- the accuracy of estimating the bone density can be improved by using the image data of a plurality of simple X-ray images in which the portions of the same subject are captured and the directions of the captured portions are different from each other.
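As a similar illustrative sketch (image sizes taken from the examples above, everything else hypothetical), a front image and a narrower side image can be flattened and fed to the input layer simultaneously:

```python
import numpy as np

def build_two_view_input(front: np.ndarray, side: np.ndarray) -> np.ndarray:
    """Feed a front image and a (narrower) side image to one input layer.

    Hypothetical sketch: the first block of input neurons receives the
    front view, the remaining block receives the side view.
    """
    return np.concatenate([front.ravel(), side.ravel()]).astype(np.float32)

front = np.zeros((1024, 640), dtype=np.uint8)  # e.g. chest, frontal view
side = np.zeros((1024, 512), dtype=np.uint8)   # e.g. waist, lateral (narrower)
x = build_two_view_input(front, side)
print(x.size)  # 1024*640 + 1024*512 pixels input simultaneously
```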
- the learning data 130 may include a learning image data set and learning health-related information.
- the learning image data set and the learning-related health-related information of the same person are simultaneously input to the input layer 210.
- the estimation data 120 may include an estimation image data set and estimation health-related information. In this case, the image data set for estimation and the health-related information for estimation are simultaneously input to the input layer 210.
- in the above example, the same learned parameters 110 are used irrespective of the type of bone shown in the X-ray image indicated by the estimation image data, but learned parameters 110 corresponding to the type of bone shown in the X-ray image indicated by the estimation image data may be used instead.
- the neural network 200 has a plurality of learned parameters 110 respectively corresponding to a plurality of types of bones.
- the neural network 200 estimates the bone density by using the learned parameters 110 corresponding to the type of bone appearing in the X-ray image indicated by the input estimation image data. For example, when the lumbar vertebrae are captured in the X-ray image indicated by the input estimation image data, the neural network 200 estimates the bone density by using the learned parameters 110 for lumbar vertebra bone density estimation.
- similarly, when the proximal femur is captured in the X-ray image indicated by the input estimation image data, the neural network 200 estimates the bone density by using the learned parameters 110 for proximal femur bone density estimation.
- the neural network 200 uses, for example, the learned parameters 110 specified by the user through the input unit 50 among the plurality of learned parameters 110. In this case, the user specifies the learned parameters 110 to be used by the neural network 200 according to the type of bone shown in the X-ray image indicated by the estimation image data input to the neural network 200.
- in generating each set of learned parameters 110, a plurality of learning image data, each indicating an X-ray image showing the same type of bone, are used, and the learned parameters 110 corresponding to that type of bone are generated.
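Conceptually this amounts to a lookup from bone type to a learned-parameter set; a small hypothetical sketch (the file names and bone-type keys are invented for illustration):

```python
# Hypothetical registry mapping a bone type to its learned-parameter file.
LEARNED_PARAMS = {
    "lumbar_vertebra": "params_lumbar.npz",
    "proximal_femur": "params_femur.npz",
}

def select_params(bone_type: str) -> str:
    """Return the learned-parameter set for the bone type the user specified."""
    try:
        return LEARNED_PARAMS[bone_type]
    except KeyError:
        raise ValueError(f"no learned parameters for bone type: {bone_type}")

print(select_params("lumbar_vertebra"))  # -> params_lumbar.npz
```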
- the estimation device 1 and the bone density estimation system 600 have been described in detail. However, the above description is an example in all aspects, and the present disclosure is not limited thereto. The various examples described above can be applied in combination as long as they do not conflict. It is also understood that innumerable examples not illustrated here can be envisaged without departing from the scope of the present disclosure.
- FIG. 7 is a diagram illustrating an example of a configuration of the estimation device 1A according to the present embodiment.
- the approximator 280 further has a second neural network 900.
- the second neural network 900 can detect a fracture based on the learned parameters 910.
- the estimation device 1A according to the present embodiment has the same configuration as the estimation device 1 according to the first embodiment, and a description of the same configuration will be omitted.
- the neural network 200 described in the above example is referred to as a first neural network 200.
- the second neural network 900 has a configuration equivalent to the first neural network 200, for example.
- the second neural network 900 can detect a fracture based on the same estimation image data as the estimation image data included in the estimation data 120 input to the first neural network 200. That is, from one estimation image data, the first neural network 200 can estimate the bone density, and the second neural network 900 can detect the fracture. Note that the detection result 920 of the second neural network 900 may be output from the output layer 230 of the second neural network 900 as in the above example.
- in the learning of the second neural network 900, the parameters are learned using learning image data in which a bone without a fracture is captured and learning image data in which a bone with a fracture is captured. Further, in the teacher data, information indicating the presence or absence of a current fracture and information indicating the location of the fracture are associated with each piece of learning image data for the bone shown in that learning image data. The teacher data may also include information indicating a past fracture history and information indicating past fracture locations. As a result, the second neural network 900 can detect, based on the estimation image data, the presence or absence and the location of a fracture of the bone shown in the estimation image data, and output the detection result 920.
- the estimation device 1A may include a determination unit 930 that determines whether the subject has osteoporosis.
- the determination unit 930 can compare and evaluate the estimation result 300 of the first neural network 200 and the detection result 920 of the second neural network 900 to determine whether or not the subject has osteoporosis.
- the determination unit 930 may determine osteoporosis based on, for example, an original standard or a known guideline. Specifically, when the detection result 920 indicates a fracture of the vertebral body or the proximal femur, the determination unit 930 may determine that the subject has osteoporosis. When the bone density estimated value 300 output by the first neural network 200 is expressed as YAM, the determination unit 930 may determine that the subject has osteoporosis when the YAM is less than 80% and the detection result 920 indicates a fracture at a site other than the vertebral body and the proximal femur. In addition, when the YAM indicated by the bone density estimated value 300 is 70% or less, the determination unit 930 may determine that the subject has osteoporosis.
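A minimal sketch of these determination rules as a function (thresholds are those stated above; the string encoding of fracture sites is a hypothetical convention):

```python
from typing import Optional

def judge_osteoporosis(yam_percent: float, fracture_site: Optional[str]) -> bool:
    """Rules sketched from the text: a fracture of the vertebral body or
    proximal femur, YAM < 80% together with any other fracture, or
    YAM <= 70% each lead to an osteoporosis determination."""
    if fracture_site in ("vertebral_body", "proximal_femur"):
        return True
    if fracture_site is not None and yam_percent < 80.0:
        return True
    return yam_percent <= 70.0

print(judge_osteoporosis(75.0, "distal_radius"))  # True: YAM < 80% plus fracture
print(judge_osteoporosis(85.0, None))             # False
```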
- the approximator 280 may further include a third neural network 950, as shown in FIG.
- the third neural network 950 can classify the bones of the subject from the estimation image data included in the estimation data 120 based on the learned parameters 960.
- the third neural network 950 outputs, for each pixel data of the input estimation image data, site information indicating a bone site indicated by the pixel data. Thereby, the bones appearing in the X-ray image indicated by the image data for estimation can be classified.
- the site information may be called segmentation data.
- when the lumbar vertebrae appear in the X-ray image indicated by the estimation image data, the third neural network 950 outputs, for each pixel data of the estimation image data, site information indicating which of the lumbar vertebrae L1 to L5 the pixel data represents. For example, when certain pixel data of the estimation image data represents L1 of the lumbar vertebrae, the third neural network 950 outputs site information indicating L1 as the site information corresponding to that pixel data.
- the third neural network 950 uses the learned parameter 960 according to the type of bone shown in the X-ray image indicated by the image data for estimation.
- the third neural network 950 has a plurality of learned parameters 960 corresponding to a plurality of types of bones, respectively.
- the third neural network 950 classifies the bones appearing in the X-ray image indicated by the estimation image data by using the learned parameters 960 corresponding to the type of bone appearing in the X-ray image indicated by the input estimation image data. For example, when the lumbar vertebrae are captured in the X-ray image indicated by the input estimation image data, the third neural network 950 divides the lumbar vertebrae into L1 to L5 using the learned parameters 960 corresponding to the lumbar vertebrae.
- the third neural network 950 uses, for example, the learned parameters 960 specified by the user through the input unit 50 among the plurality of learned parameters 960.
- the user specifies the learned parameters 960 to be used by the third neural network 950 according to the type of bone shown in the X-ray image indicated by the estimation image data input to the third neural network 950.
- the third neural network 950 may divide the bone shown in the X-ray image indicated by the input estimation image data into a first site where an implant is embedded, a second site where a tumor is present, and a third site where a fracture is present. In this case, the third neural network 950 outputs, for each pixel data of the estimation image data, site information indicating which of the first site, the second site, and the third site the pixel data represents. When the pixel data represents a site other than the first, second, and third sites, the third neural network 950 outputs site information indicating that the pixel data represents none of the first, second, and third sites.
- when the third neural network 950 divides the bone appearing in the X-ray image indicated by the estimation image data into the first site, the second site, and the third site, it can also be said that the third neural network 950 detects, in the X-ray image indicated by the estimation image data, an implant embedded in the bone, a fracture of the bone, and a tumor of the bone.
- in generating each set of learned parameters 960, a plurality of learning image data, each indicating an X-ray image showing the same type of bone, are used, and the learned parameters 960 corresponding to that type of bone are generated.
- when the third neural network 950 divides the bone appearing in the X-ray image indicated by the input estimation image data into the first site, the second site, and the third site, the plurality of learning image data include learning image data indicating an X-ray image of a bone in which an implant is embedded, learning image data indicating an X-ray image of a bone with a tumor, and learning image data indicating an X-ray image of a bone with a fracture.
- the teacher data includes, for each piece of learning image data, annotation information for dividing a bone indicated by the learning image data.
- annotation information includes, for each pixel data of the corresponding learning image data, site information indicating a bone site indicated by the pixel data.
- the first neural network 200 may estimate the bone density for each of the sections divided in the third neural network 950.
- the estimation image data 121 and the site information 965, which the third neural network 950 outputs based on the estimation image data 121 and which corresponds to each pixel data of the estimation image data 121, are input to the first neural network 200. Based on the learned parameters 110 corresponding to the type of bone appearing in the X-ray image indicated by the estimation image data 121, the first neural network 200 outputs a bone density estimated value 300 for each of the sites divided by the third neural network 950.
- for example, the first neural network 200 individually outputs the bone density estimated values 300 of L1, L2, L3, L4, and L5.
- the teacher data includes, for each learning image data, the reference bone density of each part of the bone indicated by the learning image data.
- the first neural network 200 uses, for example, the learned parameters 110 specified by the user through the input unit 50 among the plurality of learned parameters 110.
- the user specifies the learned parameters 110 to be used by the first neural network 200 according to the type of bone shown in the X-ray image indicated by the estimation image data input to the first neural network 200.
- when the third neural network 950 divides the bone shown in the X-ray image indicated by the estimation image data into the first site where an implant is embedded, the second site where a tumor is present, and the third site where a fracture is present, the luminance of the first partial image data indicating the first site, of the second partial image data indicating the second site, and of the third partial image data indicating the third site may be adjusted.
- FIG. 11 is a diagram showing a configuration example in this case.
- the estimation image data 121 and the site information 965 that the third neural network 950 outputs for each pixel data of the estimation image data 121 are input to the adjustment unit 968.
- the adjustment unit 968 specifies the first partial image data, the second partial image data, and the third partial image data included in the estimation image data 121 based on the part information 965. Then, the adjustment unit 968 adjusts the luminance of the specified first partial image data, second partial image data, and third partial image data.
- the adjustment unit 968 stores, for example, the luminance of the first site in a general X-ray image as a first reference luminance, the luminance of the second site in a general X-ray image as a second reference luminance, and the luminance of the third site in a general X-ray image as a third reference luminance. The adjustment unit 968 adjusts the luminance of the first partial image data by subtracting the first reference luminance from the luminance of the first partial image data. Likewise, the adjustment unit 968 adjusts the luminance of the second partial image data by subtracting the second reference luminance from the luminance of the second partial image data, and adjusts the luminance of the third partial image data by subtracting the third reference luminance from the luminance of the third partial image data.
- the adjustment unit 968 inputs, to the first neural network 200, the estimation image data in which the luminance of the first partial image data, the second partial image data, and the third partial image data has been adjusted, as luminance-adjusted estimation image data.
- the first neural network 200 estimates the bone density of the bone appearing in the X-ray image indicated by the estimation image data, based on the estimation image data after the brightness adjustment.
- by reducing the luminance of the first partial image data indicating the first site, the second partial image data indicating the second site, and the third partial image data indicating the third site, the bone density of the bone shown in the X-ray image indicated by the estimation image data can be estimated more accurately.
- the adjustment unit 968 may input, to the first neural network 200, estimation image data in which the luminance of the first partial image data, the second partial image data, and the third partial image data is forcibly set to zero, as the luminance-adjusted estimation image data.
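A minimal sketch of this per-region luminance adjustment (the integer labels and reference luminances are hypothetical placeholders):

```python
import numpy as np

def adjust_luminance(image: np.ndarray, site_map: np.ndarray,
                     reference: dict) -> np.ndarray:
    """Subtract a stored reference luminance from each labeled region.

    site_map holds a per-pixel site label (e.g. 1 = implant, 2 = tumor,
    3 = fracture, 0 = other); reference maps a label to the typical
    luminance of that kind of region in a general X-ray image.
    """
    adjusted = image.astype(np.float32)
    for label, ref in reference.items():
        mask = site_map == label
        adjusted[mask] = np.clip(adjusted[mask] - ref, 0.0, None)
    return adjusted

img = np.full((4, 4), 200.0)
sites = np.zeros((4, 4), dtype=int)
sites[0, 0] = 1                      # one implant pixel
out = adjust_luminance(img, sites, {1: 180.0, 2: 90.0, 3: 60.0})
print(out[0, 0], out[1, 1])          # 20.0 200.0
```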
- the third neural network 950 may detect only one of an implant, a fracture, and a tumor. Further, the third neural network 950 may detect only two of the implant, the fracture, and the tumor. That is, the third neural network 950 may detect at least one of an implant, a fracture, and a tumor.
- the estimation device 1A may include the first neural network 200 and the third neural network 950 without including the second neural network 900. Further, the estimation device 1A may include at least one of the second neural network 900 and the third neural network 950 without including the first neural network 200.
- the estimation device 1A has been described in detail. However, the above description is an example in all aspects, and the present disclosure is not limited thereto. The various examples described above can be applied in combination as long as they do not conflict. It is also understood that innumerable examples not illustrated here can be envisaged without departing from the scope of the present disclosure.
- FIG. 12 is a diagram illustrating an example of a configuration of the estimation device 1B according to the present embodiment.
- the estimation device 1B has a fracture prediction unit 980.
- the fracture prediction unit 980 can predict the probability of a fracture based on the estimation result 300 of the neural network 200 of the estimation device 1 according to the first embodiment, for example.
- an arithmetic expression 990 showing the relationship between the estimation result (for example, bone density) related to bone density and the probability of fracture is obtained from past literature.
- the fracture prediction unit 980 stores an arithmetic expression 990.
- the fracture prediction unit 980 can predict the probability of a fracture based on the input estimation result 300 and the stored arithmetic expression 990.
- the arithmetic expression 990 may be an arithmetic expression that indicates a relationship between the estimation result related to the bone density and the probability of a fracture after bone screw implantation.
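As an illustration only, a stand-in for the arithmetic expression 990 could be a logistic curve on which a lower estimated bone density maps to a higher fracture probability; the coefficients below are invented, not taken from any literature:

```python
import math

def fracture_probability(bmd_g_cm2: float, slope: float = 15.0,
                         threshold: float = 0.7) -> float:
    """Toy stand-in for arithmetic expression 990: a logistic curve that
    maps an estimated bone density to a fracture probability. The slope
    and threshold are illustrative, not fitted values."""
    return 1.0 / (1.0 + math.exp(slope * (bmd_g_cm2 - threshold)))

print(round(fracture_probability(0.55), 3))  # low density -> high probability
```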
- the estimating device 1B may include the second neural network 900. In addition, the estimation device 1B may include a third neural network 950.
- the estimation device 1B has been described in detail, but the above description is an example in all aspects, and the present disclosure is not limited thereto.
- the various examples described above can be applied in combination as long as they do not conflict. It is also understood that innumerable examples not illustrated here can be envisaged without departing from the scope of the present disclosure.
- FIG. 13 illustrates a concept of the configuration of the estimation system 801 according to the present embodiment.
- the estimation system 801 of the present disclosure can estimate the future bone mass of a subject from an image of the subject such as an X-ray image, for example.
- the estimation system 801 of the present disclosure includes a terminal device 802 and an estimation device 803.
- the bone mass is an index related to the bone density, and is a concept including the bone density.
- the terminal device 802 can acquire the input information I to be input to the estimation device 803.
- the input information I may be, for example, an X-ray image.
- the terminal device 802 may be any device that allows a doctor or the like to capture an X-ray image of a subject.
- the terminal device 802 may be a simple X-ray imaging device (in other words, a general X-ray imaging device or an X-ray imaging device).
- the terminal device 802 is not limited to a simple X-ray imaging device.
- the terminal device 802 may be, for example, an X-ray fluoroscopy apparatus, a CT (Computed Tomography) apparatus, an MRI (Magnetic Resonance Imaging) apparatus, a SPECT (Single Photon Emission Computed Tomography)-CT apparatus, or a tomosynthesis apparatus.
- the input information I may be, for example, an X-ray fluoroscopic image, a CT image, an MRI image, a bone scintigraphy image, or a tomosynthesis image.
- the estimation system 801 is used, for example, for diagnosing osteoporosis of a patient who goes to a hospital.
- the estimation system 801 of the present disclosure captures a radiograph of a patient using, for example, the terminal device 802 installed in an X-ray room. The image data is then transferred from the terminal device 802 to the estimation device 803. Through the estimation device 803, not only the bone mass or bone density of the patient at the present time but also the patient's bone mass or bone density at a future point beyond the time of imaging can be estimated.
- the terminal device 802 does not have to directly transfer the input information I to the estimation device 803.
- the input information I acquired by the terminal device 802 may be stored in a storage medium, and the input information I may be input to the estimation device 803 via the storage medium.
- FIG. 14 illustrates the concept of the configuration of the estimation device 803 according to the present embodiment.
- the estimation device 803 can estimate the future bone mass or bone density of the subject based on the input information I acquired by the terminal device 802.
- the estimation device 803 can estimate the future bone mass or bone density of the subject from the image data acquired by the terminal device 802, and output the estimation result O.
- the estimation device 803 includes an input unit 831, an approximator 832, and an output unit 833.
- the input unit 831 is for inputting the input information I from the terminal device 802.
- the approximator 832 can estimate a future bone mass or bone density based on the input information I.
- the output unit 833 can output the estimation result O predicted by the approximator 832.
- the estimation device 803 has various electronic components and circuits, with which it can form each of its components. For example, each functional unit of the estimation device 803 can be configured by integrating a plurality of semiconductor elements into at least one integrated circuit (for example, an IC (Integrated Circuit) or an LSI (Large Scale Integration)), or by further combining a plurality of such integrated circuits into at least one unit.
- the plurality of electronic components may be, for example, active elements such as transistors or diodes or passive elements such as capacitors. Note that a plurality of electronic components and an integrated circuit formed by integrating them can be formed by a conventionally known method.
- the input unit 831 is for inputting information used in the estimation device 803. For example, input information I having an X-ray image acquired by the terminal device 802 is input to the input unit 831.
- the input unit 831 has a communication unit, and the input information I acquired by the terminal device 802 is directly input from the terminal device 802. Further, the input unit 831 may include an input device capable of inputting the input information I or other information.
- the input device may be, for example, a keyboard, a touch panel, a mouse, or the like.
- the approximator 832 estimates the future bone mass or bone density of the subject based on the information input to the input unit 831.
- the approximator 832 has AI (Artificial Intelligence).
- the approximator 832 has a program functioning as an AI, and various electronic components and circuits for executing the program.
- the approximator 832 has a neural network.
- the approximator 832 has learned in advance the relationship between input and output. That is, by applying machine learning to the approximator 832 using the learning data and the teacher data, the approximator 832 can calculate the estimation result O from the input information I.
- the learning data or the teacher data may be data corresponding to the input information I input to the estimation device 803 and the estimation result O output from the estimation device 803.
- FIG. 15 illustrates the concept of the configuration of the approximator 832 of the present disclosure.
- the approximator 832 has a first neural network 8321 and a second neural network 8322.
- the first neural network 8321 may be any neural network suitable for handling time-series information.
- the first neural network 8321 may be a ConvLSTM network that combines LSTM (Long Short-Term Memory) and CNN (Convolutional Neural Network).
- the second neural network 8322 may be, for example, a convolutional neural network configured by a CNN.
- the first neural network 8321 has an encoding unit E and a decoding unit D.
- the encoding unit E can extract a temporal change of the input information I and a feature amount of the position information.
- the decoding unit D can calculate a new feature amount based on the feature amount extracted by the encoding unit E, the time change of the input information I, and the initial value.
- FIG. 16 illustrates the concept of the configuration of the first neural network 8321 of the present disclosure.
- the encoding unit E has a plurality of ConvLSTM layers (Convolutional Long short-term memory) E1.
- the decoding unit D has a plurality of ConvLSTM layers (Convolutional Long Short-Term Memory) D1.
- Each of the encoding unit E and the decoding unit D may include three or more ConvLSTM layers E1 and D1. Further, the number of the plurality of ConvLSTM layers E1 and the number of the plurality of ConvLSTM layers D1 may be the same.
- the plurality of ConvLSTM layers E1 may have different learning contents.
- the plurality of ConvLSTM layers D1 may also have different learning contents. For example, one ConvLSTM layer learns detailed contents such as the change of a single pixel, and another ConvLSTM layer learns general contents such as the change of the entire image.
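A minimal sketch of such a stacked ConvLSTM encoder/decoder using the Keras ConvLSTM2D layer (the filter counts, image size, and sequence length are assumptions, not values from the disclosure):

```python
import tensorflow as tf

def build_convlstm_encoder_decoder(frames: int = 4, size: int = 64) -> tf.keras.Model:
    """Stacked ConvLSTM layers standing in for encoding unit E and
    decoding unit D of the first neural network 8321."""
    def convlstm():
        return tf.keras.layers.ConvLSTM2D(
            filters=8, kernel_size=(3, 3), padding="same", return_sequences=True)
    return tf.keras.Sequential([
        tf.keras.Input(shape=(frames, size, size, 1)),  # time series of images
        convlstm(), convlstm(), convlstm(),             # encoding unit E
        convlstm(), convlstm(), convlstm(),             # decoding unit D
    ])

model = build_convlstm_encoder_decoder()
model.summary()
```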
- FIG. 17 shows the concept of the configuration of the second neural network 8322.
- the second neural network 8322 has a conversion unit C.
- the conversion unit C can convert the feature amount calculated by the decoding unit D of the first neural network 8321 into a bone mass or a bone density.
- the conversion unit C has a plurality of convolutional layers C1, a plurality of pooling layers C2, and a fully connected layer C3.
- the fully connected layer C3 is located immediately before the output unit 833.
- the convolutional layers C1 and the pooling layers C2 are alternately arranged between the first neural network 8321 and the fully connected layer C3.
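A matching sketch of the conversion unit C, with convolutional and pooling layers arranged alternately ahead of a fully connected layer that emits a single bone mass or bone density value (layer sizes are assumed):

```python
import tensorflow as tf

# Hypothetical conversion unit C: alternating convolution/pooling layers,
# then a fully connected layer producing one value (bone mass or density).
conversion_unit = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 8)),                 # feature maps from D
    tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1),                          # fully connected layer C3
])
conversion_unit.summary()
```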
- the learning data is input to the encoding unit E of the approximator 832 when the approximator 832 learns.
- the teacher data is compared with output data output from the conversion unit C of the approximator 832 when the approximator 832 learns.
- the teacher data is data indicating a value measured using a conventional bone density measuring device.
- the output unit 833 can display the estimation result O.
- the output unit 833 is, for example, a liquid crystal display or an organic EL display.
- the output unit 833 can display various information such as characters, symbols, and graphics.
- the output unit 833 can display, for example, numbers or images.
- the estimation device 803 of the present disclosure further includes a control unit 834 and a storage unit 835.
- the control unit 834 can manage the operation of the estimating device 803 in an integrated manner by controlling other components of the estimating device 803.
- the control unit 834 includes, for example, a processor.
- the processor may be, for example, one or more processors, controllers, microprocessors, microcontrollers, application-specific integrated circuits (ASICs), digital signal processors, programmable logic devices, or a combination of any of these devices or configurations, or a combination of other known devices or configurations.
- the control unit 834 includes, for example, a CPU.
- the storage unit 835 includes a non-transitory recording medium readable by the CPU of the control unit 834, such as a RAM (Random Access Memory) or a ROM (Read-Only Memory).
- the storage unit 835 stores a control program for controlling the estimation device 803 such as firmware.
- the storage unit 835 may store input information I to be input, learning data to be learned, and teacher data.
- the processor of the control unit 834 can execute one or more data calculation procedures or processes according to the control program of the storage unit 835.
- various functions of the control unit 834 are realized by the CPU of the control unit 834 executing the control program in the storage unit 835.
- the control unit 834 may perform other processing as preprocessing of the calculation processing as necessary.
- the input information (hereinafter, also referred to as first input information I1) includes image data in which a bone whose bone mass or bone density is to be estimated is captured.
- the image data may be a simple X-ray image, for example.
- the estimation target of the bone mass or the bone density is, for example, a human.
- the first input information I1 is image data of a simple X-ray image showing a human bone.
- a simple X-ray image is a two-dimensional image, and is also called a general X-ray image or an X-ray image.
- the first input information I1 is preferably a simple X-ray image, which is relatively easy to obtain, but is not limited thereto.
- by using a CT (Computed Tomography) image, an MRI (Magnetic Resonance Imaging) image, a bone scintigraphy image, or a tomosynthesis image as the input information, it may be possible to estimate the bone mass or the bone density more accurately.
- the target of estimating bone mass or bone density may be other than human.
- the target for estimating bone mass or bone density may be an animal such as a dog, cat or horse.
- the target bones are mainly cortical bone and cancellous bone of biological origin, but the target bones may also include artificial bone containing calcium phosphate as a main component, or regenerated bone artificially produced by regenerative medicine or the like.
- the imaging site of the X-ray image may be, for example, the neck, chest, waist, proximal femur, knee, ankle, shoulder, elbow, wrist, finger, or jaw joint.
- the X-ray image may include a part other than the bone.
- for example, a chest simple X-ray image may include an image of the lungs and an image of the thoracic vertebrae.
- the X-ray image may be a front image in which the target portion is seen from the front, or a side image in which the target portion is seen from the side.
- the learning data or the teacher data may be data corresponding to the input information I input to the estimation device 803 and the estimation result O output from the estimation device 803.
- the learning data has the same type of information as the first input information I1. For example, if the first input information I1 is a simple X-ray image, the learning data only needs to have a simple X-ray image. Further, when the first input information I1 is a chest simple X-ray image, the learning data only needs to have a chest simple X-ray image.
- the learning data includes learning image data of a plurality of simple X-ray images showing bones. For example, the imaging sites of the plurality of learning image data include at least one of the neck, chest, waist, proximal femur, knee joint, ankle joint, shoulder joint, elbow joint, wrist joint, finger joint, and jaw joint.
- the learning data may include some types of image data among the 11 types of image data, or may include all types of image data.
- the plurality of learning image data may include a front image or a side image.
- the learning data shows the bones of different people.
- an actual measured value of the bone mass or the bone density of the subject in each of the learning image data is associated as teacher data.
- the actual measurement value of the bone mass or the bone density is measured at substantially the same time as when the learning image data was captured.
- the learning image data of the learning data may be a series of data having different time axes of the same person.
- the learning image data may include first learning data having an X-ray image of a bone, and second learning data that is an image of the same person as the first learning data and has an X-ray image captured after the first learning data.
- the learning image data of the learning data may be a data group in which the same part of another person is photographed and whose age is different. Further, the learning image data of the learning data may be a series of data having different time axes when the same person and the same part are imaged.
- as the learning data and the first input information I1, grayscale image data representing a simple X-ray image captured by a simple X-ray imaging apparatus (in other words, a general X-ray imaging apparatus) may be used after its number of pixels and number of gradations have been reduced. For example, consider a case where the image data has more than (1024 × 640) pixels, each represented by 16 bits. In this case, the number of pixels is reduced to, for example, (256 × 256), (1024 × 512), or (1024 × 640), the bit depth is reduced to 8 bits, and the resulting data is used as the first input information I1 and the learning data.
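A small sketch of this reduction using NumPy and Pillow (the sizes are the examples given above):

```python
import numpy as np
from PIL import Image

def reduce_image(raw16: np.ndarray, size=(256, 256)) -> np.ndarray:
    """Shrink a 16-bit grayscale radiograph and requantize it to 8 bits."""
    img8 = (raw16 >> 8).astype(np.uint8)          # 16-bit -> 8-bit gradations
    resized = Image.fromarray(img8).resize(size)  # e.g. 1024x640 -> 256x256
    return np.asarray(resized)

raw = np.random.randint(0, 2**16, size=(1024, 640), dtype=np.uint16)
out = reduce_image(raw)
print(out.shape, out.dtype)  # (256, 256) uint8
```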
- the teacher data includes, for each of the plurality of learning image data included in the learning data, a measured value of the bone mass or the bone density of the bone shown in the learning simple X-ray image indicated by that learning image data.
- the bone mass or the bone density may be measured by, for example, a DEXA (dual-energy X-ray absorptiometry) method or an ultrasonic method.
- the control unit 834 executes machine learning using the learning data and the teacher data for the approximator 832 so that the approximator 832 can calculate the estimation result O regarding the bone mass or the bone density from the input information I.
- the approximator 832 is optimized by known machine learning using teacher data.
- the approximator 832 adjusts the variable parameters in the approximator 832 so that the difference between the teacher data and the pseudo estimation result, which is calculated from the learning data input to the encoding unit E and output from the conversion unit C, becomes small.
- the control unit 834 inputs the learning data in the storage unit 835 to the encoding unit E.
- the control unit 834 inputs a plurality of pixel data constituting the learning image data to a plurality of artificial neurons constituting the encoding unit E, respectively.
- the control unit 834 adjusts the parameters so that the error of the estimation result O, which is output from the conversion unit C when the learning image data is input to the encoding unit E, with respect to the measured bone mass or bone density value corresponding to that learning image data becomes smaller.
- the adjusted parameter becomes a learned parameter and is stored in the storage unit 835.
- the parameters include, for example, parameters used in the encoding unit E, the decoding unit D, and the conversion unit C.
- specifically, the parameters include the weighting coefficients used in the ConvLSTM layers of the encoding unit E and the decoding unit D, and in the convolutional layers and the fully connected layer of the conversion unit C.
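A toy sketch of this parameter adjustment (the tiny stand-in model, random data, and file name are illustrative only; the actual approximator is the ConvLSTM/CNN structure described above):

```python
import numpy as np
import tensorflow as tf

# Stand-in model: parameters are tuned so that the error between the output
# and the measured bone density (teacher data) shrinks by backpropagation.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")  # mean squared error vs teacher data

images = np.random.rand(8, 64, 64, 1).astype("float32")  # stand-in learning data
measured_bmd = np.random.rand(8, 1).astype("float32")    # stand-in teacher data

model.fit(images, measured_bmd, epochs=10, verbose=0)    # adjust parameters
model.save_weights("learned_params.weights.h5")          # store learned parameters
```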
- the approximator 832 performs an operation based on the learned parameters for the input information I input to the encoding unit E, and outputs the estimation result O from the conversion unit C.
- the X-ray image data as the input information I is input to the encoding unit E
- a plurality of pixel data configuring the image data is input to a plurality of artificial neurons configuring the input unit 831.
- the ConvLSTM layer, the convolutional layer, and the fully connected layer can perform an operation using the weighting coefficient included in the learned parameter, and output the estimation result O.
- in the estimation system 801, learning of the approximator 832 and estimation of the bone mass or bone density by the approximator 832 are performed using image data of simple X-ray images. Therefore, the input information I can be input to the estimation system 801, and the future bone mass or bone density can be output as the estimation result O.
- the estimation result O of the estimation system 801 may be an estimation result of a certain day in the future with respect to the acquisition date of the input information I.
- the estimation system 801 can estimate the bone mass or the bone density from 3 months to 50 years after imaging, more preferably from 6 months to 10 years after imaging.
- the estimation result O may be output as a value.
- it may be represented by at least one of YAM (Young Adult Mean), T-score, and Z-score.
- the output unit 833 may output an estimated value represented by YAM, an estimated value represented by a T-score, or an estimated value represented by a Z-score.
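For reference, these scores relate a bone density value to reference populations; a sketch of the standard definitions (the reference means and standard deviations below are placeholders, not values from any real table):

```python
def yam_percent(bmd: float, young_adult_mean: float) -> float:
    """Bone density as a percentage of the Young Adult Mean."""
    return 100.0 * bmd / young_adult_mean

def t_score(bmd: float, young_adult_mean: float, young_adult_sd: float) -> float:
    """Standard deviations from the young-adult reference mean."""
    return (bmd - young_adult_mean) / young_adult_sd

def z_score(bmd: float, age_matched_mean: float, age_matched_sd: float) -> float:
    """Standard deviations from the age-matched reference mean."""
    return (bmd - age_matched_mean) / age_matched_sd

# Placeholder reference values, for illustration only.
print(yam_percent(0.68, 1.0), round(t_score(0.68, 1.0, 0.12), 2))  # 68.0 -2.67
```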
- the estimation result O may be output as an image.
- an X-ray image-like image may be displayed.
- the X-ray image-like image is an image imitating an X-ray image.
- a temporal change of the image can be predicted. Thereby, a future image can be generated from an X-ray image of a patient at one point in time.
- the learning data and the input information I may include an image of a visceral organ, a muscle, a fat, or a blood vessel in addition to the bone. Even in that case, highly accurate estimation can be performed.
- the first input information I1 may include the individual data (first individual data) of the subject.
- the first individual data may be, for example, age information, gender information, height information, weight information, or fracture history. As a result, highly accurate estimation can be performed.
- the first input information I1 may include the second individual data of the subject.
- the second individual data may include, for example, information on blood pressure, lipid, cholesterol, neutral fat, and blood sugar level. As a result, highly accurate estimation can be performed.
- the first input information I1 may include lifestyle information of the subject.
- the lifestyle information may be information such as drinking habits, smoking habits, exercise habits, and eating habits. As a result, highly accurate estimation can be performed.
- the first input information I1 may include bone metabolism information of the subject.
- the bone metabolism information may be, for example, bone resorption ability or bone formation ability.
- examples include bone resorption markers such as type I collagen cross-linked N-telopeptide (NTX), type I collagen cross-linked C-telopeptide (CTX), tartrate-resistant acid phosphatase (TRACP-5b), and deoxypyridinoline (DPD); bone formation markers such as bone alkaline phosphatase (BAP) and type I collagen cross-linked N-propeptide (P1NP); and bone matrix-related markers such as undercarboxylated osteocalcin (ucOC).
- the bone resorption marker may be measured using serum or urine as a specimen.
- second input information I2 relating to the future scheduled behavior of the subject may be further input as the input information I.
- the second input information I2 may be, for example, individual data that is scheduled to be improved or has been improved, or information on lifestyle habits, exercise habits, and eating habits that have been planned or improved.
- the second input information I2 may be information such as post-improvement weight, drinking habits, smoking habits, hours of sun exposure, steps or walking distance per day, intake of dairy products, or intake of foods rich in vitamin D such as fish and mushrooms.
- the estimation system 801 can indicate the estimation result O with an improved future bone mass or bone density.
- the second input information I2 may be, for example, information on a lifestyle that is scheduled to deteriorate.
- the estimation system can indicate the estimation result O with the deteriorated future bone mass or bone density.
- third input information I3 relating to therapy for a subject may be further input as input information I.
- the third input information I3 is, for example, information on physical therapy or pharmacotherapy. Specifically, the third input information I3 may be at least one of a calcium agent, a female hormone agent, a vitamin agent, a bisphosphonate agent, a SERM (Selective Estrogen Receptor Modulator) agent, a calcitonin agent, a thyroid hormone agent, and a denosumab agent.
- the estimation result O may include a first result O1 based only on the first input information I1, and a second result O2 based on the first input information I1 and at least one of the second and third input information I2 and I3.
- the estimation result O may output not only the future bone mass or bone density but also the current result. As a result, changes in bone mass or bone density over time can be compared.
- FIG. 18 illustrates the concept of the configuration of an approximator 832 according to another embodiment of the estimation system 1.
- the estimation device 803 of the estimation system 801 may include a first approximator 832a and a second approximator 832b. That is, a second approximator 832b may be provided in addition to the above-described approximator 832 (first approximator 832a).
- the second approximator 832b may be, for example, a CNN.
- the first approximator 832a outputs the first image and the first value to the first output unit 833a as the first estimation result O1.
- the second approximator 832b outputs, to the second output unit 833b, a second value estimated from the first image output by the first output unit 833a, as a second estimation result O2.
- the first value and the second value can be compared as the estimation result O of the future bone mass or bone density.
- the estimation system 801 may output a third value based on the first value and the second value as the estimation result O.
- for example, a result (third value) obtained by correcting the first value based on the second value can be used as the estimation result O.
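One hypothetical way to form such a third value (the blending weight is illustrative; the disclosure does not specify the correction):

```python
def third_value(first: float, second: float, weight: float = 0.5) -> float:
    """Hypothetical correction: blend the first approximator's value with
    the second approximator's value computed from the first image."""
    return (1.0 - weight) * first + weight * second

print(third_value(0.72, 0.68))  # 0.70 reported as the estimation result O
```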
- the estimation system 801 has been described in detail, but the above description is an example in all aspects, and the present disclosure is not limited thereto.
- the various examples described above can be applied in combination as long as they do not conflict. It is also understood that innumerable examples not illustrated here can be envisaged without departing from the scope of the present disclosure.
- Reference Signs List: 20 storage unit; 100 control program; 110, 910, 960 learned parameters; 120 estimation data; 130 learning data; 140 teacher data; 200 neural network; 210 input layer; 230 output layer; 280, 832 approximator; 500 processing device; 600 bone density estimation system; 801 estimation system; 802 terminal device; 803 estimation device; 831 input unit; 833 output unit; 834 control unit; 835 storage unit; 900, 8322 second neural network; 930 determination unit; 950 third neural network; 980 fracture prediction unit; 8321 first neural network; O estimation result; I input information; E encoding unit; D decoding unit; C conversion unit
Description
FIG. 1 is a block diagram showing an example of the configuration of a computer device 1 according to the first embodiment. The computer device 1 functions as an estimation device that estimates bone density. Hereinafter, the computer device 1 may be referred to as the "estimation device 1".
FIG. 3 is a diagram showing an example of the configuration of the neural network 200. In this example, the neural network 200 is, for example, a convolutional neural network (CNN). As shown in FIG. 3, the neural network 200 includes, for example, an input layer 210, a hidden layer 220, and an output layer 230. The hidden layer 220 is also called an intermediate layer. The hidden layer 220 includes, for example, a plurality of convolutional layers 240, a plurality of pooling layers 250, and a fully connected layer 260. In the neural network 200, the fully connected layer 260 is located immediately before the output layer 230, and the convolutional layers 240 and the pooling layers 250 are arranged alternately between the input layer 210 and the fully connected layer 260.
The estimation data 120 includes image data of a simple X-ray image in which the bone whose bone density is to be estimated appears. The bone density estimation target is, for example, a human, so the estimation data 120 can be said to include image data of a simple X-ray image showing a human bone. The learning data 130 includes image data of a plurality of simple X-ray images showing human bones. A simple X-ray image is a two-dimensional image and is also called a general X-ray image or a roentgen image. The bone density estimation target may be other than a human, for example, an animal such as a dog, a cat, or a horse. The target bones are mainly cortical bone and cancellous bone of biological origin, but may also include artificial bone containing calcium phosphate as a main component, or regenerated bone artificially produced by regenerative medicine or the like.
FIG. 5 is a diagram for explaining an example of the learning of the neural network 200. When training the neural network 200, the control unit 10 inputs the learning data 130 to the input layer 210 of the neural network 200, as shown in FIG. 5, and adjusts the variable parameters 110a in the neural network 200 so that the error of the output data 400 output from the output layer 230 with respect to the teacher data 140 becomes small. More specifically, the control unit 10 inputs each piece of learning image data in the storage unit 20 to the input layer 210; when doing so, it inputs the plurality of pixel data constituting the learning image data to the plurality of artificial neurons constituting the input layer 210, respectively. The control unit 10 then adjusts the parameters 110a so that the error of the output data 400 output from the output layer 230 with respect to the reference bone density corresponding to the input learning image data becomes small. As the method of adjusting the parameters 110a, for example, the error backpropagation method is employed. The adjusted parameters 110a become the learned parameters 110 and are stored in the storage unit 20. The parameters 110a include, for example, the parameters used in the hidden layer 220, specifically the filter coefficients used in the convolutional layers 240 and the weighting coefficients used in the fully connected layer 260. Note that the method of adjusting, in other words learning, the parameters 110a is not limited to this.
Claims (20)
- An estimation device comprising: an input unit to which input information having an image showing a bone is input; and an approximator capable of estimating, from the input information input to the input unit, an estimation result related to the bone density of the bone, wherein the approximator has learned parameters for obtaining the estimation result related to the bone density of the bone from the input information.
- The estimation device according to claim 1, wherein the input information has a first simple X-ray image.
- The estimation device according to claim 1 or 2, wherein the estimation result is represented by at least one of bone mineral density per unit area (g/cm2), bone mineral density per unit volume (g/cm3), YAM, T-score, and Z-score.
- The estimation device according to any one of claims 1 to 3, wherein the approximator is capable of estimating a future estimation result related to bone density.
- The estimation device according to claim 4, wherein the approximator has an encoding unit that extracts feature amounts of a temporal change of the input information and of position information, a decoding unit that calculates a new feature amount based on the feature amounts, the temporal change of the input information, and an initial value, and a conversion unit that converts the new feature amount into a bone density.
- The estimation device according to any one of claims 1 to 5, wherein the estimation result has an image.
- The estimation device according to claim 6, wherein the image has an X-ray image-like image.
- The estimation device according to claim 6 or 7, wherein the approximator has a first approximator capable of estimating the image and a first value as the estimation result, and a second approximator capable of estimating a second value as the estimation result from the image.
- The estimation device according to claim 8, wherein a third value based on the first value and the second value is output as the estimation result.
- The estimation device according to any one of claims 1 to 9, wherein the input information further has individual data of a subject.
- The estimation device according to any one of claims 4 to 10, wherein, when the input information is first input information, second input information relating to a future scheduled behavior of the subject is further input to the input unit.
- The estimation device according to any one of claims 4 to 11, wherein third input information relating to a therapy for the subject is further input to the input unit.
- The estimation device according to claim 11 or 12, wherein the estimation result has a first result based only on the first input information, and a second result based on the first input information and at least one of the second and third input information.
- The estimation device according to any one of claims 4 to 13, wherein the approximator has been trained using first learning data having the image, and second learning data that is an image of the same person as the first learning data and has the image captured after the first learning data.
- The estimation device according to any one of claims 1 to 14, wherein the approximator further has a third approximator capable of detecting at least one of a fracture of the bone, a tumor, and an implant from the input information input to the input unit, and the third approximator has learned parameters for obtaining the detection of at least one of the fracture of the bone, the tumor, and the implant from the input information.
- The estimation device according to any one of claims 1 to 15, further comprising a determination unit that determines, based on the estimation result of the approximator, whether or not the subject has osteoporosis.
- The estimation device according to any one of claims 1 to 16, further comprising a fracture prediction unit that predicts a fracture of the bone based on the estimation result of the approximator.
- The estimation device according to any one of claims 1 to 17, wherein the approximator divides the bone appearing in the image into sections and estimates a bone density for each divided section.
- An estimation system comprising: an input unit to which input information having an image showing a bone is input; and an approximator that has learned parameters for obtaining an estimation result related to the bone density of the bone from the input information and that is capable of estimating the estimation result related to the bone density of the bone from the input information input to the input unit, wherein, when input information is input to the input unit, the approximator performs arithmetic processing on the input information.
- An estimation program for causing a device to function as a neural network that performs an operation based on learned parameters for obtaining an estimation result related to the bone density of a bone from input information having an image showing the bone, and that outputs an estimated value of the bone density of the bone appearing in the image.
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19858846.9A EP3851048A4 (en) | 2018-09-10 | 2019-09-10 | ESTIMATING DEVICE, ESTIMATING SYSTEM, AND ESTIMATING PROGRAM |
JP2019563116A JP6744614B1 (ja) | 2018-09-10 | 2019-09-10 | 推定装置、推定システム及び推定プログラム |
EP23200806.0A EP4276753A3 (en) | 2018-09-10 | 2019-09-10 | Estimation device, estimation system, and estimation program |
AU2019339090A AU2019339090B2 (en) | 2018-09-10 | 2019-09-10 | Estimation apparatus, estimation system, and estimation program |
US17/274,757 US20220051398A1 (en) | 2018-09-10 | 2019-09-10 | Estimation apparatus, estimation system, and computer-readable non-transitory medium storing estimation program |
CN202311399597.7A CN117393146A (zh) | 2018-09-10 | 2019-09-10 | 推定装置、系统以及推定方法 |
CN201980058486.5A CN112654291A (zh) | 2018-09-10 | 2019-09-10 | 推定装置、推定系统以及推定程序 |
AU2022241515A AU2022241515B2 (en) | 2018-09-10 | 2022-09-28 | Estimation apparatus, estimation system, and estimation program |
AU2023285899A AU2023285899A1 (en) | 2018-09-10 | 2023-12-21 | Estimation apparatus, estimation system, and estimation program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-168502 | 2018-09-10 | ||
JP2018168502 | 2018-09-10 | ||
JP2018-220401 | 2018-11-26 | ||
JP2018220401 | 2018-11-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020054738A1 true WO2020054738A1 (ja) | 2020-03-19 |
Family
ID=69777881
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/035594 WO2020054738A1 (ja) | 2018-09-10 | 2019-09-10 | 推定装置、推定システム及び推定プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220051398A1 (ja) |
EP (2) | EP4276753A3 (ja) |
JP (16) | JP6744614B1 (ja) |
CN (2) | CN117393146A (ja) |
AU (3) | AU2019339090B2 (ja) |
WO (1) | WO2020054738A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220033373A (ko) * | 2020-09-09 | 2022-03-16 | 울산대학교 산학협력단 | 의료영상 처리 장치와 그 의료영상 학습 방법 및 의료영상 처리 방법 |
EP4056120A1 (en) | 2021-03-12 | 2022-09-14 | FUJI-FILM Corporation | Estimation device, estimation method, and estimation program |
KR20230007090A (ko) * | 2021-07-05 | 2023-01-12 | 순천향대학교 산학협력단 | Ai를 기반으로 하는 골밀도 변화 예측 장치 및 그 방법 |
WO2023224022A1 (ja) * | 2022-05-20 | 2023-11-23 | 国立大学法人大阪大学 | プログラム、情報処理方法、及び情報処理装置 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11741694B2 (en) * | 2020-06-09 | 2023-08-29 | Merative Us L.P. | Spinal fracture detection in x-ray images |
JP7418018B2 (ja) * | 2021-11-12 | 2024-01-19 | iSurgery株式会社 | 診断支援装置、およびコンピュータプログラム |
JP7322262B1 (ja) | 2022-08-30 | 2023-08-07 | ジーイー・プレシジョン・ヘルスケア・エルエルシー | 仮想単色x線画像を推論する装置、ctシステム、学習済みニューラルネットワークの作成方法、および記憶媒体 |
JP7383770B1 (ja) | 2022-08-31 | 2023-11-20 | ジーイー・プレシジョン・ヘルスケア・エルエルシー | 物質密度画像を推論する装置、ctシステム、記憶媒体、および学習済みニューラルネットワークの作成方法 |
WO2024090050A1 (ja) * | 2022-10-27 | 2024-05-02 | 富士フイルム株式会社 | 画像処理装置、方法およびプログラム、並びに学習装置、方法およびプログラム |
WO2024090584A1 (ja) * | 2022-10-28 | 2024-05-02 | 京セラ株式会社 | 画像生成装置、画像生成方法、表示装置、画像生成プログラム及び記録媒体 |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5259384A (en) * | 1992-07-30 | 1993-11-09 | Kaufman Jonathan J | Ultrasonic bone-assessment apparatus and method |
JPH07289597A (ja) * | 1994-04-22 | 1995-11-07 | Kiyotoshi Oshiro | Osteoporosis treatment device for bedridden patients |
CA2158033A1 (en) * | 1995-02-13 | 1996-08-14 | Joseph P. Bisek | Method for periprosthetic bone mineral density measurement |
JPH09248292A (ja) * | 1996-03-15 | 1997-09-22 | Asahi Chem Ind Co Ltd | Osteoporosis diagnosis device and osteoporosis diagnosis method |
US6064716A (en) * | 1997-09-05 | 2000-05-16 | Cyberlogic, Inc. | Plain X-ray bone densitometry apparatus and method |
JP3182558B2 (ja) * | 1998-01-06 | 2001-07-03 | 株式会社センサ | Bone mineral content evaluation method using ultrasonic measurement |
JP2000217874A (ja) | 1999-01-28 | 2000-08-08 | Yaskawa Electric Corp | Joint drive device |
US6430427B1 (en) | 1999-02-25 | 2002-08-06 | Electronics And Telecommunications Research Institute | Method for obtaining trabecular index using trabecular pattern and method for estimating bone mineral density using trabecular indices |
US6246745B1 (en) * | 1999-10-29 | 2001-06-12 | Compumed, Inc. | Method and apparatus for determining bone mineral density |
JP2002238904A (ja) | 2001-02-19 | 2002-08-27 | Tanita Corp | Bone density estimation method and bone density estimation device |
FR2836818B1 (fr) * | 2002-03-05 | 2004-07-02 | Eurosurgical | Method for visualizing and monitoring the balance of a spinal column |
EP1357480A1 (en) | 2002-04-17 | 2003-10-29 | Agfa-Gevaert | Osteoporosis screening method |
EP1605824A2 (en) * | 2003-03-25 | 2005-12-21 | Imaging Therapeutics, Inc. | Methods for the compensation of imaging technique in the processing of radiographic images |
US8290564B2 (en) * | 2003-09-19 | 2012-10-16 | Imatx, Inc. | Method for bone structure prognosis and simulated bone remodeling |
US7545965B2 (en) * | 2003-11-10 | 2009-06-09 | The University Of Chicago | Image modification and detection using massive training artificial neural networks (MTANN) |
KR20040058150A (ko) * | 2004-06-11 | 2004-07-03 | (주)엠텍 | Method and apparatus for measuring bone density using bone fidelity |
DE602004029211D1 (de) * | 2004-11-10 | 2010-10-28 | Agfa Healthcare | Method for performing measurements on digital images |
JP5426170B2 (ja) * | 2005-11-11 | 2014-02-26 | Hologic, Inc. | Estimating the risk of future fractures using a three-dimensional bone density model |
US7746976B2 (en) | 2005-12-30 | 2010-06-29 | Carestream Health, Inc. | Bone mineral density assessment using mammography system |
US20090285467A1 (en) | 2008-05-15 | 2009-11-19 | New Medical Co., Ltd. | Method for assessing bone status |
US8126249B2 (en) * | 2008-05-30 | 2012-02-28 | Optasia Medical Limited | Methods of and system for detection and tracking of osteoporosis |
JP2009285356A (ja) * | 2008-05-30 | 2009-12-10 | Institute Of National Colleges Of Technology Japan | Medical imaging system, image processing device, image processing method, and program |
WO2010117575A2 (en) * | 2009-04-07 | 2010-10-14 | Virginia Commonwealth University | Accurate pelvic fracture detection for x-ray and ct images |
EP2477503B1 (en) | 2009-09-17 | 2013-07-31 | Danone, S.A. | Process for preparing a pasteurised and fermented dairy product supplemented with calcium and vitamin d |
JP5827801B2 (ja) | 2010-12-29 | 2015-12-02 | Hitachi Aloka Medical, Ltd. | Medical measuring device |
JP2013085631A (ja) * | 2011-10-17 | 2013-05-13 | Konica Minolta Medical & Graphic Inc | Joint imaging device |
JP2014158628A (ja) * | 2013-02-20 | 2014-09-04 | Univ Of Tokushima | Image processing device, image processing method, control program, and recording medium |
JP6129145B2 (ja) * | 2014-12-03 | 2017-05-17 | Hitachi, Ltd. | Medical X-ray measuring device |
WO2016129682A1 (ja) * | 2015-02-13 | 2016-08-18 | Shimadzu Corporation | Bone analysis device |
FR3035785B1 (fr) * | 2015-05-07 | 2017-06-02 | Arts | Method for estimating the distribution of bone mineral density in at least part of an individual's skeleton |
WO2016194161A1 (ja) * | 2015-06-03 | 2016-12-08 | Hitachi, Ltd. | Ultrasonic diagnostic device and image processing method |
JP6280676B2 (ja) * | 2016-02-15 | 2018-02-14 | Keio University | Spinal alignment estimation device, spinal alignment estimation method, and spinal alignment estimation program |
JP2018005520A (ja) | 2016-06-30 | 2018-01-11 | Clarion Co., Ltd. | Object detection device and object detection method |
JP7057959B2 (ja) | 2016-08-09 | 2022-04-21 | Sumitomo Rubber Industries, Ltd. | Motion analysis device |
GB201702080D0 (en) | 2017-02-08 | 2017-03-22 | Sheffield Hallam Univ | Apparatus and method for quantitative determination of bone density |
CN107485405B (zh) * | 2017-08-18 | 2021-02-19 | Zhejiang Kangyuan Medical Instrument Co., Ltd. | Device for measuring bone density using a reference module |
JP6585869B1 (ja) | 2018-05-09 | 2019-10-02 | 威久 山本 | Method for predicting future bone mass, information processing device, and computer program |
JP7016293B2 (ja) * | 2018-06-08 | 2022-02-04 | FUJIFILM Corporation | Bone mineral information acquisition device, method, and program |
2019
- 2019-09-10 EP EP23200806.0A patent/EP4276753A3/en active Pending
- 2019-09-10 US US17/274,757 patent/US20220051398A1/en active Pending
- 2019-09-10 AU AU2019339090A patent/AU2019339090B2/en active Active
- 2019-09-10 CN CN202311399597.7A patent/CN117393146A/zh active Pending
- 2019-09-10 CN CN201980058486.5A patent/CN112654291A/zh active Pending
- 2019-09-10 EP EP19858846.9A patent/EP3851048A4/en active Pending
- 2019-09-10 WO PCT/JP2019/035594 patent/WO2020054738A1/ja unknown
- 2019-09-10 JP JP2019563116A patent/JP6744614B1/ja active Active
2020
- 2020-07-22 JP JP2020124995A patent/JP2020171785A/ja active Pending
2022
- 2022-05-02 JP JP2022075792A patent/JP7157425B2/ja active Active
- 2022-05-02 JP JP2022075793A patent/JP7157426B2/ja active Active
- 2022-09-28 AU AU2022241515A patent/AU2022241515B2/en active Active
- 2022-09-29 JP JP2022156224A patent/JP7260887B2/ja active Active
- 2022-09-29 JP JP2022156223A patent/JP7260886B2/ja active Active
- 2022-10-28 JP JP2022173247A patent/JP7217906B2/ja active Active
2023
- 2023-02-17 JP JP2023023058A patent/JP7264364B2/ja active Active
- 2023-02-17 JP JP2023023167A patent/JP7385228B2/ja active Active
- 2023-02-17 JP JP2023023176A patent/JP7385229B2/ja active Active
- 2023-02-17 JP JP2023023155A patent/JP7266230B2/ja active Active
- 2023-03-15 JP JP2023040752A patent/JP7283672B1/ja active Active
- 2023-03-16 JP JP2023042028A patent/JP7283673B1/ja active Active
- 2023-03-29 JP JP2023053286A patent/JP7292667B1/ja active Active
- 2023-03-29 JP JP2023053197A patent/JP7452825B2/ja active Active
- 2023-03-29 JP JP2023053222A patent/JP2023089022A/ja active Pending
- 2023-12-21 AU AU2023285899A patent/AU2023285899A1/en active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09508813A (ja) * | 1993-11-29 | 1997-09-09 | Arch Development Corporation | Method and system for computerized analysis of bone radiographs |
JP2007052774A (ja) * | 1995-07-25 | 2007-03-01 | Ortho-Clinical Diagnostics Inc | Computer-aided disease diagnosis method |
JP2002523204A (ja) | 1998-08-28 | 2002-07-30 | Arch Development Corporation | Computer-based method and apparatus for analyzing bone mass and structure |
US20010002925A1 (en) * | 1999-12-01 | 2001-06-07 | Cyberlogic, Inc. | Plain X-ray bone densitometry apparatus and method |
US6570955B1 (en) * | 2002-01-08 | 2003-05-27 | Cyberlogic, Inc. | Digital X-ray material testing and bone densitometry apparatus and method |
JP2008036068A (ja) | 2006-08-04 | 2008-02-21 | Hiroshima Univ | Osteoporosis diagnosis support device and method, osteoporosis diagnosis support program, computer-readable recording medium storing the osteoporosis diagnosis support program, and LSI for osteoporosis diagnosis support |
US20160015347A1 (en) * | 2014-07-21 | 2016-01-21 | Zebra Medical Vision Ltd. | Systems and methods for emulating DEXA scores based on CT images |
JP2018011958A (ja) * | 2016-07-21 | 2018-01-25 | Toshiba Medical Systems Corporation | Medical image processing device and medical image processing program |
KR20180029476A (ko) * | 2016-09-12 | 2018-03-21 | VUNO Inc. | Bone density estimation method and apparatus |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220033373A (ko) * | 2020-09-09 | 2022-03-16 | University of Ulsan Industry-Academic Cooperation Foundation | Medical image processing device, medical image learning method thereof, and medical image processing method |
KR102573893B1 (ko) * | 2020-09-09 | 2023-09-01 | Promedius Inc. | Medical image processing device and medical image processing method |
EP4056120A1 (en) | 2021-03-12 | 2022-09-14 | FUJIFILM Corporation | Estimation device, estimation method, and estimation program |
KR20230007090A (ko) * | 2021-07-05 | 2023-01-12 | Soonchunhyang University Industry-Academic Cooperation Foundation | AI-based bone density change prediction device and method |
KR102654088B1 (ko) * | 2021-07-05 | 2024-04-04 | Soonchunhyang University Industry-Academic Cooperation Foundation | AI-based bone density change prediction device and method |
WO2023224022A1 (ja) * | 2022-05-20 | 2023-11-23 | Osaka University | Program, information processing method, and information processing device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6744614B1 (ja) | Estimation device, estimation system, and estimation program | |
WO2020138085A1 (ja) | Disease prediction system | |
JP2024063171A (ja) | Estimation system, estimation method, estimation device, program, and recording medium | |
JP7479648B1 (ja) | Prediction device, prediction system, prediction method, control program, and recording medium | |
WO2023224022A1 (ja) | Program, information processing method, and information processing device | |
Ramos | Precision study of stochastic predictors for DXA scans
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase | Ref document number: 2019563116; Country of ref document: JP; Kind code of ref document: A
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 19858846; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
ENP | Entry into the national phase | Ref document number: 2019339090; Country of ref document: AU; Date of ref document: 20190910; Kind code of ref document: A
ENP | Entry into the national phase | Ref document number: 2019858846; Country of ref document: EP; Effective date: 20210412