WO2009148041A1 - Medical image processing apparatus and medical image processing method - Google Patents
Medical image processing apparatus and medical image processing method
- Publication number
- WO2009148041A1 (PCT/JP2009/060043)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- calculation unit
- medical
- motion
- medical image
- Prior art date
Links
- 238000012545 processing Methods 0.000 title claims description 51
- 238000000034 method Methods 0.000 title description 30
- 238000004364 calculation method Methods 0.000 claims abstract description 105
- 230000033001 locomotion Effects 0.000 claims description 155
- 238000000605 extraction Methods 0.000 claims description 59
- 238000005259 measurement Methods 0.000 claims description 22
- 230000003902 lesion Effects 0.000 claims description 10
- 238000003384 imaging method Methods 0.000 claims description 7
- 238000013500 data storage Methods 0.000 claims description 6
- 230000000877 morphologic effect Effects 0.000 claims description 4
- 238000003672 processing method Methods 0.000 claims description 4
- 239000000523 sample Substances 0.000 claims description 4
- 210000004204 blood vessel Anatomy 0.000 claims description 2
- 230000010349 pulsation Effects 0.000 claims description 2
- 230000000241 respiratory effect Effects 0.000 claims description 2
- 238000006243 chemical reaction Methods 0.000 claims 1
- 238000002059 diagnostic imaging Methods 0.000 claims 1
- 238000002604 ultrasonography Methods 0.000 abstract description 9
- 239000013598 vector Substances 0.000 description 59
- 210000002216 heart Anatomy 0.000 description 17
- 230000000694 effects Effects 0.000 description 14
- 238000004458 analytical method Methods 0.000 description 13
- 238000010586 diagram Methods 0.000 description 13
- 239000000284 extract Substances 0.000 description 13
- 230000002107 myocardial effect Effects 0.000 description 7
- 210000004165 myocardium Anatomy 0.000 description 7
- 238000003745 diagnosis Methods 0.000 description 5
- 238000002592 echocardiography Methods 0.000 description 5
- 238000000513 principal component analysis Methods 0.000 description 5
- 238000012880 independent component analysis Methods 0.000 description 4
- 230000005856 abnormality Effects 0.000 description 3
- 238000004422 calculation algorithm Methods 0.000 description 3
- 201000010099 disease Diseases 0.000 description 3
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 3
- 230000006870 function Effects 0.000 description 3
- 238000007689 inspection Methods 0.000 description 3
- 230000000747 cardiac effect Effects 0.000 description 2
- 230000008602 contraction Effects 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 210000000056 organ Anatomy 0.000 description 2
- 230000008054 signal transmission Effects 0.000 description 2
- 238000013528 artificial neural network Methods 0.000 description 1
- 230000002308 calcification Effects 0.000 description 1
- 210000005242 cardiac chamber Anatomy 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 208000029078 coronary artery disease Diseases 0.000 description 1
- 210000004351 coronary vessel Anatomy 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 230000000142 dyskinetic effect Effects 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000003483 hypokinetic effect Effects 0.000 description 1
- 208000028867 ischemia Diseases 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 208000010125 myocardial infarction Diseases 0.000 description 1
- 230000000737 periodic effect Effects 0.000 description 1
- 238000011002 quantification Methods 0.000 description 1
- 230000011218 segmentation Effects 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- 230000002123 temporal effect Effects 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30048—Heart; Cardiac
Definitions
- the present invention relates to a medical image processing apparatus and method for classifying cross-sectional information and tissue dynamics of medical images by image recognition.
- In image diagnosis, the examiner classifies medical images according to their cross-sectional information and the type of tissue dynamics in order to detect disease efficiently.
- Because cross-sectional information and tissue dynamics are characterized by the shape of the image, this classification can be performed using image recognition technology. For the examiner, interpreting images that have already been classified is more efficient than comparing them one by one.
- In a known technique, the type of heart cross section and tissue position information are acquired and classified by image recognition technology using feature values of the luminance of a still image. Furthermore, in this heart measurement, motion analysis of the living tissue is performed, the analysis result is displayed on the screen, and medical images that have been classified erroneously can be corrected by the examiner (for example, Patent Document 1).
- Patent Document 1 is limited to performing a motion analysis of the living tissue, displaying the analysis result on the screen, and correcting a medical image classified incorrectly by the examiner.
- Patent Document 1 does not consider any improvement in classification accuracy of medical images including moving biological tissues.
- An object of the present invention is to provide a medical image processing apparatus and a medical image processing method capable of improving the classification accuracy of medical images including a moving biological tissue.
- A medical image processing apparatus of the present invention includes an image information acquisition unit that obtains a plurality of medical images obtained by imaging a living tissue of a subject, and an image recognition calculation unit that obtains motion information of the living tissue from pixel values of different time phases of the obtained medical images and classifies the medical images into predetermined types based on the motion information.
- With this configuration, the image information acquisition unit obtains a plurality of medical images obtained by imaging the biological tissue of the subject, and the image recognition calculation unit obtains the motion information of the biological tissue from pixel values of different time phases of the obtained medical images. By classifying the medical images into predetermined types based on the motion information, the accuracy of image recognition of medical images can be improved.
- A medical image processing method of the present invention includes a step in which the image information acquisition unit obtains a plurality of medical images obtained by imaging the living tissue of the subject, and a step in which the image recognition calculation unit obtains motion information of the living tissue from pixel values of different time phases of the plurality of obtained medical images and classifies the medical images into predetermined types based on the motion information.
- According to the present invention, it is possible to provide a medical image processing apparatus and a medical image processing method capable of improving the classification accuracy of medical images including a moving biological tissue.
- FIG. 1 is a block diagram showing an outline of a medical image processing apparatus in Embodiment 1 of the present invention.
- 9 is a flowchart showing an outline of the operation of the medical image processing apparatus in Embodiment 2 of the present invention.
- 10 is a flowchart showing an outline of the operation of the medical image processing apparatus in Embodiment 3 of the present invention.
- FIG. 9 is a block diagram showing an outline of a medical image processing apparatus in Embodiment 4 of the present invention.
- FIG. 9 is a block diagram showing an outline of a medical image processing apparatus in Embodiment 5 of the present invention.
- 10 is a flowchart showing an outline of the operation of the medical image processing apparatus in Embodiment 5 of the present invention.
- FIG. 10 is a diagram showing a display example of stress echo inspection for explaining Example 6 of the present invention.
- 18 is a flowchart showing an outline of an image search operation of the medical image processing apparatus according to the seventh embodiment of the present invention.
- FIG. 12 is a diagram showing a display example in which the image with the highest similarity is displayed together with the image from the medical image generation unit.
- FIG. 6 is a diagram showing an example of a motion vector calculation method different from FIGS.
- the medical image is an ultrasonic image obtained from an ultrasonic diagnostic apparatus.
- the target of the moving biological tissue image is a heart image.
- an extraction period of heart motion is variably set in the ultrasonic diagnostic apparatus, and ultrasonic images are classified into predetermined types in the variably set extraction period of heart motion.
- FIG. 1 is a block diagram showing an outline of a medical image processing apparatus according to the first embodiment of the present invention.
- The medical image processing apparatus includes an ultrasonic image generation unit 1, an image recognition calculation unit 3 connected to the ultrasonic image generation unit 1 so as to be able to transmit signals, and an image display unit 5 connected to the image recognition calculation unit 3 so as to be able to transmit signals.
- Here, "capable of transmitting signals" means that signals can be transmitted between the connecting body and the connected body by any means, such as electromagnetic waves or light.
- the ultrasonic image generation unit 1 generates an ultrasonic image using an ultrasonic diagnostic apparatus.
- Ultrasonic images include B-mode images (tomographic images), M-mode images, and 3D mode images.
- The image recognition calculation unit 3 identifies the type of the ultrasonic image; specifically, it receives the image output from the ultrasonic image generation unit 1, extracts the feature amount of the motion of the entire input image, and performs processing to classify the type of cross section.
- The image recognition calculation unit 3 includes a motion extraction calculation unit 31 that is connected to the ultrasonic image generation unit 1 so as to be able to transmit signals and extracts the motion of the living tissue, a luminance extraction calculation unit 32 that is connected to the ultrasonic image generation unit 1 so as to be able to transmit signals and extracts the luminance of the tissue, a feature extraction calculation unit 33 that is connected to the motion extraction calculation unit 31 and the luminance extraction calculation unit 32 so as to be able to transmit signals, calculates a feature amount from each extracted quantity, and stores it in a memory that it holds, and an identification calculation unit 36 that is connected to the feature extraction calculation unit 33 so as to be able to transmit signals and identifies the type of the input ultrasonic image from the feature amount.
- FIG. 2 is a diagram showing a basic cross section acquired by echocardiography.
- The basic cross sections here are (a) the parasternal long-axis image, (b) the parasternal short-axis image, (c) the apex 2-chamber image, (d) the apex long-axis image, and (e) the apex 4-chamber image. If the classification of each of these images can be performed by an image recognition technique rather than by the examiner's manual operation, it is useful for reducing the burden of the examiner's diagnostic work during image measurement.
- The motion extraction calculation unit 31 calculates the motion in the extraction region as a set of motion vectors of individual points, for example as an average value obtained by the motion extraction calculation. For example, a motion calculation method using the block matching method or the gradient method is used.
- In the block matching method, a small region containing the image whose motion is to be detected is stored as a block. In the next frame, the image similarity is calculated for regions of the same size as the stored block, and this similarity calculation is performed over the entire frame. The region with the highest similarity is the region to which the image of the object whose motion is to be detected has moved, so the movement distance and movement direction are calculated from the coordinates of this region and the coordinates of the previously stored block. The amount of processing becomes enormous, but the motion can be detected with high accuracy.
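- As a rough illustration of the block matching idea just described, the following Python sketch exhaustively compares a stored block against candidate positions in the next frame and returns the displacement with the highest similarity (here, the lowest sum of absolute differences). It is a hypothetical sketch, not the patent's implementation; the function name `block_matching` and the `block_size` and `search_radius` parameters are our own choices.

```python
import numpy as np

def block_matching(prev_frame, next_frame, top_left, block_size=16, search_radius=8):
    """Estimate how far one block moved between two frames (hypothetical sketch)."""
    r0, c0 = top_left
    block = prev_frame[r0:r0 + block_size, c0:c0 + block_size].astype(np.float32)

    best_cost, best_disp = np.inf, (0, 0)
    rows, cols = next_frame.shape
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            r, c = r0 + dy, c0 + dx
            if r < 0 or c < 0 or r + block_size > rows or c + block_size > cols:
                continue
            candidate = next_frame[r:r + block_size, c:c + block_size].astype(np.float32)
            cost = np.abs(candidate - block).sum()  # sum of absolute differences
            if cost < best_cost:
                best_cost, best_disp = cost, (dy, dx)
    return best_disp  # the movement distance and direction follow from this vector
```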
- The gradient method searches for corresponding points using the spatio-temporal luminance gradient constraint of each pixel, based on the assumption that the gray-level pattern of the image is preserved under motion. It is an analysis method based on the equation relating the spatial gradient and the temporal gradient of the gray-level distribution in the image. Although the amount of calculation is small and high-speed processing is possible, the result tends to become noisy when estimating the flow of an object with large motion.
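- A minimal sketch of the gradient approach, under the assumption of a Lucas-Kanade style least-squares solution of the brightness-constancy equation Ix·u + Iy·v + It = 0 over a small window; the window size and the function name are our own, not taken from the patent.

```python
import numpy as np

def gradient_flow(prev_frame, next_frame, center, window=7):
    """Estimate the flow (u, v) at one point from spatio-temporal gradients (sketch)."""
    f0 = prev_frame.astype(np.float32)
    f1 = next_frame.astype(np.float32)

    # Spatial and temporal gradients of the gray-level distribution.
    Iy, Ix = np.gradient(f0)
    It = f1 - f0

    r, c = center
    h = window // 2
    ix = Ix[r - h:r + h + 1, c - h:c + h + 1].ravel()
    iy = Iy[r - h:r + h + 1, c - h:c + h + 1].ravel()
    it = It[r - h:r + h + 1, c - h:c + h + 1].ravel()

    # Solve [ix iy] [u v]^T = -it in the least-squares sense over the window.
    A = np.stack([ix, iy], axis=1)
    (u, v), *_ = np.linalg.lstsq(A, -it, rcond=None)
    return u, v  # horizontal and vertical motion components
```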
- The period over which the heart motion is calculated must be arbitrarily settable through the examiner's operation, because there are individual differences in the subjects' heart rates.
- a period with the largest amount of movement may be extracted, for example, from the end diastole to the end systole. If the next image data exists, the movement of the luminance value is similarly extracted.
- FIG. 2 shows an example of the motion vector (first motion vector) of the entire image from the end diastole to the end systole.
- the motion vector indicates how much the luminance vector Aj moves between different phases.
- Each motion vector can be represented by a movement component xj in the horizontal direction (X direction) of the drawing and a movement component yj in the vertical direction (Y direction) of the drawing.
- the feature value of the luminance value and the feature value of the motion are obtained by calculating the motion vector.
- A vector in which luminance values are arranged in the order of arbitrary extraction positions in the j-th image is denoted Aj, and its cross-section type is denoted Vj. The motion vector Aj is decomposed into the x and y directions of the drawing to give xj and yj; if polar coordinates are used instead, the radius rj and the angle θj may be used. The information Ij obtained from image j is thereby represented as a vector.
- The feature extraction calculation unit 33 extracts the feature amount of each imaged cross section V from the information Ij of the entire image. For example, it extracts the feature amount of each imaged cross section Vj by applying basic statistic calculations such as mean and variance, or principal component analysis or independent component analysis, to the motion vector Aj in each pixel of the entire ultrasound image.
- The targets of the principal component analysis are xj and yj of each pixel of the entire ultrasound image when the motion vector Aj is expressed in xy coordinates, and rj and θj of each pixel of the entire ultrasound image when it is expressed in polar coordinates.
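- The feature extraction could look roughly like the following sketch, in which the per-pixel motion components of one image are flattened into the information vector Ij and a set of such vectors is reduced by principal component analysis; the number of components and the helper names are assumptions, not specified by the patent.

```python
import numpy as np

def image_motion_vector(motion_x, motion_y):
    """Information vector Ij: the per-pixel motion components of one image, flattened."""
    return np.concatenate([motion_x.ravel(), motion_y.ravel()])

def pca_features(info_vectors, n_components=8):
    """Reduce a stack of information vectors (one row per image) by PCA (sketch).

    info_vectors : array of shape (n_images, n_values_per_image).
    Returns (scores, basis): per-image feature values and the principal axes.
    """
    X = np.asarray(info_vectors, dtype=np.float32)
    centered = X - X.mean(axis=0)
    # Principal axes from the SVD of the centered data matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    scores = centered @ basis.T          # feature amounts used for identification
    return scores, basis
```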
- the identification calculation unit 36 reads out the feature quantity of each cross section V from the memory, and identifies the type of the inputted ultrasonic image using the read feature quantity. Specifically, as in the example of FIG. 2 (c), the identification calculation unit 36 identifies the type of the ultrasound image as the apex two-chamber image from the feature amount.
- the image display unit 5 displays the ultrasonic images whose types are identified according to the types. Specifically, the image display unit 5 displays the apex 2-chamber image in which the type is identified.
- FIG. 3 is a flowchart showing an outline of the operation of the medical image processing apparatus according to the first embodiment of the present invention.
- the medical image generation unit 1 acquires an ultrasonic image of a predetermined time phase for one frame (step S31).
- the image recognition calculation unit 3 stores the ultrasonic image in a storage unit owned by itself (not shown) (step S32).
- the medical image generation unit 1 acquires one frame of ultrasonic images of a time phase different from the predetermined time phase (for example, a time phase next to the predetermined time phase) (step S33).
- the image recognition calculation unit 3 acquires a motion vector by the method described above from the ultrasonic images of different time phases and the stored ultrasonic image.
- the motion extraction calculation unit 31 of the image recognition calculation unit 3 performs an analysis process of the direction component of the acquired motion vector (step S34).
- the feature extraction calculation unit 33 of the image recognition calculation unit 3 applies a method based on calculation of basic statistics such as mean and variance, principal component analysis, and independent component analysis to the information Ij. Extract feature values.
- the identification calculation unit 36 identifies which basic cross section is an echocardiographic examination from the feature quantity, and generates a display format in which the basic cross section information and the ultrasonic image are associated with each other (step S35).
- the image display unit 5 displays the basic cross-sectional information and the ultrasonic image side by side according to the display format. Note that this display step is not an essential step when the ultrasonic images are classified and stored in the storage unit without being displayed (step 36).
- A unique effect of the present embodiment is that, because it uses only the minimum necessary components and the path through which the ultrasonic image passes for processing is the shortest, ultrasonic images of different time phases can be classified and processed in real time.
- FIG. 4 is a block diagram showing an outline of the medical image processing apparatus according to the second embodiment of the present invention.
- As a configuration added to that of the first embodiment, the medical image processing apparatus further includes a motion information acquisition unit 2 connected to the ultrasonic image generation unit 1 so as to be able to transmit signals.
- the motion information acquisition unit 2 divides the myocardial region of the heart on the ultrasound image by the computer executing a program.
- The computer stores data such as the ultrasound image from the ultrasound image generation unit 1 in a storage (not shown), and a CPU (not shown) executes the computer program to process the data and output the result to the image recognition calculation unit 3.
- the image recognition calculation unit 3 inputs a local image divided for each fraction output from the motion information acquisition unit 2 and outputs classified information.
- the examiner can set (select) the entire image and the local region of the image.
- the examiner selects the entire image when diagnosing the movement of the entire organ, and selects a local region of the image when diagnosing calcification or ischemia information of a part of the myocardium.
- FIG. 5 is a diagram showing the principle of dividing the myocardial region. As shown in FIG. 5, the myocardial region is divided into the 16 or 17 fractions recommended by the ASE (American Society of Echocardiography). The division is performed either manually by the examiner or by image recognition processing that uses the shape of the image as a feature amount.
- ASE: American Society of Echocardiography
- An example of a local motion vector of the image (second motion vector) is shown.
- This motion vector indicates how much the luminance value Aj moves between different time phases, and, like the first motion vector, it can be represented by a movement component xj in the horizontal direction (X direction) of the drawing and a movement component yj in the vertical direction (Y direction) of the drawing.
- The difference from the first motion vector is that the first motion vector is motion analysis information for the entire image, whereas the second motion vector is motion analysis information only for a local partial image, for example the 13th fraction of (c) the apex 2-chamber image.
- a vector in which luminance values are arranged in the order of arbitrary extraction positions in the k-th fraction of the j-th image is Ajk.
- the motion vector of the entire image from the end diastole to the end systole of the heart is different in each cross section.
- Aj is a vector in which luminance values are arranged in the order of arbitrary extraction positions in the j-th image.
- the type of cross section is Vj.
- The motion vector is decomposed orthogonally into xj and yj. If polar coordinates are used, the radius rj and the angle θj may be used.
- the information Ij obtained from the image j is represented as a vector as shown in Equation 2 above.
- Ij = (Aj) … (Equation 3) may be used, or, when only motion is used, the above Equation 1 may be used.
- the motion vector is decomposed orthogonally into xjk and yjk.
- If the motion vector is expressed in polar coordinates, the radius rjk and the angle θjk may be used.
- The information Ijk obtained from fraction k of image j is expressed as a vector as in Equation 4.
- When only luminance is used, Ijk = (Ajk) … (Equation 5) may be used.
- the first motion vector is a macro motion analysis for the entire sector image.
- The first motion vector and the second motion vector may therefore be combined, so that the second motion vector supplements the macro analysis with a micro motion analysis of the divided sector image.
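- As a rough sketch of how the macro (whole-image) and micro (per-fraction) information could be combined into a single feature vector, assuming simple concatenation (the patent leaves the exact combination open):

```python
import numpy as np

def combined_feature(global_motion, fraction_motions):
    """Concatenate first (global) and second (per-fraction) motion information (sketch).

    global_motion    : 1-D array, e.g. the (xj, yj) components for the whole image.
    fraction_motions : dict mapping fraction index k -> 1-D array of (xjk, yjk).
    """
    parts = [np.asarray(global_motion, dtype=np.float32)]
    for k in sorted(fraction_motions):       # fixed fraction order, e.g. the ASE 16 fractions
        parts.append(np.asarray(fraction_motions[k], dtype=np.float32))
    return np.concatenate(parts)
```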
- Steps S32 and S34 in the present embodiment are the same as those in the first embodiment, except that the following program is executed.
- The motion information acquisition unit 2 stores the ultrasonic image in a storage unit that it owns (step S32).
- the motion information acquisition unit 2 acquires a motion vector by the method described above from the ultrasonic images of different time phases and the stored ultrasonic images.
- the motion extraction calculation unit 31 of the image recognition calculation unit 3 performs an analysis process of the direction component of the acquired motion vector (step S34).
- With the addition of the motion information acquisition unit 2, motion abnormalities in a local region of an organ can be analyzed.
- In addition to the configuration of the second embodiment, the medical image processing apparatus further includes an external motion measurement unit 4 connected to the motion information acquisition unit 2 so as to be able to transmit signals.
- The external motion measurement unit 4 obtains motion measurement values of the living tissue electromagnetically, using devices including an electrocardiograph, a magnetocardiograph, a blood vessel pulsation measurement device, and a respiratory motion measurement device.
- Step S34 of this embodiment is the same as that of Embodiment 1 except that the following program is executed.
- The motion information acquisition unit 2 acquires the motion vector, by the method described above with reference to FIG. 4, from the ultrasonic images of different time phases, the stored ultrasonic images, and the measurement values measured by the external motion measurement unit 4 such as an electrocardiograph.
- the measurement value can be added to the calculation of the motion vector because the myocardial contraction / expansion signal by the electrocardiogram is obtained in synchronization with the image measurement.
- Since the ECG waveform can detect whether the heart beats earlier or later, if the acquisition timing is set to the R wave, the motion vector can be calculated from image data synchronized with the R wave measured by the external motion measurement unit 4, even when the time phase that differs from the predetermined time phase does not correspond to a periodic motion.
- the motion extraction calculation unit 31 performs an analysis process on the direction component of the acquired motion vector (step S34).
- a specific effect of the present embodiment is that a motion vector can be calculated even when the predetermined time phase and the different time phase have an indefinite period.
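- A minimal sketch of the R-wave synchronization described above, assuming the R-wave timestamps from the electrocardiograph and the acquisition times of the image frames are available; the helper name and data layout are our own. The frame nearest to each R wave is selected so that motion vectors are computed between comparable cardiac phases even when the rhythm is irregular.

```python
import numpy as np

def frames_at_r_waves(frame_times, r_wave_times):
    """Pick, for each R wave, the index of the frame acquired closest to it (sketch).

    frame_times  : 1-D array of frame acquisition times in seconds.
    r_wave_times : 1-D array of R-wave times from the ECG in seconds.
    Motion vectors can then be computed between consecutive gated frames
    instead of assuming a fixed heartbeat period.
    """
    frame_times = np.asarray(frame_times, dtype=np.float64)
    return [int(np.argmin(np.abs(frame_times - t))) for t in r_wave_times]
```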
- In addition to the configuration of the first embodiment, the medical image processing apparatus further includes a lesion type estimation unit 6 connected to the image recognition calculation unit 3 so as to be able to transmit signals.
- For a medical image that includes a lesion, diagnostic information is added in advance by image diagnosis, and the lesion type estimation unit 6 stores that diagnostic information and the medical image including the lesion in association with each other.
- Steps S35 and S36 of the present embodiment are the same as those of the first embodiment, except that the following program is executed.
- the feature extraction calculation unit 33 of the image recognition calculation unit 3 applies a method based on calculation of basic statistics such as mean and variance, principal component analysis, and independent component analysis to the information Ij. Extract feature values.
- the identification calculation unit 36 identifies which basic cross section is an echocardiographic examination from the feature amount, and outputs data in which the basic cross section information and the ultrasonic image are associated to the lesion type estimation unit 6.
- The lesion type estimation unit 6 calculates, for example, a known index called the wall motion score index. This index is used, for example, to determine whether the myocardium in the territory controlled by a coronary artery suffers from myocardial infarction in coronary artery disease.
- the lesion type estimation unit 6 generates a display format in which the disease information obtained using such an index is added to the ultrasound image (step S35).
- the image display unit 5 displays the index and the ultrasonic image side by side according to the display format. Note that this display step is not an essential step when the ultrasonic images are classified and stored in the storage unit without being displayed (step 36).
- a unique effect of the present embodiment is that the classification of the disease can be presented to the examiner.
- In addition to the configuration of the first embodiment, the medical image processing apparatus further includes, within the image recognition calculation unit 3, a learning calculation unit 34 connected to the feature extraction calculation unit 33 so as to be able to transmit signals, and a learning data storage unit 35 connected to the learning calculation unit 34 and the identification calculation unit 36 so as to be able to transmit signals.
- The learning calculation unit 34 has a known learning algorithm such as a neural network, and performs learning calculation on the feature values input from the feature extraction calculation unit 33.
- the learning data storage unit 35 is a storage device that stores the learning data calculated by the learning calculation unit 34, and is a hard disk or a memory.
- the identification calculation unit 36 identifies features extracted from the newly input image based on the learning data stored in the learning data storage unit 35.
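- The division of labor between the learning calculation unit, the learning data storage unit, and the identification calculation unit could look roughly like the following sketch, which stores labeled feature vectors as the learning data and identifies a new image by its nearest stored example. The patent names a neural network as one possible learning algorithm, so this nearest-neighbor stand-in is only an illustration; the class and function names are our own.

```python
import numpy as np

class LearningDataStore:
    """Learning data storage unit (sketch): holds feature vectors and their labels."""
    def __init__(self):
        self.features, self.labels = [], []

    def add(self, feature, label):
        self.features.append(np.asarray(feature, dtype=np.float32))
        self.labels.append(label)

def learn(store, training_samples):
    """Learning calculation unit (sketch): here it simply accumulates labeled features."""
    for feature, label in training_samples:
        store.add(feature, label)

def identify(store, feature):
    """Identification calculation unit (sketch): the nearest stored example wins."""
    feature = np.asarray(feature, dtype=np.float32)
    distances = [np.linalg.norm(feature - f) for f in store.features]
    return store.labels[int(np.argmin(distances))]   # e.g. "apex 2-chamber image"
```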
- FIG. 9 is a flowchart showing an outline of the operation of the medical image processing apparatus according to the fifth embodiment of the present invention.
- The present invention assumes a learning algorithm with a teacher signal (supervised learning), and the processing is divided into a part that learns from a plurality of images and a part that recognizes an input image based on the learning data.
- The luminance extraction calculation unit 32 extracts the luminance of the biological tissue portion of the image, and the motion extraction calculation unit 31 extracts the motion vector of the biological tissue portion of the image (step S91).
- luminance extraction calculation unit 32 may extract the luminance as it is, or may take an average value in a certain neighborhood.
- The time phase of the image can be selected freely; most commonly, an image at the R-wave time phase of the ECG (electrocardiogram), which allows the time phase to be detected, can be used.
- ECG: electrocardiogram
- If the examiner sets the extraction positions only in portions with high luminance values, such as the myocardium, the number of extraction points can be reduced and the calculation time shortened.
- the motion of the heart is calculated by the motion extraction calculation unit 31 by the motion extraction calculation at the extraction position.
- a motion calculation method using a block matching method or a gradient method is used.
- The motion calculation period can be set arbitrarily to match individual differences among subjects.
- the motion calculation period may be a period with the largest amount of movement, such as from the end diastole to the end systole. If the next image data exists, the luminance value and motion are similarly extracted (step S92).
- Feature extraction calculation unit 33 performs feature extraction calculation on information Ij obtained from image j (step S93).
- the feature extraction calculation unit 33 extracts features of each cross section V by applying, for example, basic statistic calculation such as mean and variance, principal component analysis, and independent component analysis.
- the learning calculation unit 34 performs learning calculation on the extracted features.
- the learned data is stored in the learning data storage unit 35, and the learning process ends (step S94).
- For recognition, as in learning, the luminance extraction calculation unit 32 first extracts the luminance of the biological tissue portion of the medical image input from the medical image generation unit 1, and the motion extraction calculation unit 31 extracts the motion vector of the biological tissue portion (step S95).
- the feature extraction calculation unit 33 performs feature extraction based on luminance and movement as in the learning (step S96).
- the identification calculation unit 36 collates the learning data with the characteristics of the input image and classifies the medical image into the most similar cross section (step S97).
- the image display unit 5 displays the classified section types together with the ultrasonic image, and ends the recognition process (step S98).
- The cross-section type can also be stored in association with the image data.
- a unique effect of the present embodiment is that the recognition rate can be improved by applying a learning algorithm using luminance and motion as a feature amount, as compared with recognition only of luminance.
- the medical image processing apparatus of the sixth embodiment is implemented with the same hardware configuration as that of the fifth embodiment.
- Example 6 is a method of performing recognition processing for each local area of the myocardium, not the entire image.
- FIG. 10 is a diagram showing a display example of a stress echo test for explaining the sixth embodiment of the present invention.
- On the screen 90, the myocardium is divided into 16 fractions; the examiner visually confirms the movement of the myocardium, scores it as good or poor, and the scores are displayed and recorded in the areas indicated by 91 and 92.
- The score is one of: not evaluated (no score), normal (1 point), hypokinetic (2 points), akinetic (3 points), or dyskinetic (4 points).
- The overall evaluation of wall motion is obtained by dividing the total of these points by the number of visualized fractions; this is the wall motion score index. Since this evaluation is performed visually for each fraction while switching cross sections, the operator is forced to perform a very laborious operation.
- Although the probe is operated by the examiner, the operator's workload is reduced by recognizing the cross-section type and performing the scoring by calculation processing.
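- The wall motion score index itself is a simple ratio, as described above: the sum of the per-fraction scores divided by the number of fractions that could be visualized (fractions without a score are excluded). A small sketch:

```python
def wall_motion_score_index(fraction_scores):
    """fraction_scores: dict mapping fraction number -> score (1-4), or None if not evaluated."""
    scored = [s for s in fraction_scores.values() if s is not None]
    if not scored:
        raise ValueError("no visualized fractions")
    return sum(scored) / len(scored)

# Example: 14 normal fractions and 2 akinetic fractions, all 16 visualized.
scores = {k: 1 for k in range(1, 15)}
scores.update({15: 3, 16: 3})
print(wall_motion_score_index(scores))  # (14*1 + 2*3) / 16 = 1.25
```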
- Steps S91 and S95 of the present embodiment are the same as those of the fifth embodiment, except that the following program is executed.
- the feature extraction calculation unit 33 recognizes the type of the cross section based on the luminance of the image calculated from the luminance extraction calculation unit 32 and the heart motion calculated from the motion extraction calculation unit 31.
- The myocardium is divided into fractions by the motion information acquisition unit 2, either by an existing automatic contour extraction process or manually for each fraction.
- The luminance extraction calculation unit 32 and the motion extraction calculation unit 31 arrange measurement points in each fraction and calculate the luminance and the motion in the same manner as in the fourth embodiment. That is, the luminance extraction calculation unit 32 extracts the luminance of the biological tissue portion of the image, and the motion extraction calculation unit 31 extracts the motion vector of the biological tissue portion of the image (step S91).
- That is, the processing of step S91 described above is added before step S95, the first step of the recognition processing.
- a unique effect of the present embodiment is that it is possible to automatically quantify local myocardial motion abnormalities that were difficult with still images by recognizing motion for each myocardial fraction.
- the accuracy of quantification can be improved by using both motion and luminance information.
- an image search function is added to the image recognition calculation unit 3. Except for the additional functions, the hardware configuration is the same as that of the fifth embodiment.
- the image search function of the seventh embodiment performs image search at the time of inspection and image search using a thumbnail search screen.
- the image retrieval method is based on a known image recognition technique.
- FIG. 11 is a flowchart showing an outline of an image search operation of the medical image processing apparatus according to the seventh embodiment of the present invention
- FIG. 12 is a display example when an image having the highest similarity is displayed on the image from the medical image generation unit.
- FIG. 13 is a diagram showing a display example when there are a plurality of images similar to the image from the medical image generation unit.
- The image recognition calculation unit 3 searches, within a specific range (for example, the same patient, the same examination date, or the same cross section), for the images specified by the search item 122 or 136, and checks whether the search target image exists.
- the search item 122 or 136 can be arbitrarily set by the examiner by operating a soft switch or pull-down menu on the screen with a pointing device (step S111).
- the image display unit 5 displays it to confirm whether the set search item is added to the image data (step S112).
- the image recognition calculation unit 3 performs recognition processing on the search item according to the first to fourth embodiments (step S113).
- The image recognition calculation unit 3 stores the classified types together with the image data (step S114).
- the image recognition calculation unit 3 collates the image data in which the set search item 122 or 136 and the classified type are stored together (step S115).
- the image recognition calculation unit 3 stores a search result that matches the search item 122 or 136.
- the image display unit 5 displays the image data to which the set search item is added (step S116).
- FIG. 12 shows a graphical user interface (GUI) 120 at the time of B-mode image examination in the display example.
- GUI graphical user interface
- The left search item A 122 is set to an apex 2-chamber image, and the right search item B 122 is set to an apex 4-chamber image acquired in the past.
- the image of each setting item is recalled and displayed, allowing comparison between the two screens.
- If the search item is a classified motion abnormality, it can also be used to search for similar lesions. As a result, the examiner is spared the trouble of visually selecting an image from a large amount of image data and diagnostic information.
- the screen 133 is for searching with thumbnail images.
- the upper left image 134 is a reference source image of a predetermined subject.
- a search item is specified in the search item 136
- a group of candidate images similar to the reference source image 134 with respect to the specified search item is displayed on the candidate screen 135.
- If the search item is the wall motion score index, images whose index values are close to that of the reference source image are displayed on the candidate screen 135. This facilitates comparison with images having similar lesions and dynamics, so the quality of the examination is improved and the burden on the examiner is reduced.
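- A sketch of the two search modes described here, under the assumption that each stored image record carries its classified cross-section type and its wall motion score index as metadata; the record layout and the function names are hypothetical.

```python
def search_images(records, patient_id=None, cross_section=None):
    """Filter stored image records by search items such as patient and cross-section type."""
    hits = []
    for rec in records:                      # rec: dict with 'patient', 'view', 'wmsi', 'image'
        if patient_id is not None and rec["patient"] != patient_id:
            continue
        if cross_section is not None and rec["view"] != cross_section:
            continue
        hits.append(rec)
    return hits

def rank_by_wmsi(records, reference_wmsi, top_k=6):
    """Thumbnail search: rank candidates by closeness of their wall motion score index."""
    return sorted(records, key=lambda r: abs(r["wmsi"] - reference_wmsi))[:top_k]
```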
- In the embodiments above, feature extraction is performed on all pixels of the ultrasonic image, but the following method, which uses sample points on the heart wall, may also be used.
- FIG. 14 is a diagram showing an example of a motion vector calculation method different from those in FIGS.
- Extraction positions (x marks) are spaced over the entire sector image in a grid pattern (the distance between extraction positions is d), and first motion vectors A to F are defined at the extraction positions that overlap, or lie near, the heart wall, heart chambers, and the like in the sector image.
- the moving distance and moving direction are calculated for the contraction period and the expansion period of the heartbeat.
- the relationship between the movement distance and the movement direction of the first motion vectors A to F is as follows.
- Although each embodiment has been described using an ultrasonic tomographic image (B-mode image) as an example, the technique can also be applied to various medical image diagnostic apparatuses that capture elastography images, X-ray images, X-ray CT images, and MRI images, and it can be applied directly to a reference image of a real-time virtual sonography (RVS) image.
- RVS real-time virtual sonography
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- High Energy & Nuclear Physics (AREA)
- Theoretical Computer Science (AREA)
- Physiology (AREA)
- Quality & Reliability (AREA)
- Multimedia (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
- Image Processing (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
Description
If the classification of each of the above images can be performed by image recognition technology rather than by the examiner's manual operation, it is useful for reducing the burden of the examiner's diagnostic work during image measurement.
Although the amount of processing becomes enormous, the motion can be detected with high accuracy.
In each cross section, the feature amount of the luminance values and the feature amount of the motion are obtained by this motion vector calculation.
The motion vector Aj is decomposed into the x and y directions of the drawing to give xj and yj. If polar coordinates are used instead of an orthogonal coordinate system, the radius rj and the angle θj may be used. The information Ij obtained from image j is thereby expressed as a vector as in Equation 1.
When luminance is used in combination, the output of the luminance extraction calculation unit 32 is used, and the information is expressed as in Equation 2. For example, the output of the luminance extraction calculation unit 32 uses the amplitude value of the ultrasonic signal as the morphological information of the living tissue.
If motion and luminance are used together, improved identification accuracy can be expected for every basic cross section in the echocardiographic examination.
Ij = (Aj) … (Equation 3)
may be used, or, when only motion is used, the aforementioned Equation 1 may be used.
The motion vector is decomposed orthogonally into xjk and yjk. If it is expressed in polar coordinates, the radius rjk and the angle θjk may be used instead. The information Ijk obtained from fraction k of image j is thereby expressed as a vector as in Equation 4.
When only luminance is used, Equation 5 may be used; when only motion is used, Equation 6 may be used:
Ijk = (xjk | yjk) … (Equation 6)
In other words, the first motion vector is a macro motion analysis of the entire sector image. There is a demand to improve the accuracy of this macro classification further. Therefore, the first and second motion vectors may be combined so that the second motion vector supplements the macro analysis with a micro motion analysis of the divided sector image.
The lesion type estimation unit 6 generates a display format in which the disease information obtained using such an index is added to the ultrasound image (step S35).
In the motion extraction calculation unit 31, the motion is calculated by the motion extraction calculation at the extraction positions. For example, a motion calculation method using the block matching method or the gradient method is used. The motion calculation period can be set arbitrarily to match individual differences among subjects. For example, the motion calculation period may be the period with the largest amount of movement, such as from end diastole to end systole. If the next image data exists, the luminance values and the motion are similarly extracted (step S92).
By performing the recognition of the cross-section type and the scoring by calculation processing, the operator's workload is reduced.
The image recognition calculation unit 3 collates the image data in which the set search item 122 or 136 and the classified type are stored together (step S115).
First vector B: movement distance d/2, movement direction 3 o'clock
First vector C: movement distance d/2, movement direction 11 o'clock
First vector D: movement distance d/2, movement direction 1 o'clock
First vector E: movement distance d/2, movement direction 9 o'clock
First vector F: movement distance d/2, movement direction 9 o'clock
If all six vectors measured for a sector image match these patterns, the sector image is classified as an apex 2-chamber image. If five of them match, the sector image is classified as a candidate for an apex 2-chamber image. With four or fewer matches, the image may be something other than an apex 2-chamber image, so vector analysis from the extraction positions is performed for (a) the parasternal long-axis image, (b) the parasternal short-axis image, (d) the apex long-axis image, and (e) the apex 4-chamber image, in the same way as for (c) the apex 2-chamber image, to determine into which image type the sector image should be classified.
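A sketch of the template test described above, assuming the measured vectors are compared against the tabulated distance and clock-direction patterns with some tolerance; the tolerance value is our own choice, image coordinates are assumed to have y increasing downward, and the pattern for vector A, which is not listed in the text, is omitted.

```python
import math

def clock_direction(dx, dy):
    """Convert a displacement to a clock direction (12 at the top, 3 to the right)."""
    angle = math.degrees(math.atan2(dx, -dy)) % 360   # 0 deg = 12 o'clock, clockwise
    return round(angle / 30) % 12 or 12

def count_matches(vectors, pattern, dist_tol=0.25):
    """Count how many measured vectors agree with the stored (distance, clock) pattern.

    vectors : dict name -> (dx, dy, distance) measured for one sector image
    pattern : dict name -> (expected_distance, expected_clock), e.g. {"B": (0.5 * d, 3), ...}
    """
    hits = 0
    for name, (dx, dy, dist) in vectors.items():
        exp_dist, exp_clock = pattern[name]
        if abs(dist - exp_dist) <= dist_tol * exp_dist and clock_direction(dx, dy) == exp_clock:
            hits += 1
    return hits   # 6 matches -> apex 2-chamber image; 5 -> candidate; fewer -> test other views
```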
Claims (12)
- A medical image processing apparatus comprising: an image information acquisition unit that obtains a plurality of medical images obtained by imaging a living tissue of a subject; and an image recognition calculation unit that obtains motion information of the living tissue from pixel values of different time phases of the plurality of obtained medical images and classifies the medical images into predetermined types based on the motion information.
- The medical image processing apparatus according to claim 1, further comprising a motion information acquisition unit that obtains the motion information of the living tissue from pixel values of local regions of different time phases of the plurality of medical images obtained from the image information acquisition unit, wherein the image recognition calculation unit classifies the medical images into predetermined types based on the motion information obtained by the motion information acquisition unit.
- The medical image processing apparatus according to claim 1, further comprising an external motion measurement device that obtains motion measurement values of the living tissue electromagnetically, including an electrocardiograph, a magnetocardiograph, a blood vessel pulsation measurement device, and a respiratory motion measurement device, wherein the motion information acquisition unit obtains the motion information of the living tissue from the motion measurement values of the living tissue measured by the external motion measurement device.
- The medical image processing apparatus according to claim 1, further comprising a lesion type estimation unit that stores, in association with each other, diagnostic information added in advance by image diagnosis to a medical image including a lesion among the medical images and the medical image including the lesion, wherein the image recognition calculation unit classifies the medical images into types estimated for the lesion based on the medical image including the lesion and the motion information of the living tissue.
- The medical image processing apparatus according to claim 1, further comprising an extraction period setting unit that variably sets an extraction period of the motion of the living tissue, wherein the image recognition calculation unit classifies the medical images into predetermined types within the variably set extraction period of the motion of the living tissue.
- The medical image processing apparatus according to claim 1, further comprising an extraction region setting unit that sets, in the ultrasonic diagnostic apparatus, the extraction region of the motion of the living tissue to either the whole or a local region of the target image, wherein the image recognition calculation unit classifies the medical images into predetermined types within the set extraction region of the motion of the living tissue.
- The medical image processing apparatus according to claim 1, wherein the image recognition calculation unit classifies the medical images into predetermined types based on morphological information of the living tissue and the motion information.
- The medical image processing apparatus according to claim 1, wherein the image recognition calculation unit further comprises: a learning calculation unit that obtains a pattern for classifying the medical images into predetermined types; a learning data storage unit that stores the obtained pattern; and an identification calculation unit that, when a subsequent event requires the pattern to be updated, causes the learning calculation unit to obtain the pattern again and causes the learning data storage unit to update and store the newly obtained pattern.
- The medical image processing apparatus according to claim 1, further comprising an image display unit that displays the medical images classified into the predetermined types by the image recognition calculation unit.
- The medical image processing apparatus according to claim 1, wherein the image information acquisition unit is an ultrasonic diagnostic apparatus comprising: a probe that transmits an ultrasonic signal to the subject and receives a reflected echo signal from the subject; a probe driving unit that drives the probe to transmit the ultrasonic signal; and an image conversion unit that converts the reflected echo signal into ultrasonic image data, and wherein the image recognition calculation unit uses the amplitude value of the ultrasonic signal as morphological information of the living tissue and classifies ultrasonic images into predetermined types based on the motion information of the living tissue and the morphological information.
- The medical image processing apparatus according to claim 1, wherein the image recognition calculation unit displays images retrieved according to a set search item.
- A medical image processing method comprising: a step of obtaining, by an image information acquisition unit, a plurality of medical images obtained by imaging a living tissue of a subject; and a step of obtaining, by an image recognition calculation unit, motion information of the living tissue from pixel values of different time phases of the plurality of obtained medical images and classifying the medical images into predetermined types based on the motion information.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200980120710.5A CN102056547B (zh) | 2008-06-03 | 2009-06-02 | 医用图像处理装置及医用图像处理方法 |
EP09758308A EP2298176A4 (en) | 2008-06-03 | 2009-06-02 | MEDICAL IMAGE PROCESSING DEVICE AND MEDICAL IMAGE PROCESSING METHOD |
US12/996,136 US20110082371A1 (en) | 2008-06-03 | 2009-06-02 | Medical image processing device and medical image processing method |
JP2010515868A JP5438002B2 (ja) | 2008-06-03 | 2009-06-02 | 医用画像処理装置及び医用画像処理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008145456 | 2008-06-03 | ||
JP2008-145456 | 2008-06-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009148041A1 true WO2009148041A1 (ja) | 2009-12-10 |
Family
ID=41398118
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/060043 WO2009148041A1 (ja) | 2008-06-03 | 2009-06-02 | 医用画像処理装置及び医用画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20110082371A1 (ja) |
EP (1) | EP2298176A4 (ja) |
JP (1) | JP5438002B2 (ja) |
CN (1) | CN102056547B (ja) |
WO (1) | WO2009148041A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11049251B2 (en) | 2018-08-31 | 2021-06-29 | Fujifilm Corporation | Apparatus, method, and program for learning discriminator discriminating infarction region, discriminator for discriminating infarction region, and apparatus, method, and program for discriminating infarction region |
JP2021137116A (ja) * | 2020-03-02 | 2021-09-16 | キヤノン株式会社 | 画像処理装置、医用画像診断装置、画像処理方法、プログラム、および学習装置 |
JP2021137115A (ja) * | 2020-03-02 | 2021-09-16 | キヤノン株式会社 | 画像処理装置、医用画像診断装置、画像処理方法、プログラム、および学習装置 |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5586203B2 (ja) * | 2009-10-08 | 2014-09-10 | 株式会社東芝 | 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム |
WO2011122200A1 (ja) | 2010-03-29 | 2011-10-06 | ソニー株式会社 | データ処理装置およびデータ処理方法、画像処理装置および方法、並びに、プログラム |
CN102682091A (zh) * | 2012-04-25 | 2012-09-19 | 腾讯科技(深圳)有限公司 | 基于云服务的视觉搜索方法和系统 |
EP2767233A1 (en) * | 2013-02-15 | 2014-08-20 | Koninklijke Philips N.V. | Device for obtaining respiratory information of a subject |
JP5992848B2 (ja) * | 2013-03-06 | 2016-09-14 | 富士フイルム株式会社 | 体動表示装置および方法 |
CN104156975B (zh) * | 2013-05-13 | 2018-04-24 | 东芝医疗系统株式会社 | 医学图像分析装置和方法以及医学成像设备 |
CN104462149B (zh) * | 2013-09-24 | 2020-03-27 | 上海联影医疗科技有限公司 | 一种图像处理方法及图像处理装置 |
CN104068845B (zh) * | 2014-03-06 | 2016-07-06 | 武汉培威医学科技有限公司 | 一种心电超声信号融合断层扫描成像系统及方法 |
CN106470613B (zh) * | 2014-07-02 | 2020-05-05 | 皇家飞利浦有限公司 | 用来针对特定对象表征病理的病变签名 |
WO2016075586A1 (en) * | 2014-11-14 | 2016-05-19 | Koninklijke Philips N.V. | Ultrasound device for sonothrombolysis therapy |
JP6591195B2 (ja) * | 2015-05-15 | 2019-10-16 | キヤノンメディカルシステムズ株式会社 | 超音波診断装置及び制御プログラム |
CA3021697A1 (en) | 2016-04-21 | 2017-10-26 | The University Of British Columbia | Echocardiographic image analysis |
CN110678116B (zh) * | 2017-06-05 | 2022-11-04 | 索尼公司 | 医疗系统和控制单元 |
CN110504025A (zh) * | 2018-05-16 | 2019-11-26 | 漫迪医疗仪器(上海)有限公司 | 基于生物磁的生物特征码的识别方法及系统、识别终端 |
CN110772280B (zh) * | 2018-07-31 | 2023-05-23 | 佳能医疗系统株式会社 | 超声波诊断装置和方法以及图像处理装置和方法 |
US10751029B2 (en) | 2018-08-31 | 2020-08-25 | The University Of British Columbia | Ultrasonic image analysis |
WO2020257046A1 (en) * | 2019-06-21 | 2020-12-24 | West Virginia University | Cardiac ultrasonic fingerprinting: an approach for highthroughput myocardial feature phenotyping |
CN110647849B (zh) * | 2019-09-26 | 2022-02-08 | 深圳先进技术研究院 | 一种神经调控结果预测方法、装置及终端设备 |
WO2021056342A1 (zh) * | 2019-09-26 | 2021-04-01 | 深圳先进技术研究院 | 一种神经调控结果预测方法、装置及终端设备 |
CA3164059A1 (en) | 2019-12-30 | 2021-07-08 | Medo Dx Pte. Ltd | Apparatus and method for automated analyses of ultrasound images |
CN112100416A (zh) * | 2020-11-09 | 2020-12-18 | 南京诺源医疗器械有限公司 | 一种医学荧光成像影像数据分类云存储系统及其存储方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58180138A (ja) * | 1982-04-16 | 1983-10-21 | 株式会社日立メデイコ | 超音波診断装置 |
JP2002140689A (ja) | 2000-10-31 | 2002-05-17 | Toshiba Corp | 医用画像処理装置及びその方法 |
JP2004313291A (ja) * | 2003-04-14 | 2004-11-11 | Toshiba Corp | 超音波診断装置、医用画像解析装置及び医用画像解析方法 |
JP2006110190A (ja) * | 2004-10-15 | 2006-04-27 | Toshiba Corp | 医用画像データ解析装置及びその方法 |
JP2007530160A (ja) * | 2004-03-23 | 2007-11-01 | シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド | 医用画像のための自動決定支援を提供するシステムおよび方法 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6859548B2 (en) * | 1996-09-25 | 2005-02-22 | Kabushiki Kaisha Toshiba | Ultrasonic picture processing method and ultrasonic picture processing apparatus |
JP3502513B2 (ja) * | 1996-09-25 | 2004-03-02 | 株式会社東芝 | 超音波画像処理方法および超音波画像処理装置 |
US5984870A (en) * | 1997-07-25 | 1999-11-16 | Arch Development Corporation | Method and system for the automated analysis of lesions in ultrasound images |
JP4614548B2 (ja) * | 2001-01-31 | 2011-01-19 | パナソニック株式会社 | 超音波診断装置 |
JP4060615B2 (ja) * | 2002-03-05 | 2008-03-12 | 株式会社東芝 | 画像処理装置及び超音波診断装置 |
JP3715580B2 (ja) * | 2002-03-19 | 2005-11-09 | 株式会社東芝 | 医用運動解析装置及びその方法 |
SG165160A1 (en) * | 2002-05-06 | 2010-10-28 | Univ Johns Hopkins | Simulation system for medical procedures |
US7558402B2 (en) * | 2003-03-07 | 2009-07-07 | Siemens Medical Solutions Usa, Inc. | System and method for tracking a global shape of an object in motion |
US7912528B2 (en) * | 2003-06-25 | 2011-03-22 | Siemens Medical Solutions Usa, Inc. | Systems and methods for automated diagnosis and decision support for heart related diseases and conditions |
EP1665987B1 (en) * | 2003-09-12 | 2016-11-09 | Hitachi, Ltd. | Ultrasonograph |
CN1934589A (zh) * | 2004-03-23 | 2007-03-21 | 美国西门子医疗解决公司 | 为医学成像提供自动决策支持的系统和方法 |
WO2006034366A1 (en) * | 2004-09-21 | 2006-03-30 | Siemens Medical Solutions Usa, Inc. | Hierarchical medical image view determination |
JP4912807B2 (ja) * | 2006-09-22 | 2012-04-11 | 株式会社東芝 | 超音波画像診断装置 |
-
2009
- 2009-06-02 WO PCT/JP2009/060043 patent/WO2009148041A1/ja active Application Filing
- 2009-06-02 US US12/996,136 patent/US20110082371A1/en not_active Abandoned
- 2009-06-02 EP EP09758308A patent/EP2298176A4/en not_active Withdrawn
- 2009-06-02 CN CN200980120710.5A patent/CN102056547B/zh not_active Expired - Fee Related
- 2009-06-02 JP JP2010515868A patent/JP5438002B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58180138A (ja) * | 1982-04-16 | 1983-10-21 | 株式会社日立メデイコ | 超音波診断装置 |
JP2002140689A (ja) | 2000-10-31 | 2002-05-17 | Toshiba Corp | 医用画像処理装置及びその方法 |
JP2004313291A (ja) * | 2003-04-14 | 2004-11-11 | Toshiba Corp | 超音波診断装置、医用画像解析装置及び医用画像解析方法 |
JP2007530160A (ja) * | 2004-03-23 | 2007-11-01 | シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド | 医用画像のための自動決定支援を提供するシステムおよび方法 |
JP2006110190A (ja) * | 2004-10-15 | 2006-04-27 | Toshiba Corp | 医用画像データ解析装置及びその方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2298176A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11049251B2 (en) | 2018-08-31 | 2021-06-29 | Fujifilm Corporation | Apparatus, method, and program for learning discriminator discriminating infarction region, discriminator for discriminating infarction region, and apparatus, method, and program for discriminating infarction region |
JP2021137116A (ja) * | 2020-03-02 | 2021-09-16 | キヤノン株式会社 | 画像処理装置、医用画像診断装置、画像処理方法、プログラム、および学習装置 |
JP2021137115A (ja) * | 2020-03-02 | 2021-09-16 | キヤノン株式会社 | 画像処理装置、医用画像診断装置、画像処理方法、プログラム、および学習装置 |
JP7412223B2 (ja) | 2020-03-02 | 2024-01-12 | キヤノン株式会社 | 画像処理装置、医用画像診断装置、画像処理方法、プログラム、および学習装置 |
JP7516072B2 (ja) | 2020-03-02 | 2024-07-16 | キヤノン株式会社 | 画像処理装置、医用画像診断装置、画像処理方法、プログラム、および学習装置 |
Also Published As
Publication number | Publication date |
---|---|
CN102056547B (zh) | 2014-05-14 |
EP2298176A1 (en) | 2011-03-23 |
US20110082371A1 (en) | 2011-04-07 |
JPWO2009148041A1 (ja) | 2011-10-27 |
JP5438002B2 (ja) | 2014-03-12 |
CN102056547A (zh) | 2011-05-11 |
EP2298176A4 (en) | 2012-12-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5438002B2 (ja) | 医用画像処理装置及び医用画像処理方法 | |
JP6640922B2 (ja) | 超音波診断装置及び画像処理装置 | |
JP5670324B2 (ja) | 医用画像診断装置 | |
US8913816B2 (en) | Medical image dianostic device, region-of-interest setting method, and medical image processing device | |
WO2017206023A1 (zh) | 一种心脏容积识别分析系统和方法 | |
JP5242163B2 (ja) | 超音波診断装置 | |
US20100036248A1 (en) | Medical image diagnostic apparatus, medical image measuring method, and medicla image measuring program | |
WO2014080833A1 (ja) | 超音波診断装置、画像処理装置及び画像処理方法 | |
CN110477952B (zh) | 超声波诊断装置、医用图像诊断装置及存储介质 | |
JP5611546B2 (ja) | 自動診断支援装置、超音波診断装置及び自動診断支援プログラム | |
US11864945B2 (en) | Image-based diagnostic systems | |
US9877698B2 (en) | Ultrasonic diagnosis apparatus and ultrasonic image processing apparatus | |
JP5885234B2 (ja) | 疾患判定装置および超音波画像形成装置における画像解析方法 | |
EP4125606B1 (en) | Systems and methods for imaging and measuring epicardial adipose tissue | |
US11707201B2 (en) | Methods and systems for medical imaging based analysis of ejection fraction and fetal heart functions | |
Luo et al. | Registration of coronary arteries in computed tomography angiography images using hidden Markov model | |
JP2020049212A (ja) | 装置、医用情報処理装置、及びプログラム | |
BOSCH | Echocardiographic digital image processing and approaches to automated border detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980120710.5 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09758308 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2009758308 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010515868 Country of ref document: JP Ref document number: 2009758308 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |