WO2015190180A1 - Medical diagnostic apparatus, operating method of medical diagnostic apparatus, and operating program of medical diagnostic apparatus - Google Patents
Medical diagnostic apparatus, operating method of medical diagnostic apparatus, and operating program of medical diagnostic apparatus
- Publication number
- WO2015190180A1 (PCT/JP2015/062606)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- classification
- feature amount
- unit
- specimen
- feature
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/085—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
- G01S7/52033—Gain control of receivers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
- G01S7/52036—Details of receivers using analysis of echo signal for target characterisation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52071—Multicolour displays; using colour coding; Optimising colour or information content in displays, e.g. parametric imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/40—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for data related to laboratory analysis, e.g. patient specimen analysis
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
- G06V2201/032—Recognition of patterns in medical or anatomical images of protuberances, polyps nodules, etc.
Definitions
- the present invention relates to a medical diagnostic apparatus that generates diagnostic image data using a received signal received from a specimen, an operating method of the medical diagnostic apparatus, and an operating program of the medical diagnostic apparatus.
- a medical diagnostic apparatus that generates diagnostic image data using a received signal from a specimen
- Conventionally, a technique has been disclosed in which a plurality of learning windows are set on an image, a standard is set for plotting the texture feature quantities of learning windows set on different texture regions in a feature space, and similarity is calculated based on the distribution of the texture feature quantities in the feature space (see, for example, Patent Document 1). According to this technique, all tissues of a specimen can be discriminated automatically; for example, the surface of a tubular organ or the boundary of a tissue such as a tumor can be detected by a simple method.
- With this technique, the tissue under examination is automatically discriminated as to whether it is tissue A, tissue B, tissue C, or a lumen.
- In this technique, first, two of the four tissue types are selected, and the feature quantity of a known specimen whose pathological result is one of those two types is compared with the feature quantity of the tissue of the unknown specimen under examination. This two-class determination is repeated while changing the selected combination of two tissue types, and the unknown tissue under examination is classified as the tissue type that is discriminated most frequently among the pairwise determination results.
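The pairwise, majority-vote discrimination described above can be sketched as follows. The tissue labels, the single scalar feature per tissue type, and the closest-feature comparison rule are illustrative assumptions for this sketch; the patent does not specify concrete feature values or the exact two-class discriminator.

```python
from itertools import combinations
from collections import Counter

# Known specimens: illustrative feature values for each tissue type
# (hypothetical numbers; the patent does not specify them).
known = {"tissue A": 1.0, "tissue B": 3.0, "tissue C": 5.0, "lumen": 9.0}

def classify_pairwise(unknown_feature):
    """Discriminate between every pair of tissue types and take a majority vote."""
    votes = Counter()
    for t1, t2 in combinations(known, 2):
        # Two-class determination: pick whichever known tissue's feature
        # is closer to the unknown specimen's feature.
        winner = min((t1, t2), key=lambda t: abs(known[t] - unknown_feature))
        votes[winner] += 1
    # The unknown tissue is classified as the most frequently chosen type.
    return votes.most_common(1)[0][0]

print(classify_pairwise(2.8))
```

Because every pair of tissue types must be compared, the number of two-class determinations grows quadratically with the number of known tissue types, which is the inefficiency the invention addresses.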
- In practice, the attribute (tissue property) of the tissue to be discriminated is often narrowed down to some extent in advance.
- In the conventional technique, however, classification is performed by comparison with all types of tissues of known specimens. Therefore, when the number of known tissue types is large, the amount of processing becomes very large and output of the classification result can be very slow. In particular, when the attributes of the tissue to be discriminated have already been narrowed down, the amount of processing could be reduced accordingly; but since the processing is executed for all tissue types, many unnecessary processes are performed and efficiency is low.
- The present invention has been made in view of the above, and an object thereof is to provide a medical diagnostic apparatus capable of efficiently generating image data in which tissues are classified according to the content of the diagnosis, an operating method of the medical diagnostic apparatus, and an operating program of the medical diagnostic apparatus.
- To solve the problems described above, the medical diagnostic apparatus according to the present invention includes: a feature amount calculation unit that calculates a plurality of types of feature amounts based on a received signal received from a specimen; a classification unit that classifies the attribute of the tissue of the specimen using a feature amount determined according to a classification item selected in advance from among the plurality of types of feature amounts calculated by the feature amount calculation unit, and assigns visual information corresponding to the classification result to each pixel of an image based on the received signal; and a feature amount image data generation unit that generates feature amount image data on which the visual information assigned to each pixel of the image based on the received signal is superimposed.
- In the above invention, the medical diagnostic apparatus further includes a classification information storage unit that stores classification information in which at least one of the following is associated: the attribute of the tissue to be classified corresponding to the classification item, the type of feature quantity to be used for the classification, and the visual information corresponding to the value of the feature quantity; and the classification unit performs the classification and assigns the visual information with reference to the classification information storage unit.
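As a rough sketch of how such a classification information storage unit might associate a classification item with the feature type and the visual information, consider the following; all item names, feature types, and colors are hypothetical placeholders, not values taken from the patent.

```python
# Minimal sketch of a classification information store: each classification item
# maps to the feature type used for classification and the visual information
# (here an RGB color) assigned to each classification result.
CLASSIFICATION_INFO = {
    "tumor screening": {
        "feature_type": "spectrum slope",
        "colors": {"normal": (0, 0, 255), "tumor": (255, 0, 0)},
    },
    "malignant/benign discrimination": {
        "feature_type": "spectrum intercept",
        "colors": {"benign": (0, 255, 0), "malignant": (255, 0, 255)},
    },
}

def visual_info_for(classification_item, result):
    """Look up the color assigned to a classification result for the selected item."""
    entry = CLASSIFICATION_INFO[classification_item]
    return entry["colors"][result]

print(visual_info_for("tumor screening", "tumor"))
```

Restricting the lookup to the selected classification item is what lets the apparatus compute only the feature types that item actually needs.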
- In the above invention, the medical diagnostic apparatus further includes: a known specimen information storage unit that stores known specimen information, which includes at least one of a reception signal from a specimen whose attribute is known and information calculated based on that reception signal, in association with the attribute; and a classification information setting unit that sets the classification information using the known specimen information stored in the known specimen information storage unit.
- In the above invention, the feature amount calculation unit extracts a plurality of parameters based on the received signal received from a predetermined region of the specimen, and calculates the feature amounts using the plurality of parameters.
- In the above invention, the feature amount calculation unit calculates a statistic of parameters of the same type among the plurality of parameters as the feature amount.
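A minimal illustration of calculating a feature amount as a statistic of same-type parameters follows; the parameter names and numeric values are made-up examples, and the mean is used as the statistic (others, such as the standard deviation, could equally be used).

```python
import statistics

# Hypothetical parameters extracted from a predetermined region of the specimen:
# e.g. one "slope" and one "intercept" value per frequency spectrum in the region.
slopes = [1.2, 1.5, 1.1, 1.4]
intercepts = [-60.0, -58.5, -61.2, -59.3]

# A feature amount as a statistic of parameters of the same type (here the mean).
feature_slope = statistics.mean(slopes)
feature_intercept = statistics.mean(intercepts)
print(feature_slope, feature_intercept)
```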
- the medical diagnostic apparatus is characterized in that, in the above invention, the visual information is a variable constituting a color space.
- the medical diagnostic apparatus is characterized in that, in the above invention, the visual information is luminance.
- the medical diagnostic apparatus is characterized in that, in the above invention, the medical diagnostic apparatus further includes a display unit for displaying a feature amount image corresponding to the feature amount image data generated by the feature amount image data generation unit.
- the medical diagnostic apparatus is characterized in that, in the above-described invention, the medical diagnostic apparatus further includes an input unit that receives a selection input of the classification item.
- In the above invention, the medical diagnostic apparatus further includes an ultrasonic probe that transmits ultrasonic waves to the specimen and receives, as the reception signal, an electrical echo signal obtained by converting an ultrasonic echo reflected by the specimen.
- In the above invention, the medical diagnostic apparatus further includes a frequency analysis unit that calculates a frequency spectrum by analyzing the frequency of the echo signal, and the feature amount calculation unit calculates the plurality of types of feature amounts using the frequency spectrum.
- the medical diagnosis apparatus is characterized in that, in the above invention, the feature amount calculation unit calculates the plurality of types of feature amounts based on luminance of each pixel of the image based on the received signal.
- An operating method of a medical diagnostic apparatus according to the present invention is an operating method of a medical diagnostic apparatus that generates diagnostic image data based on a received signal received from a specimen, and includes: a feature amount calculating step in which a feature amount calculation unit calculates a plurality of types of feature amounts from the received signal; a classification step in which a classification unit classifies the tissue attribute of the specimen using a feature amount determined according to a pre-selected classification item among the plurality of types of feature amounts and assigns visual information corresponding to the classification result to each pixel of an image based on the received signal; and a feature amount image data generation step of generating feature amount image data on which the visual information is superimposed.
- An operating program for a medical diagnostic apparatus according to the present invention causes a medical diagnostic apparatus that generates diagnostic image data based on a received signal received from a specimen to execute: a feature amount calculating step in which a feature amount calculation unit calculates a plurality of types of feature amounts from the received signal; a classification step in which a classification unit classifies the tissue attribute of the specimen using a feature amount determined according to a classification item selected in advance among the plurality of types of feature amounts and assigns visual information corresponding to the classification result; and a feature amount image data generation step of generating feature amount image data.
- According to the present invention, a plurality of types of feature amounts are calculated based on the received signal from a specimen, and the tissue attribute of the specimen is classified using a feature amount determined according to a classification item selected in advance from among the plurality of feature amounts, so that image data in which tissues are classified according to the content of the diagnosis can be generated efficiently.
- FIG. 1 is a block diagram showing a configuration of an ultrasound observation apparatus that is a medical diagnostic apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing the relationship between the reception depth and the amplification factor in the amplification processing performed by the signal amplification unit of the ultrasonic observation apparatus according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram showing a relationship between the reception depth and the amplification factor in the amplification process performed by the amplification correction unit of the ultrasonic observation apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a diagram schematically showing a data array in one sound ray of the ultrasonic signal.
- FIG. 5 is a diagram illustrating an example of a frequency spectrum calculated by the frequency analysis unit of the ultrasonic observation apparatus according to Embodiment 1 of the present invention.
- FIG. 6 is a diagram schematically illustrating an outline of processing performed by the feature amount calculation unit of the ultrasound observation apparatus according to Embodiment 1 of the present invention.
- FIG. 7 is a diagram schematically showing classification information stored in the classification information storage unit according to Embodiment 1 of the present invention.
- FIG. 8 is a diagram schematically showing classification and color assignment when the classification item is tumor screening.
- FIG. 9 is a diagram schematically illustrating classification and color assignment when the classification item is malignant / benign discrimination.
- FIG. 10 is a diagram schematically illustrating classification and color assignment when the classification item is follow-up determination 1.
- FIG. 11 is a diagram schematically illustrating classification and color assignment when the classification item is follow-up determination 2.
- FIG. 12 is a flowchart showing an outline of processing performed by the ultrasound observation apparatus according to Embodiment 1 of the present invention.
- FIG. 13 is a diagram illustrating a display example of a selection screen displayed by the display unit when the input unit accepts a selection input of a classification item.
- FIG. 14 is a flowchart showing an outline of processing performed by the frequency analysis unit of the ultrasonic observation apparatus according to Embodiment 1 of the present invention.
- FIG. 15 is a diagram schematically illustrating a display example of the feature amount image on the display unit of the ultrasonic observation apparatus according to Embodiment 1 of the present invention.
- FIG. 16 is a block diagram showing a configuration of an ultrasonic observation apparatus according to Embodiment 2 of the present invention.
- FIG. 17 is a flowchart showing an outline of processing performed by the ultrasonic observation apparatus according to Embodiment 2 of the present invention.
- FIG. 18 is a flowchart showing an outline of processing performed by the classification information setting unit of the ultrasonic observation apparatus according to Embodiment 2 of the present invention.
- FIG. 19 is a diagram schematically illustrating an outline of processing performed by the classification information setting unit of the ultrasonic observation apparatus according to Embodiment 2 of the present invention.
- FIG. 20 is a diagram schematically illustrating an outline of processing performed by the feature amount calculation unit of the ultrasonic observation apparatus according to another embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of an ultrasound observation apparatus that is a medical diagnostic apparatus according to Embodiment 1 of the present invention.
- An ultrasonic observation apparatus 1 shown in FIG. 1 is an apparatus for observing a specimen that is a diagnosis target using ultrasonic waves.
- The ultrasonic observation apparatus 1 includes: the ultrasonic probe 2, which outputs ultrasonic pulses to the outside and receives ultrasonic echoes reflected from the outside; a transmission/reception unit 3 that transmits and receives electrical signals to and from the ultrasonic probe 2; a calculation unit 4 that performs predetermined calculations on the electrical echo signal obtained by converting the ultrasonic echo into an electrical signal; an image processing unit 5 that generates image data corresponding to the electrical echo signal; an input unit 6, realized using a user interface such as a keyboard, mouse, or touch panel, that accepts input of various information; and a display unit realized using a display panel made of liquid crystal, organic EL (Electro Luminescence), or the like.
- The ultrasonic observation apparatus 1 comprises the ultrasonic probe 2 provided with the ultrasonic transducer 21, and a processing device to which the ultrasonic probe 2 is detachably connected and which includes the above-described units other than the ultrasonic probe 2.
- The ultrasonic probe 2 may take any of the following forms: an external probe that irradiates ultrasonic waves from the body surface of a living body; a miniature ultrasonic probe provided with a long insertion portion to be inserted into a lumen such as the digestive tract, the biliopancreatic duct, or a blood vessel; or an ultrasonic endoscope in which the intraluminal ultrasonic probe further includes an optical system. In the case of the intraluminal ultrasonic probe, the ultrasonic transducer 21 is provided on the distal end side of the insertion portion, and the probe is detachably connected to the processing apparatus on its proximal end side.
- The ultrasonic probe 2 includes the ultrasonic transducer 21, which converts an electrical pulse signal received from the transmission/reception unit 3 into an ultrasonic pulse (acoustic pulse) and converts an ultrasonic echo reflected from the specimen into an electrical echo signal.
- The ultrasonic probe 2 may be one that mechanically scans the ultrasonic transducer 21, or a plurality of elements may be arranged in an array as the ultrasonic transducer 21 and scanned electronically by switching the elements involved in transmission and reception or by delaying the transmission/reception of each element. In the first embodiment, any one of a plurality of different types of ultrasonic probes can be selected and used as the ultrasonic probe 2.
- The transmission/reception unit 3 is electrically connected to the ultrasonic probe 2, transmits an electrical pulse signal to the ultrasonic probe 2, and receives an echo signal, which is an electrical reception signal, from the ultrasonic probe 2. Specifically, the transmission/reception unit 3 generates an electrical pulse signal based on a preset waveform and transmission timing, and transmits the generated pulse signal to the ultrasonic probe 2.
- the transmission / reception unit 3 includes a signal amplification unit 31 that amplifies the echo signal. More specifically, the signal amplifying unit 31 performs STC (Sensitivity Time Control) correction in which an echo signal having a larger reception depth is amplified with a higher amplification factor.
- FIG. 2 is a diagram illustrating the relationship between the reception depth and the amplification factor in the amplification process performed by the signal amplification unit 31.
- The reception depth z shown in FIG. 2 is an amount calculated based on the time elapsed from the start of ultrasonic reception. As shown in FIG. 2, when the reception depth z is smaller than a threshold zth, the amplification factor β (dB) increases linearly from β0 to βth (> β0) as the reception depth z increases; when the reception depth z is equal to or greater than the threshold zth, the amplification factor β takes the constant value βth. The threshold zth is set to a value at which the ultrasonic signal received from the specimen has almost completely attenuated and noise becomes dominant. More generally, it suffices that the amplification factor β increases monotonically with the reception depth z while the reception depth z is smaller than the threshold zth.
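The piecewise behavior of the amplification factor β(z) described above can be sketched as a simple function; the numeric values of β0, βth, and zth below are arbitrary illustrations, not values from the patent.

```python
def stc_gain(z, z_th, beta0, beta_th):
    """STC amplification factor beta(z) in dB: rises linearly from beta0 to
    beta_th until the threshold depth z_th, and stays at beta_th beyond it."""
    if z >= z_th:
        return beta_th
    return beta0 + (beta_th - beta0) * z / z_th

# Illustrative values: beta0 = 10 dB, beta_th = 40 dB, z_th = 4 (arbitrary units).
print(stc_gain(0.0, 4.0, 10.0, 40.0))  # gain at the surface is beta0
print(stc_gain(6.0, 4.0, 10.0, 40.0))  # gain at and beyond z_th is beta_th
```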
- the transmission / reception unit 3 performs processing such as filtering on the echo signal amplified by the signal amplification unit 31, and then generates and outputs a time-domain digital RF signal by performing A / D conversion.
- The transmission/reception unit 3 includes a multi-channel circuit for beam synthesis corresponding to the plurality of elements.
- The calculation unit 4 includes: an amplification correction unit 41 that performs amplification correction on the digital radio-frequency (RF) signal output from the transmission/reception unit 3 so that the amplification factor β is constant regardless of the reception depth; a frequency analysis unit 42 that calculates a frequency spectrum by performing frequency analysis, applying a fast Fourier transform (FFT) to the amplification-corrected digital RF signal; a feature amount calculation unit 43 that calculates a plurality of types of feature amounts from the frequency spectrum; and a classification unit 44 that classifies the attribute of the tissue of the specimen using the feature amount corresponding to the classification item selected in advance. The calculation unit 4 is realized using a CPU (Central Processing Unit), various calculation circuits, and the like.
- FIG. 3 is a diagram illustrating the relationship between the reception depth and the amplification factor in the amplification process performed by the amplification correction unit 41.
- The amplification factor β (dB) in the amplification processing performed by the amplification correction unit 41 takes its maximum value βth − β0 when the reception depth z is zero, decreases linearly as the reception depth z increases from zero toward the threshold zth, and is zero when the reception depth z is equal to or greater than the threshold zth.
- The amplification correction unit 41 amplification-corrects the digital RF signal with the amplification factor determined in this way, thereby canceling the influence of the STC correction in the signal amplification unit 31 and outputting a signal with a constant amplification factor βth.
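Under the same illustrative values as before (β0, βth, and zth are arbitrary), the correction gain of the amplification correction unit can be sketched together with the STC gain to show that their sum is the constant βth at every depth:

```python
def stc_gain(z, z_th, beta0, beta_th):
    """STC gain applied by the signal amplification unit (dB)."""
    return beta_th if z >= z_th else beta0 + (beta_th - beta0) * z / z_th

def correction_gain(z, z_th, beta0, beta_th):
    """Correction gain of the amplification correction unit (dB): maximum
    beta_th - beta0 at z = 0, linearly decreasing to zero at z = z_th."""
    return 0.0 if z >= z_th else (beta_th - beta0) * (1.0 - z / z_th)

# STC gain plus correction gain equals the constant beta_th at every depth,
# i.e. the influence of the STC correction is cancelled.
for z in (0.0, 1.0, 2.5, 4.0, 6.0):
    total = stc_gain(z, 4.0, 10.0, 40.0) + correction_gain(z, 4.0, 10.0, 40.0)
    print(z, total)
```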
- The relationship between the reception depth z and the amplification factor β used by the amplification correction unit 41 differs according to the relationship between the reception depth and the amplification factor in the signal amplification unit 31.
- STC correction is a correction process that eliminates the influence of attenuation from the amplitude of an analog signal waveform by amplifying the amplitude uniformly over the entire frequency band with a gain that increases monotonically with depth. For this reason, when a B-mode image, which uses the amplitude of the echo signal, is generated while scanning a uniform tissue, STC correction makes the luminance value constant regardless of depth; that is, the influence of attenuation is eliminated from the B-mode luminance value.
- On the other hand, STC correction cannot accurately eliminate the influence of attenuation associated with the propagation of ultrasonic waves. This is because the amount of attenuation varies with frequency, whereas the amplification factor of STC correction varies only with distance and remains constant with respect to frequency, as in equation (1) described later. A method of eliminating the influence of attenuation, including its frequency dependence, is described later as the “attenuation correction processing” in step S9 of FIGS. 6 and 12.
- In order to eliminate the influence of the STC correction from the signal that has been subjected to STC correction for the B-mode image, while maintaining the frame rate of the generated image data, the amplification correction unit 41 corrects the amplification factor as described above.
- The frequency analysis unit 42 calculates frequency spectra at a plurality of locations (data positions) by performing a fast Fourier transform on amplitude data groups sampled at predetermined time intervals from each sound ray (line data) of the signal obtained by amplification-correcting the digital RF signal based on the echo signal.
- FIG. 4 is a diagram schematically showing a data array in one sound ray of the ultrasonic signal.
- a white or black rectangle means one piece of data.
- the sound ray data SR k is discretized at time intervals corresponding to a sampling frequency (for example, 50 MHz) in A / D conversion performed by the transmission / reception unit 3.
- FIG. 4 shows the case where the first data position of the sound ray data SR k of the number k (described later) is set as the initial value Z (k) 0 in the direction of the reception depth z, but the position of the initial value is It can be set arbitrarily.
- the calculation result by the frequency analysis unit 42 is obtained as a complex number and stored in the storage unit 8.
- The amplitude data group needs to contain a number of data equal to a power of 2.
- When the number of data is insufficient, a process of generating a normal amplitude data group by inserting zero data for the shortfall is performed. This point will be described in detail together with the processing of the frequency analysis unit 42 described later (see FIG. 14).
- FIG. 5 is a diagram illustrating an example of a frequency spectrum calculated by the frequency analysis unit 42.
- the “frequency spectrum” illustrated in FIG. 5 means “intensity frequency distribution at a certain reception depth z” obtained by performing fast Fourier transform (FFT operation) on the amplitude data group.
- the term “intensity” as used herein refers to any of parameters such as the voltage of the echo signal, the power of the echo signal, the sound pressure of the ultrasonic echo, the acoustic energy of the ultrasonic echo, the amplitude of these parameters, the time integral value, and combinations thereof. Point to. In FIG. 5, the horizontal axis is the frequency f.
- The vertical axis is the decibel expression of the intensity, log 10 (I / I c ), obtained by dividing the intensity I by a specific reference intensity I c (a constant) and taking the common logarithm.
- the intensity expressed in decibels is also simply referred to as I hereinafter.
- In FIG. 5, the reception depth z is constant.
- the curve and the straight line are composed of a set of discrete points.
- The lower limit frequency f L and the upper limit frequency f H of the frequency band used in the subsequent calculations are determined on the basis of the frequency band of the ultrasonic transducer 21 and the frequency band of the pulse signal transmitted by the transmitting / receiving unit 3; for example, f L = 3 MHz and f H = 10 MHz.
- the frequency band determined from the lower limit frequency f L and the upper limit frequency f H is referred to as “frequency band F”.
- the frequency spectrum shows a different tendency depending on the attribute of the tissue scanned with the ultrasonic wave. This is because the frequency spectrum has a correlation with the size, number density, acoustic impedance, and the like of the scatterer that scatters ultrasonic waves.
- the “attribute” refers to, for example, a malignant tumor tissue, a benign tumor tissue, an endocrine tumor tissue, a mucinous tumor tissue, a normal tissue, a vessel, and the like.
- The feature amount calculation unit 43 includes an attenuation correction unit 431 that performs attenuation correction processing to correct the influence of ultrasonic attenuation, which depends on the reception depth and the frequency, and an approximation unit 432 that calculates an approximate expression of the attenuation-corrected frequency spectrum by regression analysis.
- FIG. 6 is a diagram schematically illustrating an outline of processing performed by the feature amount calculation unit 43.
- FIG. 6 illustrates a case where feature amount calculation is performed on the frequency spectrum C 1 illustrated in FIG. 5.
- The attenuation correction unit 431 corrects the frequency spectrum C 1 by adding the attenuation amount A(f, z) of equation (1) to the intensity I(f, z) at all frequencies f (I(f, z) ← I(f, z) + A(f, z)).
- the ultrasonic attenuation amount A (f, z) is attenuation that occurs while the ultrasonic waves reciprocate between the reception depth 0 and the reception depth z, and the intensity change before and after the reciprocation (difference in decibel expression).
- A(f, z) is empirically known to be proportional to the frequency in a uniform tissue, and is expressed by equation (1), where the proportionality coefficient is α.
- A(f, z) = 2αzf (1)
- Here, α is called the attenuation rate, z is the ultrasonic reception depth, and f is the frequency.
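The correction of equation (1) can be written in a few lines. The following is a minimal sketch, not the implementation of this disclosure; it assumes a spectrum given in decibels on a frequency grid, and the attenuation-rate value is chosen only for the example (a typical soft-tissue figure, not a value taken from the text):

```python
import numpy as np

def attenuation_correct(intensity_db, freqs_mhz, depth_cm, alpha=0.5):
    """Apply the round-trip attenuation correction of equation (1).

    intensity_db : frequency spectrum I(f, z) in decibels at one depth z
    freqs_mhz    : frequency grid f in MHz
    depth_cm     : ultrasonic reception depth z in cm
    alpha        : attenuation rate in dB/(MHz*cm) (illustrative value)
    """
    attenuation_db = 2.0 * alpha * depth_cm * freqs_mhz  # A(f, z) = 2*alpha*z*f
    return intensity_db + attenuation_db                 # I <- I + A

# Toy spectrum over the band F (3-10 MHz) at depth z = 2 cm
freqs = np.linspace(3.0, 10.0, 8)
corrected = attenuation_correct(-20.0 * np.ones_like(freqs), freqs, depth_cm=2.0)
```

Because A(f, z) grows with frequency, the correction lifts the high-frequency end of the spectrum more than the low-frequency end, which is exactly the frequency dependence that STC correction cannot provide.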
- Frequency spectrum C 2 shown in FIG. 6 is a new frequency spectrum obtained as a result of correcting the influence of the attenuation due to propagation of ultrasonic by attenuation correction.
- The approximation unit 432 performs regression analysis on the frequency spectrum C 2 in the frequency band F, approximates C 2 with a linear expression (regression line), and thereby extracts the parameters necessary for calculating the feature amounts.
- Specifically, it calculates the slope a and the intercept b of the regression line, and the mid-band fit c = a f M + b, which is the value of the regression line at the center frequency f M = (f L + f H) / 2 of the frequency band F.
- the inclination a has a correlation with the size of the ultrasonic scatterer, and it is generally considered that the larger the scatterer, the smaller the inclination.
- The intercept b has a correlation with the size of the scatterer, the difference in acoustic impedance, the number density (concentration) of the scatterers, and the like. Specifically, the intercept b is considered to take a larger value as the scatterer is larger, as the difference in acoustic impedance is larger, and as the number density (concentration) of the scatterers is larger.
- the mid-band fit c is an indirect parameter derived from the slope a and the intercept b, and gives the intensity of the spectrum at the center in the effective frequency band.
- the midband fit c is considered to have a certain degree of correlation with the brightness of the B-mode image in addition to the size of the scatterer, the difference in acoustic impedance, and the number density of the scatterers.
- the approximate expression calculated by the approximating unit 432 is not limited to a linear expression, and it is possible to use a second-order or higher order polynomial.
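The regression step can be illustrated as follows. This is a sketch under the assumption, stated above, that the mid-band fit c is the value of the fitted line at the center of the band F; it is not the patent's implementation:

```python
import numpy as np

def spectral_parameters(freqs_mhz, spectrum_db, f_low=3.0, f_high=10.0):
    """Regression analysis over the band F: returns the slope a, the
    intercept b, and the mid-band fit c of the fitted line."""
    band = (freqs_mhz >= f_low) & (freqs_mhz <= f_high)
    a, b = np.polyfit(freqs_mhz[band], spectrum_db[band], 1)  # linear fit
    f_mid = 0.5 * (f_low + f_high)  # center of the effective band
    c = a * f_mid + b               # mid-band fit
    return a, b, c

# Synthetic attenuation-corrected spectrum that is exactly linear in f
freqs = np.linspace(1.0, 12.0, 111)
a, b, c = spectral_parameters(freqs, 1.5 * freqs - 30.0)
```

For a higher-order approximation, the degree argument of `np.polyfit` would simply be raised, as the text notes that a second-order or higher polynomial may also be used.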
- the feature amount calculation unit 43 can calculate a plurality of types of feature amounts using the parameters of the sample calculated within a predetermined region of interest. Specifically, the feature amount calculation unit 43 calculates the average of the slope a, the intercept b, and the midband fit c calculated by the approximation unit 432 in a plurality of unit regions (also referred to as discrimination windows) set in the region of interest. Calculate the standard deviation.
- the plurality of unit regions in the region of interest have the same size (number of pixels). This size is set in advance when the input unit 6 receives a setting input, and is stored in the storage unit 8.
- In the following, the feature amounts calculated by the feature amount calculation unit 43 are described taking as examples the average and standard deviation of the slope a, the intercept b, and the mid-band fit c; however, various other statistics, such as the variance and the entropy, may also be applied.
- Hereinafter, the mean of the slope a is denoted Mean S and its standard deviation Sd. S, the mean of the intercept b is denoted Mean I and its standard deviation Sd. I, and the mean of the mid-band fit c is denoted Mean M and its standard deviation Sd. M.
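As one possible sketch of the statistics above (the unit-region size and the tiling loop are assumptions for illustration, not taken from the disclosure), Mean M and Sd. M could be computed over square discrimination windows as:

```python
import numpy as np

def region_features(c_map, win=4):
    """Mean M and Sd. M of the mid-band fit over square unit regions
    (discrimination windows) tiling a region of interest."""
    means, sds = [], []
    h, w = c_map.shape
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            block = c_map[i:i + win, j:j + win]
            means.append(block.mean())  # Mean M of this window
            sds.append(block.std())     # Sd. M of this window
    return np.array(means), np.array(sds)

roi = np.arange(64, dtype=float).reshape(8, 8)  # toy ROI of c values
mean_m, sd_m = region_features(roi, win=4)
```

The same routine applied to maps of a and b yields Mean S, Sd. S, Mean I, and Sd. I.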
- The classification unit 44 performs classification using the feature amount corresponding to a pre-selected classification item among the plurality of types of feature amounts that can be calculated by the feature amount calculation unit 43, and assigns a color (hue) as visual information according to the classification result to each pixel of the image generated from the electrical echo signal.
- the visual information assigned to each pixel by the classification unit 44 is not limited to the hue, and any variable may be used as long as it is a variable constituting the color space.
- As the color space, for example, a Munsell color system, in which brightness and saturation are added to the hue, may be adopted, or an RGB color system having R (red), G (green), and B (blue) as variables may be adopted.
- The image processing unit 5 includes a B-mode image data generation unit 51 that generates B-mode image data from the echo signal, and a feature amount image data generation unit 52 that generates feature amount image data displaying information corresponding to the feature amounts calculated by the feature amount calculation unit 43.
- The B-mode image data generation unit 51 performs signal processing on the digital signal using known techniques such as a bandpass filter, logarithmic conversion, gain processing, and contrast processing, and generates B-mode image data by thinning out the data in accordance with a data step width determined according to the image display range on the display unit 7.
- the B-mode image is a grayscale image in which values of R (red), G (green), and B (blue), which are variables when the RGB color system is adopted as a color space, are matched.
- the storage unit 8 has a classification information storage unit 81.
- the classification information storage unit 81 classifies the attributes of the tissue to be classified, and stores information on the classification result necessary when the feature amount image data generation unit 52 generates the feature amount image data.
- the classification information storage unit 81 also stores information related to the unit area when calculating the feature amount.
- FIG. 7 is a diagram schematically showing the classification information stored in the classification information storage unit 81.
- the purpose of classification is assigned to the column of classification items.
- The attributes of the tissue to be classified are associated with the feature amount used for classification.
- a color (hue) as visual information assigned to a pixel when displaying a feature amount image is associated with a feature amount value (value range).
- When the classification item is tumor screening, the attributes to be classified (classification attributes) are normal tissue and malignant tumor tissue, and the feature amount used for classification is the standard deviation Sd. M of the mid-band fit.
- Red is assigned to pixels whose feature value satisfies 0 ≤ Sd. M < M 11, and pink is assigned to pixels corresponding to M 11 ≤ Sd. M < M 12.
- No color is assigned to pixels whose feature value satisfies Sd. M ≥ M 12 (described as "mask" in FIG. 7).
- The tissue corresponding to the feature amount range to which no color is assigned is normal tissue.
- Sd. M = M 12 is a threshold value for separating normal tissue from malignant tumor tissue.
- the correspondence between the feature amount and the color is set according to the degree of tumor in the malignant tumor tissue, for example.
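The threshold-based assignment of the color table can be sketched as a simple lookup. The numeric thresholds standing in for M 11 and M 12 below are purely illustrative; in the disclosure they are derived from known specimens:

```python
def classify_sd_m(sd_m, m11=1.0, m12=2.0):
    """Map the feature Sd. M to visual information for tumor screening.

    m11 and m12 stand in for the thresholds M 11 and M 12 of the color
    table; their values here are illustrative only. Returns a hue name,
    or None for the masked (unpainted) range classified as normal tissue.
    """
    if 0.0 <= sd_m < m11:
        return "red"
    if m11 <= sd_m < m12:
        return "pink"
    return None  # Sd. M >= m12: normal tissue, no color assigned

labels = [classify_sd_m(v) for v in (0.5, 1.5, 3.0)]
```

The other classification items follow the same pattern with their own feature amount, thresholds, and hues.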
- FIG. 8 is a diagram schematically showing classification and color assignment when the classification item is tumor screening.
- FIG. 8 shows, as a scatter diagram, the feature amount distribution of known specimens in the feature amount space whose horizontal axis is the standard deviation Sd. M of the mid-band fit and whose vertical axis is the mean Mean M of the mid-band fit, and schematically shows the relationship between the value of Sd. M and the color assigned to the pixel.
- the known specimen is another specimen whose tissue attribute has been clarified in advance by a pathological examination or the like before examining a new specimen in the first embodiment.
- The arrow A 11 indicates the range of the feature amount Sd. M to which red is assigned, the arrow A 12 indicates the range of Sd. M to which pink is assigned, and the arrow A 13 indicates the range of Sd. M to which no color is assigned.
- known specimens are roughly divided into two groups G 11 and G 12 according to attributes.
- Group G 11 is a group of malignant tumor tissue
- a group G 12 is a group of normal tissue.
- The group G 11 and the group G 12 are clearly separated at Sd. M = M 12 in the direction of the horizontal axis Sd. M, but are not separated in the direction of the vertical axis Mean M.
- Therefore, when the classification item is tumor screening, the tissue attribute of the specimen can be accurately classified by adopting the standard deviation Sd. M of the mid-band fit as the feature amount.
- The values of Sd. M corresponding to M 11 and M 12 described above, the color information for the arrows A 11, A 12 and A 13, and the information associating these ranges of Sd. M with the visual information (color and mask) (hereinafter referred to as a color table) are calculated, before classification using the feature amount is started, on the basis of the distribution of a plurality of groups of known specimens whose tissue attributes have been revealed in advance by pathological examination or the like, by an external processing device (not shown), or by one or both of the feature amount calculation unit 43 and the classification unit 44. They are then stored in advance in the classification information storage unit 81 as a part of the classification information. When starting the classification, the classification unit 44 reads these from the classification information storage unit 81 and classifies the attributes of the tissue.
- When the classification item is malignant / benign discrimination, the attributes of the classification target are malignant tumor tissue and benign tumor tissue, and the feature amount used for the classification is the mean Mean M of the mid-band fit.
- No color is assigned to pixels whose feature value satisfies 0 ≤ Mean M < M 21.
- Blue is assigned to pixels with M 21 ≤ Mean M < M 22, light blue to pixels with M 22 ≤ Mean M < M 23, yellow to pixels with M 23 ≤ Mean M < M 24, pink to pixels with M 24 ≤ Mean M < M 25, and red to pixels with Mean M ≥ M 25.
- FIG. 9 is a diagram schematically illustrating classification and color assignment when the classification item is malignant / benign discrimination.
- FIG. 9 shows, as a scatter diagram, the feature amount distribution of known specimens in the feature amount space whose horizontal axis is the standard deviation Sd. M of the mid-band fit and whose vertical axis is the mean Mean M of the mid-band fit, and schematically shows the relationship between the value of Mean M and the color assigned to the pixel.
- the known specimen is another specimen whose tissue attribute has been clarified in advance by a pathological examination or the like before examining a new specimen in the first embodiment.
- an arrow A 21 indicates a range where no color is assigned
- an arrow A 22 indicates a range where blue is assigned
- an arrow A 23 indicates a range where light blue is assigned
- an arrow A 24 indicates a range where yellow is assigned.
- Arrow A 25 indicates a range in which pink is allocated
- arrow A 26 indicates a range in which red is allocated.
- known specimens are roughly divided into two groups G 21 and G 22 according to attributes.
- Group G 21 is a group of benign tumor tissue
- a group G 22 is a group of malignant tumor tissue.
- The group G 21 and the group G 22 are clearly separated at Mean M = M 24 in the direction of the vertical axis Mean M, but are not separated in the direction of the horizontal axis Sd. M.
- Therefore, when the classification item is malignant / benign discrimination, the tissue attribute of the specimen can be accurately classified by adopting the mean Mean M of the mid-band fit as the feature amount.
- the feature image displays the malignant tumor tissue and the benign tumor tissue so that they can be distinguished. Therefore, in FIG. 9, different colors are assigned to the groups G 21 and G 22 , respectively.
- The values of Mean M corresponding to M 21, M 22, M 23, M 24 and M 25 described above, the color information for the arrows A 21, A 22, A 23, A 24, A 25 and A 26, and the color table associating the ranges of Mean M with the visual information (color and mask) are calculated, before classification using the feature amount is started, on the basis of the distribution of a plurality of groups of known specimens whose tissue attributes have been revealed by pathological examination or the like, by an external processing device (not shown), or by one or both of the feature amount calculation unit 43 and the classification unit 44. They are then stored in advance in the classification information storage unit 81 as a part of the classification information. When starting the classification, the classification unit 44 reads these from the classification information storage unit 81 and classifies the attributes of the tissue.
- The attributes of the classification target are tissue requiring follow-up observation and benign tumor tissue, and the feature amount used for the classification is the standard deviation Sd. I of the intercept.
- Red is assigned to pixels whose feature value satisfies 0 ≤ Sd. I < I 1, pink to pixels with I 1 ≤ Sd. I < I 2, light blue to pixels with I 2 ≤ Sd. I < I 3, and blue to pixels with I 3 ≤ Sd. I < I 4.
- No color is assigned to pixels whose feature value satisfies Sd. I ≥ I 4.
- FIG. 10 is a diagram schematically illustrating classification and color assignment when the classification item is follow-up determination 1.
- The known specimen is another specimen whose tissue attribute has been clarified in advance by a pathological examination or the like before examining a new specimen in the first embodiment. Further, in FIG. 10, the assigned ranges are indicated as follows.
- arrow A 31 indicates a range to which red is assigned
- arrow A 32 indicates a range to which pink is assigned
- arrow A 33 indicates a range to which light blue is assigned
- arrow A 34 indicates a range to which blue is assigned
- An arrow A 35 indicates a range to which no color is assigned.
- Group G 31 is a group of tissue requiring follow-up observation
- group G 32 is a group of benign tumor tissue.
- Therefore, when the classification item is follow-up determination 1, the tissue attribute of the specimen can be accurately classified by adopting the standard deviation Sd. I of the intercept as the feature amount.
- The information (color table) associating the ranges of Sd. I with the visual information (color and mask) is calculated, before classification using the feature amount is started, on the basis of the distribution of a plurality of groups of known specimens whose tissue attributes have been revealed by pathological examination or the like, by an external processing device (not shown), or by one or both of the feature amount calculation unit 43 and the classification unit 44. It is then stored in advance in the classification information storage unit 81 as a part of the classification information. When starting the classification, the classification unit 44 reads this from the classification information storage unit 81 and classifies the attributes of the tissue.
- The attributes of the classification target are tissue requiring follow-up observation and malignant tumor tissue, and the feature amount used for the classification is d(Mean M, Sd. I) (hereinafter simply referred to as d), calculated as a function of the mean Mean M of the mid-band fit and the standard deviation Sd. I of the intercept.
- This d is specifically defined as a linear combination of the two feature amounts Mean M and Sd. I.
- Red is assigned to pixels whose feature value d satisfies 0 ≤ d < d 1, pink to pixels with d 1 ≤ d < d 2, green to pixels with d 2 ≤ d < d 3, and yellow to pixels with d 3 ≤ d < d 4.
- No color is assigned to pixels whose feature value satisfies d ≥ d 4.
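A sketch of follow-up determination 2 follows. The weights defining the axis d and the thresholds standing in for d 1 through d 4 are illustrative placeholders, since in the disclosure they are derived from the distribution of known specimens:

```python
def feature_d(mean_m, sd_i, w_mean=1.0, w_sd=1.0):
    """Linear combination d(Mean M, Sd. I); the weights (the direction
    of the axis d) are assumed values for illustration."""
    return w_mean * mean_m + w_sd * sd_i

def classify_d(d, thresholds=(1.0, 2.0, 3.0, 4.0),
               hues=("red", "pink", "green", "yellow")):
    """Map d to a hue via the thresholds d 1..d 4 (illustrative values);
    pixels with d >= d 4 are left unpainted (mask)."""
    for t, hue in zip(thresholds, hues):
        if d < t:
            return hue
    return None

hue = classify_d(feature_d(1.2, 1.3))  # d = 2.5 falls in the green band
```

Combining two feature amounts into one scalar in this way lets a single color table separate groups that neither Mean M nor Sd. I separates on its own.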
- FIG. 11 is a diagram schematically illustrating classification and color assignment when the classification item is follow-up determination 2.
- FIG. 11 shows, as a scatter diagram, the feature amount distribution of known specimens in the feature amount space, and schematically shows the relationship between the value of d and the color assigned to the pixel.
- the known specimen is another specimen whose tissue attribute has been clarified in advance by a pathological examination or the like before examining a new specimen in the first embodiment.
- an arrow A 41 indicates a range for assigning red
- an arrow A 42 indicates a range for assigning pink
- an arrow A 43 indicates a range for assigning green
- an arrow A 44 indicates a range for assigning yellow
- An arrow A 45 indicates a range to which no color is assigned.
- Group G 41 is a group of tissue requiring follow-up observation
- group G 42 is a group of malignant tumor tissue.
- the value of the feature quantity d is defined by the distance from the origin along the axis d.
- Therefore, when the classification item is follow-up determination 2, the tissue attribute of the specimen can be accurately classified by adopting the feature amount d(Mean M, Sd. I), a linear combination of the mean Mean M of the mid-band fit and the standard deviation Sd. I of the intercept.
- The ratios of the mean Mean M of the mid-band fit and the standard deviation Sd. I of the intercept to the feature amount d (that is, the direction of the axis d), the values of d corresponding to d 1 through d 4, the color information for the arrows A 41, A 42, A 43, A 44 and A 45, and the color table associating these value ranges of d with the visual information (color and mask) are calculated, before classification using the feature amount is started, on the basis of the distribution of a plurality of groups of known specimens whose tissue attributes have been revealed by pathological examination or the like, by an external processing device (not shown), or by one or both of the feature amount calculation unit 43 and the classification unit 44. They are then stored in advance in the classification information storage unit 81 as a part of the classification information. When starting the classification, the classification unit 44 reads these from the classification information storage unit 81 and classifies the attributes of the tissue.
- The storage unit 8 stores information necessary for the amplification processing (the relationship between the amplification factor and the reception depth shown in FIG. 2), information necessary for the amplification correction processing (the relationship between the amplification factor and the reception depth shown in FIG. 3), information necessary for the attenuation correction processing (see equation (1)), and information on the window functions (Hamming, Hanning, Blackman, etc.) necessary for the frequency analysis processing.
- the storage unit 8 stores an operation program for executing the operation method of the ultrasonic observation apparatus 1 (medical diagnosis apparatus).
- This operation program can be recorded on a computer-readable recording medium such as a hard disk, a flash memory, a CD-ROM, a DVD-ROM, or a flexible disk and widely distributed. Recording of various programs on a recording medium or the like may be performed when the computer or the recording medium is shipped as a product, or may be performed by downloading via a communication network.
- The storage unit 8 having the above configuration is realized using a ROM (Read Only Memory) in which the various programs are installed in advance, and a RAM (Random Access Memory) that stores calculation parameters and data of each process.
- the various programs described above can also be obtained by downloading via a communication network.
- the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and may be wired or wireless.
- The control unit 9 is realized using a CPU (Central Processing Unit) having various calculation and control functions, various arithmetic circuits, and the like.
- The control unit 9 reads, from the storage unit 8, the stored information and various programs including the operation program of the ultrasonic observation apparatus 1, and executes various kinds of arithmetic processing related to the operation method of the ultrasonic observation apparatus 1.
- the control unit 9 and the calculation unit 4 may be configured using a common CPU or the like.
- FIG. 12 is a flowchart showing an outline of processing performed by the ultrasonic observation apparatus 1 having the above configuration.
- the input unit 6 accepts a selection input for classification items (step S1).
- FIG. 13 is a diagram illustrating a display example of a selection screen displayed by the display unit 7 when the input unit 6 receives a selection input of a classification item.
- The selection screen 101 shown in FIG. 13 displays the classification items and the attributes of the tissues to be classified (classification attributes).
- a frame cursor 102 indicating the currently selected category item is displayed.
- FIG. 13 shows a state where “tumor screening” is selected as the classification item.
- the user completes the selection input by moving the frame cursor 102 to a desired category using a mouse or the like and then clicking the mouse to confirm the selection.
- the input unit 6 outputs the received selection input signal to the control unit 9.
- the ultrasonic observation apparatus 1 first measures a new specimen with the ultrasonic probe 2 (step S2). Specifically, the ultrasonic transducer 21 of the ultrasonic probe 2 converts an electrical pulse signal into an ultrasonic pulse and sequentially transmits it to the specimen. Each ultrasonic pulse is reflected by the specimen and an ultrasonic echo is generated. The ultrasonic transducer 21 converts ultrasonic echoes into electrical echo signals.
- the frequency band of the pulse signal may be a wide band that substantially covers the linear response frequency band of the electroacoustic conversion of the pulse signal to the ultrasonic pulse in the ultrasonic transducer 21. Thus, it is possible to perform accurate approximation in the frequency spectrum approximation process described later.
- the signal amplifying unit 31 that has received the echo signal from the ultrasonic probe 2 amplifies the echo signal (step S3).
- the signal amplifying unit 31 performs amplification (STC correction) of the echo signal based on the relationship between the amplification factor and the reception depth shown in FIG. 2, for example.
- the various processing frequency bands of the echo signal in the signal amplifying unit 31 may be a wide band that substantially covers the linear response frequency band of the acoustoelectric conversion of the ultrasonic echo to the echo signal by the ultrasonic transducer 21. This is also because it is possible to perform accurate approximation in the frequency spectrum approximation processing described later.
- the B-mode image data generation unit 51 generates B-mode image data using the echo signal amplified by the signal amplification unit 31 (step S4).
- The control unit 9 may perform control to display, on the display unit 7, a B-mode image corresponding to the generated B-mode image data.
- the amplification correction unit 41 performs amplification correction on the signal output from the transmission / reception unit 3 so that the amplification factor is constant regardless of the reception depth (step S5).
- the amplification correction unit 41 performs amplification correction based on, for example, the relationship between the amplification factor and the reception depth shown in FIG.
- When a region of interest is set (step S6: Yes), the frequency analysis unit 42 calculates frequency spectra by performing frequency analysis by FFT calculation (step S7).
- In step S6, it is also possible to set the entire region of the image as the region of interest.
- When the region of interest is not set (step S6: No) and the input unit 6 receives an input of an instruction to end the processing (step S8: Yes), the ultrasound observation apparatus 1 ends the processing.
- When the region of interest is not set (step S6: No) and the input unit 6 does not receive an instruction to end the processing (step S8: No), the ultrasound observation apparatus 1 returns to step S6.
- FIG. 14 is a flowchart showing an overview of the processing executed by the frequency analysis unit 42 in step S7.
- the frequency analysis processing will be described in detail with reference to the flowchart shown in FIG.
- First, the frequency analysis unit 42 sets a counter k identifying the sound ray to be analyzed to an initial value k 0 (step S21).
- Next, the frequency analysis unit 42 sets the initial value Z(k) 0 of the data position (corresponding to the reception depth) Z(k), which represents the series of data (amplitude data group) acquired for the FFT calculation (step S22).
- FIG. 4 shows a case where the first data position of the sound ray SR k is set as the initial value Z (k) 0 as described above.
- the frequency analysis unit 42 acquires the amplitude data group to which the data position Z (k) belongs (step S23), and applies the window function stored in the storage unit 8 to the acquired amplitude data group (step S24). .
- By applying the window function to the amplitude data group in this way, the amplitude data group is prevented from becoming discontinuous at its boundaries, and the occurrence of artifacts can be avoided.
- the frequency analysis unit 42 determines whether or not the amplitude data group at the data position Z (k) is a normal data group (step S25).
- the amplitude data group needs to have a data number of a power of two.
- the number of data in the normal amplitude data group is 2 n (n is a positive integer).
- the amplitude data groups F 2 and F 3 are both normal.
- step S25 If the result of determination in step S25 is that the amplitude data group at data position Z (k) is normal (step S25: Yes), the frequency analysis unit 42 proceeds to step S27 described later.
- step S25 when the amplitude data group at the data position Z (k) is not normal (step S25: No), the frequency analysis unit 42 inserts zero data as much as the deficient amount into the normal amplitude data group. Generate (step S26). A window function is applied to the amplitude data group determined to be not normal in step S25 (for example, the amplitude data groups F 1 and F K in FIG. 4) before adding zero data. For this reason, discontinuity of data does not occur even if zero data is inserted into the amplitude data group. After step S26, the frequency analysis unit 42 proceeds to step S27 described later.
- Next, the frequency analysis unit 42 performs the FFT operation using the amplitude data group to obtain the frequency spectrum, that is, the frequency distribution of the amplitude (step S27). An example of this result is shown in FIG. 5.
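Steps S24 through S27 can be sketched as follows. The choice of a Hanning window and the exact padding layout are assumptions for illustration; the disclosure allows Hamming, Hanning, Blackman, and other windows, and a 50 MHz sampling frequency is mentioned for the A/D conversion:

```python
import numpy as np

def fft_spectrum(amplitude_group, fs_hz=50e6):
    """Window an amplitude data group, zero-pad it to a power of 2,
    and return (frequencies, intensity in dB) from the FFT."""
    data = np.asarray(amplitude_group, dtype=float)
    data = data * np.hanning(len(data))    # window first (step S24) ...
    n = 1 << (len(data) - 1).bit_length()  # next power of 2
    padded = np.zeros(n)
    padded[:len(data)] = data              # ... then insert zero data (step S26)
    spectrum = np.fft.rfft(padded)         # FFT operation (step S27)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs_hz)
    return freqs, 20.0 * np.log10(np.abs(spectrum) + 1e-12)

# A 5 MHz tone sampled at 50 MHz: 100 samples are padded to 128
freqs, spec_db = fft_spectrum(np.sin(2 * np.pi * 5e6 * np.arange(100) / 50e6))
```

Windowing before padding matches the point made above: because the data are tapered to zero at the boundaries first, appending zero data introduces no discontinuity.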
- the frequency analysis unit 42 changes the data position Z (k) by the step width D (step S28). It is assumed that the step width D is stored in advance in the storage unit 8.
- The step width D is desirably matched with the data step width used when the B-mode image data generation unit 51 generates the B-mode image data; however, a value larger than that data step width may also be set as the step width D.
- Subsequently, the frequency analysis unit 42 determines whether or not the data position Z(k) is larger than the maximum value Z(k) max in the sound ray SR k (step S29).
- When Z(k) is larger than Z(k) max (step S29: Yes), the frequency analysis unit 42 increments the counter k by 1 (step S30). This means that the processing shifts to the next sound ray.
- When Z(k) is equal to or smaller than Z(k) max (step S29: No), the frequency analysis unit 42 returns to step S23.
- the frequency analysis unit 42 performs an FFT operation on [(Z (k) max ⁇ Z (k) 0 +1) / D + 1] amplitude data groups for the sound ray SR k .
- [X] represents the maximum integer not exceeding X.
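As a quick check of this count, the bracketed expression can be evaluated directly. The helper below is hypothetical (not named in the document), with [X] implemented as floor division:

```python
def num_amplitude_groups(z0, z_max, d):
    """Number of amplitude data groups FFT'd per sound ray:
    [(Z(k)max - Z(k)0 + 1) / D + 1], where [X] is the greatest
    integer not exceeding X."""
    return (z_max - z0 + 1) // d + 1

# e.g. data positions 0..99 with step width D = 15 give 100 // 15 + 1 = 7 groups
```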
- After step S30, the frequency analysis unit 42 determines whether or not the counter k is larger than the maximum value kmax (step S31). When the counter k is greater than kmax (step S31: Yes), the frequency analysis unit 42 ends the series of FFT processes. On the other hand, when the counter k is equal to or less than kmax (step S31: No), the frequency analysis unit 42 returns to step S22.
- the frequency analysis unit 42 thus performs the FFT operation a plurality of times for each of the (kmax − k0 + 1) sound rays in the region of interest.
- the attenuation correction unit 431 performs attenuation correction on the frequency spectrum calculated by the frequency analysis unit 42 (step S9).
- the attenuation correction unit 431 obtains a new frequency spectrum by performing a correction process that adds the attenuation amount A of equation (1) above to the intensity I for all frequencies f. In this way, a frequency spectrum in which the contribution of attenuation accompanying the propagation of the ultrasound is reduced can be obtained.
- a frequency spectrum C2 shown in FIG. 6 is a curve obtained as a result of performing the attenuation correction process on the frequency spectrum C1.
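A minimal sketch of this correction, assuming intensities in dB, frequency in MHz, and depth in cm; the default attenuation rate α = 0.5 dB/cm/MHz is an illustrative value, not one fixed by the text:

```python
def attenuation_correct(freqs_mhz, intensities_db, depth_cm, alpha=0.5):
    """Apply the attenuation correction of equation (1): A(f, z) = 2*alpha*z*f.

    alpha is the attenuation rate, depth_cm the reception depth z, and
    freqs_mhz the frequency axis of the spectrum. The round-trip
    attenuation A is added back to each intensity I, yielding the
    corrected spectrum (C1 -> C2 in FIG. 6).
    """
    return [i + 2.0 * alpha * depth_cm * f
            for f, i in zip(freqs_mhz, intensities_db)]
```

At a depth of 4 cm with α = 0.5, a component at 1 MHz is boosted by 4 dB and one at 2 MHz by 8 dB, so deeper, higher-frequency content is restored more strongly.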
- the approximation unit 432 approximates the attenuation-corrected frequency spectrum (corrected frequency spectrum) with a linear expression by performing regression analysis in a predetermined frequency band (step S10).
- the approximating unit 432 calculates the slope a, the intercept b, and the midband fit c as parameters necessary for the feature amount calculation, and writes and stores them in the classification information storage unit 81.
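The regression of step S10 and the three parameters can be sketched as an ordinary least-squares line fit; evaluating the fitted line at the band centre to obtain the midband fit c is an assumption consistent with the slope/intercept description, not a formula stated in the text:

```python
def spectrum_features(freqs, levels, f_low, f_high):
    """Least-squares line through the corrected spectrum in [f_low, f_high].

    Returns (a, b, c): slope a, intercept b, and midband fit
    c = a * f_mid + b evaluated at the band centre, as in step S10.
    """
    pts = [(f, y) for f, y in zip(freqs, levels) if f_low <= f <= f_high]
    n = len(pts)
    sx = sum(f for f, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(f * f for f, _ in pts)
    sxy = sum(f * y for f, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # regression slope
    b = (sy - a * sx) / n                            # regression intercept
    f_mid = (f_low + f_high) / 2.0
    c = a * f_mid + b                                # midband fit
    return a, b, c
```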
- the feature quantity calculation unit 43 refers to the classification information storage unit 81 and calculates a feature quantity necessary for classification according to the classification item selected in step S1 (step S11). For example, when tumor screening is selected as the classification item, the feature amount calculation unit 43 calculates the standard deviation Sd.M of the midband fit obtained in the unit region within the region of interest, as shown in FIG.
- the classification unit 44 classifies the tissue attributes using the feature amount calculated by the feature amount calculation unit 43, and assigns a color corresponding to the classification result to each pixel in the region of interest (step S12). For example, as described with reference to FIGS. 8 to 11, the color assignment is determined by the feature amount selected according to the classification item, and is performed for each unit region.
- the feature amount image data generation unit 52 generates feature amount image data by superimposing, as visual information, colors on each pixel of the B-mode image data generated by the B-mode image data generation unit 51, based on the pixel color assignment information sent from the classification unit 44 via the control unit 9 (step S13).
- the display unit 7 displays a feature amount image corresponding to the feature amount image data generated by the feature amount image data generation unit 52 under the control of the control unit 9 (step S14).
- FIG. 15 is a diagram schematically illustrating a display example of the feature amount image on the display unit 7.
- a feature amount image 201 shown in the figure includes an examination content display area 202, an image display area 203, and a color characteristic display area 204.
- the examination content display area 202 is provided at the top of the screen, and displays information such as classification items, feature amounts used for classification, classification attributes, specimen IDs, and terms representing classification attributes for both poles of the color bar (described later).
- FIG. 15 illustrates a case where the classification item is tumor screening.
- the image display area 203 displays a composite image in which a color based on the classification result is superimposed on each pixel of the B-mode image.
- the image display area 203 displays the region of interest 231 and displays attributes in the region of interest 231 in a corresponding color.
- a red region D1 and a pink region D2 are displayed as malignant tumor tissue.
- the color characteristic display area 204 displays a color bar indicating the relationship between the color displayed in the image display area 203 and the feature value.
- Terms representing the corresponding classification attributes are displayed at the upper and lower poles of this color bar.
- When the classification item is “No. I: tumor screening”, “normal” / “abnormal (malignant)” is displayed, as shown in FIG.
- When the classification item is “No. II: malignant/benign discrimination”, “benign” / “malignant” is displayed; when the classification item is “No. III: follow-up judgment 1”, “benign” / “observation required” is displayed; and when the classification item is “No. IV: follow-up judgment 2”, “observation required” / “malignant” is displayed.
- The terms indicating the poles of the color bar and the information indicating the correspondence between the terms and the classification items are also stored in advance in the classification information storage unit 81 as part of the classification information before classification using the feature amounts is started.
- the image processing unit 5 reads these from the classification information storage unit 81 and classifies the attributes of the tissue.
- The process of step S4 and the processes of steps S5 to S14 may be performed in parallel so that the B-mode image and the feature amount image are displayed side by side.
- As described above, according to the first embodiment of the present invention, a plurality of types of feature amounts are calculated based on the received signal from the specimen, and the tissue attributes of the specimen are classified using, among the plurality of feature amounts, the feature amount determined according to a pre-selected classification item.
- Feature amount image data that assigns visual information corresponding to the classification result to each pixel of the image based on the received signal is then generated.
- As a result, the user can clearly distinguish the attributes of the tissue of the specimen.
- the attribute classification is performed using, as an index, the feature amount obtained by attenuation correction of the frequency spectrum obtained by frequency analysis. Compared with the case where a feature amount calculated without such correction is used, each group region in the feature amount space is obtained in a more clearly separated state, so that different attributes can be distinguished from each other more reliably.
- the ranges of the frequency feature amounts necessary for classification, the visual information (colors and mask) corresponding to the ranges, and the information associating the ranges with the visual information (color table) are stored in advance in the classification information storage unit 81 as part of the classification information.
- the classification information in the first embodiment was all derived, before classification by the classification unit 44, from specimens whose tissue attributes were made clear by pathological examination or the like.
- the medical diagnostic apparatus according to the second embodiment of the present invention has a function of updating the classification information while accumulating, as known specimen information, the feature amounts of the examination target part of specimens whose tissue attributes have been clarified by pathological examination or the like.
- FIG. 16 is a block diagram showing a configuration of an ultrasonic observation apparatus that is a medical diagnosis apparatus according to the second embodiment.
- the ultrasound observation apparatus 11 shown in the figure has the same configuration as the ultrasound observation apparatus 1 described in the first embodiment, except for the calculation unit 12 and the storage unit 13. For this reason, components shared with the ultrasound observation apparatus 1 are denoted by the same reference numerals.
- the calculation unit 12 includes an amplification correction unit 41, a frequency analysis unit 42, a feature amount calculation unit 43, a classification unit 44, and a classification information setting unit 121.
- the classification information setting unit 121 performs a process of updating the classification information when information on a sample whose tissue attribute has been clarified by pathological examination or the like is added as known sample information.
- the storage unit 13 includes the classification information storage unit 81, an attribute-determined image storage unit 131 that stores B-mode image data including examination target parts whose attributes have been determined by pathological examination or the like, and a known specimen information storage unit 132 that stores feature amount values in association with the tissue attributes of known specimens.
- FIG. 17 is a flowchart showing an outline of a known specimen information creation process performed by the ultrasound observation apparatus 11 having the above configuration.
- the known specimen information creation process will be described with reference to the flowchart shown in FIG.
- control unit 9 reads out the attribute-determined B-mode image data from the attribute-determined image storage unit 131, and causes the display unit 7 to display a B-mode image corresponding to the B-mode image data (step S41).
- the input unit 6 accepts a setting input of a data acquisition area, to be used for creating known specimen information, on the B-mode image being displayed (step S42).
- the data acquisition area is set as an area having a shape such as a circle, an ellipse, a square, a rectangle, or a sector.
- the frequency analysis unit 42 performs frequency analysis of the data acquisition area (step S43).
- the attenuation correction unit 431 performs attenuation correction on the frequency spectrum calculated by the frequency analysis unit 42 (step S44).
- the approximating unit 432 approximates the attenuation-corrected frequency spectrum with a linear expression by performing regression analysis in a predetermined frequency band (step S45).
- the frequency analysis processing, attenuation correction processing, and approximation processing here are performed in the same manner as the frequency analysis processing, attenuation correction processing, and approximation processing described in the first embodiment.
- the feature amount calculation unit 43 calculates feature amounts using, as a new population, the group of known specimens having the same attribute to which the newly calculated parameters have been added (step S46). For example, when the slope a, intercept b, and midband fit c in the data acquisition area are calculated as parameters, the mean Mean S and standard deviation Sd.S of the slope a, the mean Mean I and standard deviation Sd.I of the intercept b, and the mean Mean M and standard deviation Sd.M of the midband fit c are calculated as feature amounts over the population to which these parameters have been added.
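Step S46 can be sketched as below, assuming each known specimen contributes one (slope, intercept, midband fit) triple; the use of the sample (rather than population) standard deviation is an assumption, since the text does not specify the form:

```python
import statistics

def update_known_specimen_features(population, new_params):
    """Add newly measured (slope, intercept, midband-fit) triples to the
    population for one attribute and recompute the six feature amounts
    of step S46: Mean S / Sd.S, Mean I / Sd.I, Mean M / Sd.M.
    """
    population = population + new_params      # the new population
    slopes, intercepts, midbands = zip(*population)
    return {
        "Mean S": statistics.mean(slopes),
        "Sd.S": statistics.stdev(slopes),
        "Mean I": statistics.mean(intercepts),
        "Sd.I": statistics.stdev(intercepts),
        "Mean M": statistics.mean(midbands),
        "Sd.M": statistics.stdev(midbands),
    }
```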
- control unit 9 writes and stores the feature quantity calculated by the feature quantity calculation unit 43 in the known specimen information storage unit 132 in association with the attribute (step S47).
- a new attribute can be added as an attribute. For this reason, even when a certain medical facility wants to define a new attribute based on a standard different from that of other medical facilities, information on known specimens can be accumulated according to the new attribute. In addition, information on known specimens can be accumulated according to new attributes even when disease names, tissue names, or disease/tissue classification methods are changed or added due to revisions of rules such as academic society guidelines and handling standards.
- FIG. 18 is a flowchart showing an outline of processing performed by the classification information setting unit 121.
- the classification information update process performed by the classification information setting unit 121 according to the classification items will be described.
- the classification information setting unit 121 refers to the known sample information storage unit 132 and acquires known sample information (step S51). Specifically, the classification information setting unit 121 acquires information on the feature amount of each known specimen.
- the classification information setting unit 121 calculates the direction in which the feature amounts of the known specimens are separated (step S52). At this time, the classification information setting unit 121 takes as the separation direction the direction along which the attributes to be determined are best separated when the known specimens are grouped by tissue attribute in the feature amount space.
- FIG. 19 is a diagram schematically illustrating an outline of the separation direction determination process.
- FIG. 19 illustrates a case where the separation direction of two groups G51 and G52 having different attributes is determined.
- Let μ1 and σ1 be the mean and standard deviation of the distribution 301 obtained when the points in the feature amount space constituting group G51 (indicated by circles) are viewed along an arbitrary axis d′ of the feature amount space, and let μ2 and σ2 be the mean and standard deviation of the distribution 302 obtained when the points constituting group G52 (indicated by crosses) are viewed along the same axis d′.
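One common way to realize "best separated" with exactly these per-axis means and standard deviations is the Fisher criterion; the document does not name a specific criterion, so this choice is an assumption, and the brute-force angle search below is purely illustrative for a 2-D feature amount space:

```python
import math
import statistics

def fisher_score(group_a, group_b, theta):
    """Separation quality of two 2-D groups projected onto an axis d'
    at angle theta, using (mu1 - mu2)^2 / (sigma1^2 + sigma2^2)."""
    ux, uy = math.cos(theta), math.sin(theta)
    pa = [x * ux + y * uy for x, y in group_a]   # projections of G51
    pb = [x * ux + y * uy for x, y in group_b]   # projections of G52
    mu1, mu2 = statistics.mean(pa), statistics.mean(pb)
    v1, v2 = statistics.pvariance(pa), statistics.pvariance(pb)
    return (mu1 - mu2) ** 2 / (v1 + v2 + 1e-12)

def best_separation_angle(group_a, group_b, steps=180):
    """Brute-force search over axis directions (a sketch of step S52)."""
    return max((k * math.pi / steps for k in range(steps)),
               key=lambda t: fisher_score(group_a, group_b, t))
```

For two groups offset along the x axis, the search recovers an axis close to the x direction, as expected.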
- the classification information setting unit 121 creates a color table that associates the feature amount with the color assignment (step S53).
- the classification information setting unit 121 determines, for each region of a predetermined width along the separation direction set in step S52, the proportion of each tissue attribute among the specimens included in that region, and creates a color table that assigns colors according to the determination result.
- For example, the classification information setting unit 121 divides the axis d′ into regions of width σ1/10 and calculates the proportion of each tissue attribute among the specimens in each region. The classification information setting unit 121 then assigns different colors according to the proportion of points of group G51 in each region. For example, when dividing into three classes according to the proportion of points of group G51, a color table is created by assigning different colors to regions in which points of group G51 account for 70% or more of the whole, regions in which they account for 30% or more but less than 70%, and regions in which they account for less than 30%. Note that the number of divisions and the assigned colors can be set as appropriate, and it is also possible to set a region (mask region) to which no color is assigned.
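A sketch of the color-table construction using the three-region example from the text (70% / 30% thresholds); the concrete color names and the masking of empty bins are assumptions:

```python
def make_color_table(projections_a, projections_b, width):
    """Divide the separation axis into bins of the given width and assign
    a color by the proportion of group-G51 points in each bin (step S53).

    projections_a / projections_b are the specimen positions of the two
    groups projected onto the separation axis; width would be sigma1/10
    in the document's example.
    """
    lo = min(projections_a + projections_b)
    hi = max(projections_a + projections_b)
    table = []
    edge = lo
    while edge < hi:
        in_a = sum(1 for p in projections_a if edge <= p < edge + width)
        in_b = sum(1 for p in projections_b if edge <= p < edge + width)
        total = in_a + in_b
        if total == 0:
            color = "mask"                     # no specimens: no color assigned
        else:
            ratio = in_a / total
            color = "red" if ratio >= 0.7 else "pink" if ratio >= 0.3 else "mask"
        table.append(((edge, edge + width), color))
        edge += width
    return table
```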
- the classification information setting unit 121 writes and stores the created color table as classification information in the classification information storage unit 81 under the control of the control unit 9 (step S54).
- the feature image generation and display processing performed by the ultrasound observation apparatus 11 is the same as in the first embodiment.
- according to the second embodiment, the tissue attributes of the specimen can be classified based on the optimum feature amount according to the diagnosis content, and a feature amount image according to the diagnosis content can be generated efficiently.
- the user can clearly distinguish the attribute of the tissue of the specimen, and the classification accuracy can be improved by performing attenuation correction on the ultrasonic signal.
- the same effect as in the first embodiment can be obtained.
- the feature amounts of the examination target part of specimens whose tissue attributes have been clarified by pathological examination or the like are accumulated as known specimen information and the classification information is updated accordingly, so classification with higher accuracy can be performed.
- in view of the fact that the type of the ultrasound probe 2 differs depending on the application, the ultrasound observation apparatus 1 may change the classification items that can be selected according to the type of the ultrasound probe 2.
- the correspondence between the type of the ultrasound probe 2 and selectable classification items may be stored in the classification information storage unit 81.
- in this case, the ultrasound observation apparatus 1 is configured so that the apparatus itself can recognize in advance the type of the ultrasound probe 2 with which it is provided.
- Specifically, when an ultrasonic endoscope is used as the ultrasound probe 2, for example, a connection pin that allows the processing apparatus to discriminate the type of the ultrasonic endoscope may be provided at the end of the ultrasonic endoscope on the processing apparatus connection side. The processing apparatus can thereby determine the type of the connected ultrasonic endoscope from the shape of its connection pin.
- the color information for the feature amount (for example, the value of Mean M) and the color table that associates feature amount ranges with visual information were calculated in advance based on the distributions of the known specimens.
- the classification information storage unit 81 included in the storage unit 8 stores the color information and the color table as part of the classification information.
- the present invention is not limited to this embodiment, and may be configured as follows.
- The classification information storage unit 81 may instead store in advance, for all the known specimens necessary for constructing the distributions shown in the figures described above, the frequency spectra, the received signals, or the digital RF signals obtained by A/D conversion of the received signals in the transmitting and receiving unit 3.
- In this case, the feature amount calculation unit 43 calculates in advance the feature amounts of all the known specimens from this information, the classification unit 44 calculates part of the classification information (that is, the color information and color table described above) from these feature amounts, and the classification information storage unit 81 stores the classification information again.
- As described above, the information used as “known specimen information” may be information calculated based on a received signal from a specimen whose attribute is known, or the received signal itself. In either case, the same effect as in the first embodiment of the present invention can be obtained. Needless to say, this modification may also be applied to the second embodiment of the present invention.
- for an ultrasound probe 2 whose use is limited, the selectable classification items may be limited to those corresponding to that use.
- Examples of the ultrasound probe 2 whose use is limited include a miniature probe having a small diameter. A miniature probe is inserted not only into the digestive tract and blood vessels but also, in particular, into the bile duct and pancreatic duct, and is used for performing the classification item “malignant/benign discrimination” in the table Tb of FIG. For this reason, the ultrasound observation apparatus 1 to which a miniature probe is connected may automatically perform the classification process for the classification item “malignant/benign discrimination”.
- FIG. 20 is a diagram schematically illustrating an outline of processing when the feature amount calculation unit 43 performs attenuation correction processing after performing frequency spectrum approximation processing.
- a straight line L10 illustrated in FIG. 20 is a regression line obtained as a result of the approximation unit 432 approximating the frequency band F of the frequency spectrum C1.
- the attenuation correction unit 431 obtains a straight line L1 by performing attenuation correction on the straight line L10.
- The feature amount calculation unit 43 calculates the slope, the intercept, and the midband fit of the straight line L1 as the feature amounts.
- a texture feature amount can also be used as a feature amount, in addition to the feature amounts calculated by frequency analysis.
- Examples of texture feature amounts include the energy, entropy, correlation, local homogeneity, and inertia of the luminance within the region of interest, as well as short-run emphasis, long-run emphasis, gray-level distribution, run-length distribution, and run percentage (details are disclosed in, for example, JP-A-4-236952).
- the tissue attributes may be classified by appropriately combining these texture feature amounts with the feature amounts obtained by frequency analysis.
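As an illustration of two of the texture feature amounts named above (energy and entropy), here is a minimal gray-level co-occurrence sketch; the pixel offset and the normalization are illustrative choices, not the method prescribed by JP-A-4-236952:

```python
import math

def glcm_energy_entropy(image, dx=1, dy=0):
    """Energy and entropy of a gray-level co-occurrence matrix (GLCM).

    image is a 2-D list of integer gray levels; (dx, dy) is the pixel
    offset defining co-occurring pairs. The GLCM is normalized to a
    probability distribution p, then energy = sum(p^2) and
    entropy = -sum(p * log2(p)).
    """
    counts = {}
    rows, cols = len(image), len(image[0])
    total = 0
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                pair = (image[y][x], image[y2][x2])
                counts[pair] = counts.get(pair, 0) + 1
                total += 1
    probs = [c / total for c in counts.values()]
    energy = sum(p * p for p in probs)
    entropy = -sum(p * math.log2(p) for p in probs)
    return energy, entropy
```

A perfectly uniform image has maximal energy (1.0) and zero entropy; a checkerboard-like pattern lowers energy and raises entropy.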
- a color or luminance pattern superimposed on the feature amount image may be analyzed to calculate a texture feature amount based on the pattern, and classification may be performed based on the texture feature amount.
- a luminance table that uses luminance as the visual information and expresses attribute differences by luminance differences may also be created.
- a user such as a doctor may be able to switch classification items during the examination.
- the control unit 9 may display the selection screen 101 shown in FIG.
- For example, switching to the classification item “II: malignant/benign discrimination” can then be performed with only a simple operation during the examination.
- By switching to, for example, follow-up judgment 2, it can be judged whether the lesion is malignant or benign, whether follow-up observation is needed, and whether immediate treatment is needed. This eliminates the need for a separate re-examination after screening and makes medical treatment very efficient.
- In FIG. 8 to FIG. 10, classification with one type of feature amount has been described. This means that classification is possible even when the feature amount space is one-dimensional.
- FIG. 11 illustrates classification with two types of feature amounts. This means that classification is possible when the feature amount space is two-dimensional.
- the type of feature amount used for classification, that is, the dimension of the feature amount space, is not limited to these examples, and may be three or more dimensions.
- In this case, the separation axis d″ shown in FIG. 19 is not a one-dimensional axis but a subspace of a feature amount space of two or more dimensions (a plane or higher).
- the average and the standard deviation have been described as examples of the statistic used as the feature amount, but the variance may also be used.
- the statistics are not limited to these one-dimensional statistics. Multidimensional statistics such as correlation coefficients, principal axes of inertia, inertia tensors, eigenvalues, and eigenvectors may be calculated for the groups G41 and G42 and used as feature amounts.
- When the feature amount space is three-dimensional or more, the distribution can be regarded as mass points with different masses distributed in a space of two or more dimensions. Therefore, statistics such as the center of gravity and mass moment of the distribution, or the sum of squares (energy) calculated by regarding the mass as an amplitude, may be calculated and used as feature amounts.
- The present invention can also be realized as other medical diagnostic apparatuses, such as an X-ray CT (Computed Tomography) apparatus, an MRI (Magnetic Resonance Imaging) apparatus, or an external ultrasound diagnostic apparatus that emits ultrasound from a body surface such as the chest or abdomen.
- the present invention can include various embodiments and the like without departing from the technical idea described in the claims.
Abstract
Description
FIG. 1 is a block diagram showing the configuration of an ultrasound observation apparatus, which is a medical diagnostic apparatus according to the first embodiment of the present invention. The ultrasound observation apparatus 1 shown in the figure is an apparatus for observing a specimen to be diagnosed using ultrasound.
A(f, z) = 2αzf ... (1)
Here, α is called the attenuation rate, z is the reception depth of the ultrasound, and f is the frequency.
The attributes to be classified (classification attributes) are normal tissue and malignant tumor tissue, and the feature amount used for classification is the standard deviation Sd.M of the midband fit. In this case, red is assigned to pixels whose feature amount value satisfies 0 ≤ Sd.M < M11, and pink is assigned to pixels corresponding to tissue with M11 ≤ Sd.M < M12. In contrast, no color is assigned to pixels whose feature amount value satisfies Sd.M ≥ M12 (denoted "mask" in FIG. 7). Tissue corresponding to feature amounts in the range to which no color is assigned is normal tissue. In this sense, Sd.M = M12 is a threshold for separating normal tissue from malignant tumor tissue. The correspondence between feature amounts and colors is set, for example, according to the degree of the tumor in malignant tumor tissue.
The attributes to be classified are malignant tumor tissue and benign tumor tissue, and the feature amount used for classification is the mean Mean M of the midband fit. In this case, no color is assigned to pixels with 0 ≤ Mean M < M21. In contrast, blue is assigned to pixels with M21 ≤ Mean M < M22, light blue to pixels with M22 ≤ Mean M < M23, yellow to pixels with M23 ≤ Mean M < M24, pink to pixels with M24 ≤ Mean M < M25, and red to pixels with Mean M ≥ M25.
The attributes to be classified are tissue requiring follow-up observation and benign tumor tissue, and the feature amount used for classification is the standard deviation Sd.I of the intercept. In this case, red is assigned to pixels with 0 ≤ Sd.I < I1, pink to pixels with I1 ≤ Sd.I < I2, light blue to pixels with I2 ≤ Sd.I < I3, and blue to pixels with I3 ≤ Sd.I < I4. In contrast, no color is assigned to pixels with Sd.I ≥ I4.
The attributes to be classified are tissue requiring follow-up observation and malignant tumor tissue, and the feature amount used for classification is d(Mean M, Sd.I), calculated as a function of the mean Mean M of the midband fit and the standard deviation Sd.I of the intercept (hereinafter simply written d). Specifically, d is defined as a linear combination of the two feature amounts Mean M and Sd.I. In this case, red is assigned to pixels with 0 ≤ d < d1, pink to pixels with d1 ≤ d < d2, green to pixels with d2 ≤ d < d3, and yellow to pixels with d3 ≤ d < d4. In contrast, no color is assigned to pixels with d ≥ d4.
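The range-to-color mappings above (for example, the tumor screening item) can be written as a simple lookup. The threshold values M11 and M12 are left as parameters, since the document gives them symbolically rather than as concrete numbers:

```python
def tumor_screening_color(sd_m, m11, m12):
    """Color assignment for the tumor screening classification item.

    Follows the ranges in the description: red for 0 <= Sd.M < M11,
    pink for M11 <= Sd.M < M12, and no color (mask) for Sd.M >= M12,
    where Sd.M = M12 separates normal from malignant tumor tissue.
    """
    if 0 <= sd_m < m11:
        return "red"
    if m11 <= sd_m < m12:
        return "pink"
    return None   # masked: corresponds to normal tissue
```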
The examination content display area 202 is provided at the top of the screen and displays information such as the classification item, the feature amount used for classification, the classification attributes, the specimen ID, and terms representing the classification attributes at the two poles of the color bar (described later). FIG. 15 illustrates the case where the classification item is tumor screening.
The image display area 203 displays a composite image in which colors based on the classification results are superimposed on the pixels of the B-mode image. The image display area 203 displays the region of interest 231 and displays the attributes within the region of interest 231 in the corresponding colors. In FIG. 15, a red region D1 and a pink region D2 are displayed as malignant tumor tissue.
The color characteristic display area 204 displays a color bar indicating the relationship between the colors displayed in the image display area 203 and the feature amount values.
In the first embodiment of the present invention, the ranges of the frequency feature amounts necessary for classification, the visual information (colors and mask) corresponding to those ranges, and the information associating the ranges with the visual information (color table) were stored in advance in the classification information storage unit 81 as part of the classification information. All of the classification information in the first embodiment was derived, before classification by the classification unit 44, from specimens whose tissue attributes were clear from pathological examination or the like. The medical diagnostic apparatus according to the second embodiment of the present invention has a function of updating the classification information while accumulating, as known specimen information, the feature amounts of the examination target part of specimens whose tissue attributes have been clarified by pathological examination or the like.
Although modes for carrying out the present invention have been described above, the present invention should not be limited only to the embodiments described above. For example, in view of the fact that the type of the ultrasound probe 2 differs depending on the application, the ultrasound observation apparatus 1 may change the selectable classification items according to the type of the ultrasound probe 2. The correspondence between the type of the ultrasound probe 2 and the selectable classification items may be stored in the classification information storage unit 81. In this case, the apparatus itself is configured to be able to recognize in advance the type of the ultrasound probe 2 with which it is provided. Specifically, when an ultrasonic endoscope is used as the ultrasound probe 2, for example, a connection pin that allows the processing apparatus to discriminate the type of the ultrasonic endoscope may be provided at the end of the ultrasonic endoscope on the processing apparatus connection side. The processing apparatus can thereby determine the type of the ultrasonic endoscope from the shape of the connection pin of the connected ultrasonic endoscope.
2 ultrasound probe
3 transmitting and receiving unit
4, 12 calculation unit
5 image processing unit
6 input unit
7 display unit
8, 13 storage unit
9 control unit
21 ultrasound transducer
31 signal amplification unit
41 amplification correction unit
42 frequency analysis unit
43 feature amount calculation unit
44 classification unit
51 B-mode image data generation unit
52 feature amount image data generation unit
81 classification information storage unit
101 selection screen
102 frame cursor
121 classification information setting unit
131 attribute-determined image storage unit
132 known specimen information storage unit
201 feature amount image
202 examination content display area
203 image display area
204 color characteristic display area
231 region of interest
431 attenuation correction unit
432 approximation unit
C1, C2 frequency spectrum
G11, G12, G21, G22, G31, G32, G41, G42, G51, G52 group
Tb table
Claims (14)
- A medical diagnostic apparatus comprising: a feature amount calculation unit that calculates a plurality of types of feature amounts based on a received signal received from a specimen; a classification unit that classifies attributes of tissue of the specimen using, among the plurality of types of feature amounts calculated by the feature amount calculation unit, a feature amount determined according to a pre-selected classification item, and assigns visual information corresponding to a classification result to each pixel of an image based on the received signal; and a feature amount image data generation unit that generates feature amount image data in which the visual information assigned to each pixel of the image based on the received signal is superimposed.
- The medical diagnostic apparatus according to claim 1, further comprising a classification information storage unit that stores classification information including at least one of: the attributes of the classification target tissue corresponding to the classification item, the type of feature amount to be used for classification, and information associating visual information with values of the feature amount, wherein the classification unit performs the classification and the assignment of the visual information with reference to the classification information storage unit.
- The medical diagnostic apparatus according to claim 2, further comprising: a known specimen information storage unit that stores, in association with the attribute, known specimen information including at least one of a received signal from a specimen whose attribute is known and information calculated based on that received signal; and a classification information setting unit that sets the classification information using the known specimen information stored in the known specimen information storage unit.
- The medical diagnostic apparatus according to any one of claims 1 to 3, wherein the feature amount calculation unit extracts a plurality of parameters based on the received signal received from a predetermined region of the specimen and calculates a feature amount using the plurality of parameters.
- The medical diagnostic apparatus according to claim 4, wherein the feature amount calculation unit calculates, as the feature amount, a statistic of parameters of the same type among the plurality of parameters.
- The medical diagnostic apparatus according to any one of claims 1 to 5, wherein the visual information is a variable constituting a color space.
- The medical diagnostic apparatus according to any one of claims 1 to 5, wherein the visual information is luminance.
- The medical diagnostic apparatus according to any one of claims 1 to 7, further comprising a display unit that displays a feature amount image corresponding to the feature amount image data generated by the feature amount image data generation unit.
- The medical diagnostic apparatus according to any one of claims 1 to 8, further comprising an input unit that accepts a selection input of the classification item.
- The medical diagnostic apparatus according to any one of claims 1 to 9, further comprising an ultrasound probe that transmits ultrasound to the specimen and receives, as the received signal, an electrical echo signal converted from an ultrasound echo reflected by the specimen.
- The medical diagnostic apparatus according to claim 10, further comprising a frequency analysis unit that calculates a frequency spectrum by analyzing the frequency of the echo signal, wherein the feature amount calculation unit calculates the plurality of types of feature amounts using the frequency spectrum.
- The medical diagnostic apparatus according to any one of claims 1 to 9, wherein the feature amount calculation unit calculates the plurality of types of feature amounts based on the luminance of each pixel of the image based on the received signal.
- A method for operating a medical diagnostic apparatus that generates diagnostic image data based on a received signal received from a specimen, the method comprising: a feature amount calculation step in which a feature amount calculation unit calculates a plurality of types of feature amounts from the received signal; a classification step in which a classification unit classifies attributes of tissue of the specimen using, among the plurality of types of feature amounts, a feature amount determined according to a pre-selected classification item, and assigns visual information corresponding to a classification result to each pixel of an image based on the received signal; and a feature amount image data generation step in which a feature amount image data generation unit generates feature amount image data in which the visual information assigned to each pixel of the image based on the received signal is superimposed.
- An operation program for a medical diagnostic apparatus that generates diagnostic image data based on a received signal received from a specimen, the program causing the apparatus to execute: a feature amount calculation step in which a feature amount calculation unit calculates a plurality of types of feature amounts from the received signal; a classification step in which a classification unit classifies attributes of tissue of the specimen using, among the plurality of types of feature amounts, a feature amount determined according to a pre-selected classification item, and assigns visual information corresponding to a classification result to each pixel of an image based on the received signal; and a feature amount image data generation step in which a feature amount image data generation unit generates feature amount image data in which the visual information assigned to each pixel of the image based on the received signal is superimposed.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015540935A JP5897227B1 (ja) | 2014-06-11 | 2015-04-24 | 医用診断装置、医用診断装置の作動方法および医用診断装置の作動プログラム |
EP15806921.1A EP3155971A4 (en) | 2014-06-11 | 2015-04-24 | Medical diagnostic device, medical diagnostic device operation method, and medical diagnostic device operation program |
CN201580003164.2A CN105828726B (zh) | 2014-06-11 | 2015-04-24 | 医用诊断装置以及医用诊断装置的工作方法 |
US15/180,569 US9655593B2 (en) | 2014-06-11 | 2016-06-13 | Medical diagnostic apparatus, method for operating medical diagnostic apparatus, and computer-readable recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-121002 | 2014-06-11 | ||
JP2014121002 | 2014-06-11 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/180,569 Continuation US9655593B2 (en) | 2014-06-11 | 2016-06-13 | Medical diagnostic apparatus, method for operating medical diagnostic apparatus, and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015190180A1 true WO2015190180A1 (ja) | 2015-12-17 |
Family
ID=54833287
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062606 WO2015190180A1 (ja) | 2014-06-11 | 2015-04-24 | 医用診断装置、医用診断装置の作動方法および医用診断装置の作動プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9655593B2 (ja) |
EP (1) | EP3155971A4 (ja) |
JP (1) | JP5897227B1 (ja) |
CN (1) | CN105828726B (ja) |
WO (1) | WO2015190180A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018175226A (ja) * | 2017-04-10 | 2018-11-15 | 富士フイルム株式会社 | 医用画像分類装置、方法およびプログラム |
JP2019076541A (ja) * | 2017-10-26 | 2019-05-23 | コニカミノルタ株式会社 | 医用画像処理装置 |
EP3395257A4 (en) * | 2015-12-24 | 2019-08-07 | Olympus Corporation | ULTRASONIC OBSERVATION DEVICE, METHOD OF OPERATING THE ULTRASONIC OBSERVATION DEVICE AND A PROGRAM FOR OPERATING THE ULTRASONIC OBSERVATION DEVICE |
JP2019535346A (ja) * | 2016-09-29 | 2019-12-12 | ゼネラル・エレクトリック・カンパニイ | Bラインを自動的に検出し、超音波スキャンの画像をスコア付けすることによる代表超音波画像の向上された視覚化および選択のための方法およびシステム |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6339872B2 (ja) * | 2014-06-24 | 2018-06-06 | オリンパス株式会社 | 画像処理装置、内視鏡システム及び画像処理方法 |
US10148972B2 (en) * | 2016-01-08 | 2018-12-04 | Futurewei Technologies, Inc. | JPEG image to compressed GPU texture transcoder |
JP6535694B2 (ja) * | 2017-02-22 | 2019-06-26 | JINS Inc. | Information processing method, information processing apparatus, and program |
WO2018161257A1 (zh) | 2017-03-07 | 2018-09-13 | Shanghai United Imaging Healthcare Co., Ltd. | Method and system for generating color medical images |
EP3588120B1 (en) * | 2018-06-26 | 2021-02-24 | Bruker BioSpin GmbH | System and method for improved signal detection in nmr spectroscopy |
US11564633B2 (en) * | 2018-12-21 | 2023-01-31 | Industrial Technology Research Institute | State assessment system, diagnosis and treatment system, and method for operating the diagnosis and treatment system |
JP7100160B2 (ja) * | 2019-01-30 | 2022-07-12 | Olympus Corporation | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus |
CN113952031 (zh) | 2020-07-21 | 2022-01-21 | Magnetically tracked ultrasound probe and systems, methods, and apparatuses for generating a 3D visualization thereof |
CN113702982A (zh) * | 2021-08-26 | 2021-11-26 | Langfang Xinsiwei Technology Co., Ltd. | Ultrasonic data imaging algorithm |
US20230389893A1 (en) * | 2022-06-03 | 2023-12-07 | Bard Access Systems, Inc. | Ultrasound Probe with Smart Accessory |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011155168A1 (ja) * | 2010-06-07 | 2011-12-15 | Panasonic Corporation | Malignant tissue tumor detection method and malignant tissue tumor detection device |
WO2012011414A1 (ja) * | 2010-07-20 | 2012-01-26 | Olympus Medical Systems Corp. | Ultrasound diagnostic apparatus, method for operating ultrasound diagnostic apparatus, and operation program for ultrasound diagnostic apparatus |
JP5079177B2 (ja) * | 2010-11-11 | 2012-11-21 | Olympus Medical Systems Corp. | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4858124A (en) * | 1984-08-15 | 1989-08-15 | Riverside Research Institute | Method for enhancement of ultrasonic image data |
JPH0984793A (ja) * | 1995-09-20 | 1997-03-31 | Olympus Optical Co Ltd | Ultrasonic image processing apparatus |
US6238342B1 (en) * | 1998-05-26 | 2001-05-29 | Riverside Research Institute | Ultrasonic tissue-type classification and imaging methods and apparatus |
JP3986866B2 (ja) * | 2002-03-29 | 2007-10-03 | Matsushita Electric Industrial Co., Ltd. | Image processing apparatus and ultrasonic diagnostic apparatus |
US20040209237A1 (en) * | 2003-04-18 | 2004-10-21 | Medispectra, Inc. | Methods and apparatus for characterization of tissue samples |
US7175597B2 (en) * | 2003-02-03 | 2007-02-13 | Cleveland Clinic Foundation | Non-invasive tissue characterization system and method |
US20110257527A1 (en) * | 2010-04-20 | 2011-10-20 | Suri Jasjit S | Ultrasound carotid media wall classification and imt measurement in curved vessels using recursive refinement and validation |
WO2012063976A1 (ja) * | 2010-11-11 | 2012-05-18 | Olympus Medical Systems Corp. | Ultrasound diagnostic apparatus, method for operating ultrasound diagnostic apparatus, and operation program for ultrasound diagnostic apparatus |
CN102834059B (zh) * | 2010-11-11 | 2013-12-04 | Olympus Medical Systems Corp. | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus |
EP2599441B1 (en) * | 2010-11-11 | 2019-04-17 | Olympus Corporation | Ultrasonic observation apparatus, method of operating the ultrasonic observation apparatus, and operation program of the ultrasonic observation apparatus |
JP5984243B2 (ja) * | 2012-01-16 | 2016-09-06 | Toshiba Medical Systems Corporation | Ultrasound diagnostic apparatus, medical image processing apparatus, and program |
US10206661B2 (en) * | 2012-09-07 | 2019-02-19 | Empire Technology Development Llc | Ultrasound with augmented visualization |
KR101993716B1 (ko) * | 2012-09-28 | 2019-06-27 | Samsung Electronics Co., Ltd. | Apparatus and method for lesion diagnosis using category-based diagnostic models |
2015
- 2015-04-24 EP EP15806921.1A patent/EP3155971A4/en not_active Withdrawn
- 2015-04-24 JP JP2015540935A patent/JP5897227B1/ja active Active
- 2015-04-24 CN CN201580003164.2A patent/CN105828726B/zh active Active
- 2015-04-24 WO PCT/JP2015/062606 patent/WO2015190180A1/ja active Application Filing
2016
- 2016-06-13 US US15/180,569 patent/US9655593B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3155971A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3395257A4 (en) * | 2015-12-24 | 2019-08-07 | Olympus Corporation | Ultrasound observation device, method of operating the ultrasound observation device, and program for operating the ultrasound observation device |
US11176640B2 (en) | 2015-12-24 | 2021-11-16 | Olympus Corporation | Ultrasound observation device, method of operating ultrasound observation device, and computer-readable recording medium |
JP2019535346A (ja) * | 2016-09-29 | 2019-12-12 | General Electric Company | Method and system for enhanced visualization and selection of representative ultrasound images by automatically detecting B-lines and scoring images of an ultrasound scan |
JP2018175226A (ja) * | 2017-04-10 | 2018-11-15 | Fujifilm Corporation | Medical image classification apparatus, method, and program |
JP2019076541A (ja) * | 2017-10-26 | 2019-05-23 | Konica Minolta, Inc. | Medical image processing apparatus |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015190180A1 (ja) | 2017-04-20 |
US20160278743A1 (en) | 2016-09-29 |
CN105828726A (zh) | 2016-08-03 |
JP5897227B1 (ja) | 2016-03-30 |
CN105828726B (zh) | 2019-06-18 |
EP3155971A1 (en) | 2017-04-19 |
EP3155971A4 (en) | 2018-03-07 |
US9655593B2 (en) | 2017-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5897227B1 (ja) | Medical diagnostic apparatus, method for operating medical diagnostic apparatus, and operation program for medical diagnostic apparatus | |
JP5433097B2 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
WO2012063929A1 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
US20120310087A1 (en) | Ultrasonic diagnosis apparatus, operation method of the same, and computer readable recording medium | |
US10201329B2 (en) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and computer-readable recording medium | |
JP5974210B2 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
WO2018116892A1 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
WO2018142937A1 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
JP6289772B2 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
CN108366782B (zh) | Ultrasound diagnostic apparatus, method for operating ultrasound diagnostic apparatus, and recording medium | |
JP2016202567A (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
US9517054B2 (en) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and computer-readable recording medium | |
WO2016181869A1 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
US10617389B2 (en) | Ultrasound observation apparatus, method of operating ultrasound observation apparatus, and computer-readable recording medium | |
JP6010274B1 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
JP6138402B2 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
WO2015198713A1 (ja) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
CN113365560A (zh) | Ultrasound observation apparatus, method for operating ultrasound observation apparatus, and operation program for ultrasound observation apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015540935 Country of ref document: JP Kind code of ref document: A |
|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 15806921 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2015806921 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015806921 Country of ref document: EP |