WO2007114375A1 - Dispositif de diagnostic a ultrasons et procede de commande (Ultrasonic diagnostic apparatus and control method) - Google Patents


Info

Publication number
WO2007114375A1
WO2007114375A1 (PCT/JP2007/057219)
Authority
WO
WIPO (PCT)
Prior art keywords
image
ultrasonic
dimensional
images
processing
Prior art date
Application number
PCT/JP2007/057219
Other languages
English (en)
Japanese (ja)
Inventor
Naohisa Kamiyama
Yoko Okamura
Original Assignee
Kabushiki Kaisha Toshiba
Toshiba Medical Systems Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2006100225A external-priority patent/JP5002181B2/ja
Priority claimed from JP2006147265A external-priority patent/JP5165858B2/ja
Application filed by Kabushiki Kaisha Toshiba, Toshiba Medical Systems Corporation filed Critical Kabushiki Kaisha Toshiba
Priority to CN2007800045320A priority Critical patent/CN101378700B/zh
Priority to EP07740655.1A priority patent/EP1982654B1/fr
Publication of WO2007114375A1 publication Critical patent/WO2007114375A1/fr
Priority to US12/178,709 priority patent/US8696575B2/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0825 - Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the breast, e.g. mammography
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 - Displaying means of special interest
    • A61B 8/463 - Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52077 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging with means for elimination of unwanted signals, e.g. noise or interference
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 - Sonar systems specially adapted for specific applications
    • G01S 15/89 - Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 - Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8993 - Three dimensional imaging systems

Definitions

  • the present invention relates to an ultrasonic diagnostic apparatus that extracts and displays a minute structure in a living organ from a tissue echo signal, and an ultrasonic diagnostic apparatus control method.
  • Ultrasound diagnosis provides a real-time display of heart pulsation and fetal movement with a simple operation, just by applying an ultrasonic probe to the body surface, and because it is highly safe, examinations can be repeated. In addition, the system is smaller in scale than other diagnostic modalities such as X-ray, CT, and MRI, so it can be called a simple diagnostic method in which examination can easily be performed at the bedside.
  • the ultrasonic diagnostic equipment used for this purpose varies in the functions it provides, but compact units that can be carried in one hand have been developed, and because there is no radiation exposure, unlike X-rays, ultrasound can also be used in obstetrics and home medical care.
  • One application of ultrasonic diagnosis, which has these various advantages, is the early diagnosis of breast cancer. It is known that microcalcification often occurs in breast tissue as a sign of breast cancer; one or several microcalcified lesions are scattered locally. Since calcification is harder than biological tissue, it reflects ultrasonic waves well and is expected to appear with high brightness on the image. In practice, however, it is said to be difficult to identify microcalcifications of only a few hundred microns when actually observing the image.
  • speckle patterns are generated on an ultrasonic image due to random interference of ultrasonic waves.
  • This speckle pattern is used for the diagnosis of cirrhosis.
  • in breast cancer screening, a speckle pattern is very similar to a microstructure such as a microcalcification, which is therefore often overlooked, and in some cases the image information is misleading. There is accordingly a need to remove speckle patterns in breast cancer diagnosis and the like, and techniques for this include, for example, spatial compounding, CFAR (Constant False Alarm Rate) processing, and the similarity filter.
  • MIP processing: as another technique for extracting a microstructure represented by microcalcification, there is MIP (maximum intensity projection) processing.
  • in MIP processing, the maximum luminance across a plurality of image frames is used as a representative value and projected onto a single frame. This is mainly used when displaying volume data as a 2D image in 3D image processing. Ideally, information from multiple frames is superimposed on one image, giving an image with a high amount of information.
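  • the MIP idea above can be sketched minimally as follows (an illustrative numpy sketch, not the patent's implementation; the frame stack and values are assumed):

```python
import numpy as np

def mip(frames):
    """Maximum intensity projection: for each pixel position, keep the
    maximum luminance found across a stack of 2-D image frames."""
    return np.stack(frames).max(axis=0)

# a bright spot present in only one frame still survives the projection
f1 = np.zeros((4, 4)); f1[1, 1] = 200.0   # frame with an isolated bright pixel
f2 = np.full((4, 4), 50.0)                # uniform background frame
projected = mip([f1, f2])
```

In this sketch `projected` holds 200 at (1, 1) and the background value 50 elsewhere, illustrating how information from several frames is superimposed on one image.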
  • the mammary gland to be diagnosed has a complicated structure, including breast ducts, and is not a homogeneous organ. Therefore, when conventional filtering is performed, the mammary gland structure is extracted (as a structure) at the same time as the microcalcification is detected, and the two cannot be sufficiently distinguished.
  • the present invention has been made in view of the above circumstances.
  • a continuous structure such as a mammary gland and a microstructure such as a microcalcified portion can be accurately distinguished and a microstructure can be extracted.
  • An object is to provide an ultrasonic diagnostic apparatus and an ultrasonic diagnostic apparatus control method.
  • a first aspect of the present invention is an ultrasonic diagnostic apparatus comprising: an ultrasonic transmission/reception unit that transmits an ultrasonic wave to a subject, receives the reflected wave, and generates echo signals of a plurality of frames based on the received reflected wave; an image data generation unit that generates three-dimensional image data composed of a plurality of two-dimensional images based on the echo signals of the plurality of frames; an image generation unit that generates a first image by performing processing that emphasizes a microstructure included in the three-dimensional image data; and a display unit that displays the first image.
  • a second aspect of the present invention is a control method of an ultrasonic diagnostic apparatus, comprising: transmitting an ultrasonic wave to a subject, receiving the reflected wave, and generating echo signals of a plurality of frames based on the received reflected wave; generating three-dimensional image data composed of a plurality of two-dimensional images based on the echo signals of the plurality of frames; generating a first image by performing processing that emphasizes a minute structure included in the three-dimensional image data; and displaying the first image.
  • FIG. 1 is a block diagram showing a configuration of an ultrasonic diagnostic apparatus according to the first embodiment.
  • FIG. 2 is a diagram showing an example of volume data to be subjected to three-dimensional CFAR processing.
  • FIG. 3 is a diagram showing an example of volume data to be subjected to 3D CFAR processing.
  • FIG. 4 is a diagram showing another example of a kernel pattern used in the three-dimensional CFAR processing.
  • FIG. 5A is a diagram for explaining the effect of the present microstructure extraction process.
  • FIG. 5B is a diagram for explaining the effect of the present microstructure extraction process.
  • FIG. 5C is a diagram for explaining the effect of the present microstructure extraction process.
  • FIG. 6 is a flowchart showing the flow of the microstructure extraction processing according to the first embodiment.
  • FIG. 7A is a diagram showing an example of a kernel pattern used in the two-dimensional CFAR process.
  • FIG. 7B is a diagram showing an example of a kernel pattern used in the two-dimensional CFAR process.
  • FIG. 8 is a diagram for explaining depth calculation processing (difference processing).
  • FIG. 9 is a diagram for explaining depth calculation processing (frequency analysis processing).
  • FIG. 10 is a flow chart showing the flow of the microstructure extraction processing according to the second embodiment.
  • FIG. 11 is a flow chart showing a flow of a microstructure extraction process according to the third embodiment.
  • FIG. 12 is a diagram showing an example of the position detection device 15.
  • FIG. 13 is a diagram for explaining a minute structure extraction process.
  • FIG. 14A shows a schematic diagram of a target image.
  • FIG. 14B shows a schematic diagram of a reference image.
  • FIG. 15 is a diagram showing an example of changes in signal intensity (image brightness) for each pixel.
  • FIG. 16 is a diagram showing an example of changes in signal intensity (image brightness) for each pixel.
  • FIG. 17A is a diagram showing an example of a display form of a microstructure extracted image.
  • FIG. 17B is a diagram for explaining the microstructure extraction image using the difference image.
  • FIG. 17C The upper part of FIG. 17C shows a normal B-mode image, and the lower part shows a microstructure extracted image.
  • FIG. 18 is a flowchart showing the flow of the microstructure extraction processing according to the fourth embodiment.
  • FIG. 19 is a flowchart showing the flow of a microstructure extraction process according to the fifth embodiment.
  • FIG. 20 is a flowchart showing the flow of a microstructure extraction process according to the sixth embodiment.
  • FIG. 21 is a block diagram showing a configuration of an image processing apparatus 2 according to the seventh embodiment.
  • FIG. 22 is a diagram showing an example of the minute structure extraction processing dedicated device 52.
  • FIG. 23 is a diagram showing a state in which a tomographic image group 40 having three-dimensional region information is acquired.
  • FIG. 24 is a diagram for explaining a second synthesis method for detecting the representative luminance from the acquired tomographic images.
  • FIG. 25A is a conceptual diagram for explaining an algorithm for representative luminance detection.
  • FIG. 25B is a conceptual diagram for explaining an algorithm for representative luminance detection.
  • FIG. 26 is an explanatory diagram of ROI as a calculation region to which the eighth embodiment is applied.
  • FIG. 1 is a block diagram showing the configuration of the ultrasonic diagnostic apparatus according to the first embodiment.
  • the ultrasonic diagnostic apparatus 11 includes an ultrasonic probe 12, an input device 13, a monitor 14, an ultrasonic transmission unit 21, an ultrasonic reception unit 22, a B-mode processing unit 23, a Doppler processing unit 24, an image generation unit 25, an image memory 26, an image synthesis unit 27, a control processor (CPU) 28, an internal storage unit 29, and an interface unit 30.
  • the ultrasonic probe 12 is provided with a plurality of piezoelectric transducers that generate ultrasonic waves based on drive signals from the ultrasonic transmission/reception unit 21 and that convert reflected waves from the subject into electric signals.
  • An ultrasonic wave is sent from the ultrasonic probe 12 to the subject P.
  • the transmitted ultrasonic waves are successively reflected by the discontinuous surface of the acoustic impedance of the body tissue and received by the ultrasonic probe 12 as an echo signal.
  • the amplitude of this echo signal depends on the difference in acoustic impedance at the discontinuous surface that results in reflection.
  • when the transmitted ultrasonic pulse is reflected by moving blood flow or by the surface of the heart wall, the echo undergoes a frequency shift, due to the Doppler effect, that depends on the velocity component of the moving body in the ultrasonic transmission direction.
  • the ultrasonic probe 12 included in the ultrasonic apparatus is capable of ultrasonically scanning a three-dimensional region of a subject.
  • the ultrasonic probe 12 either has a structure that mechanically swings the transducer array along the direction orthogonal to its arrangement direction to perform ultrasonic scanning over a three-dimensional region, or a structure that ultrasonically scans a three-dimensional region by electronic control using two-dimensionally arranged transducer elements.
  • with such a probe, the examiner can automatically acquire a plurality of two-dimensional tomographic images simply by bringing the probe body into contact with the subject. Since the swing speed is controlled, the exact distance between the acquired sections can also be determined.
  • a three-dimensional region can be ultrasonically scanned in the same time as acquiring a conventional two-dimensional tomographic image.
  • the input device 13 is connected to the device main body 11 and has various switches, buttons, a trackball, a mouse, a keyboard, and the like for passing various instructions, conditions, region of interest (ROI) settings, and image quality condition settings from the operator to the device main body 11. For example, when the operator operates the end button or the FREEZE button of the input device 13, transmission/reception of ultrasonic waves is terminated and the ultrasonic diagnostic apparatus is placed in a temporarily stopped state.
  • based on the video signal from the scan converter 25, the monitor 14 displays morphological information (B-mode image), blood flow information (average velocity image, variance image, power image, etc.), or a combination of these as an image.
  • the external storage device 16 is a device that reads information from recording media such as magnetic disks (floppy (registered trademark) disks, hard disks, etc.), optical disks (CD-ROM, DVD, etc.), and semiconductor memories. Information read from the various recording media is transferred to the control processor 28 via the interface unit 30.
  • the ultrasonic transmission unit 21 includes a trigger generation circuit, a delay circuit, a pulser circuit, and the like (not shown).
  • in the trigger generation circuit, rate pulses for forming transmission ultrasonic waves are repeatedly generated at a predetermined rate frequency fr Hz (cycle: 1/fr seconds).
  • in the delay circuit, each rate pulse is given the delay time necessary for focusing the ultrasonic wave into a beam and for determining the transmission directivity for each channel.
  • the trigger generation circuit applies a drive pulse to the probe 12 at a timing based on this rate pulse.
  • the ultrasonic transmission unit 21 has a function capable of instantaneously changing a transmission frequency, a transmission drive voltage, and the like in order to execute a predetermined scan sequence in accordance with an instruction from the control processor 28.
  • the change of the transmission drive voltage is realized by a linear amplifier type transmission circuit whose value can be switched instantaneously or a mechanism for electrically switching a plurality of power supply units.
  • the ultrasonic receiving unit 22 includes an amplifier circuit, an A/D converter, an adder, and the like (not shown).
  • the amplifier circuit amplifies the echo signal captured via the probe 12 for each channel.
  • in the A/D converter, the delay time necessary to determine the reception directivity is given to the amplified echo signal, after which addition processing is performed in the adder.
  • the reflection component from the direction corresponding to the reception directivity of the echo signal is emphasized, and a comprehensive beam for ultrasonic transmission / reception is formed by the reception directivity and the transmission directivity.
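  • the delay-then-add reception described above can be sketched as follows (a simplified numpy illustration with sample-unit delays; the channel layout and delay values are assumptions, and the wrap-around of np.roll is ignored for this sketch):

```python
import numpy as np

def delay_and_sum(channel_signals, delays):
    """Receive beamforming sketch: advance each channel by its focusing
    delay (in samples) and sum, so echoes arriving from the focal
    direction add coherently while off-axis echoes do not align."""
    out = np.zeros(channel_signals.shape[1])
    for ch, d in zip(channel_signals, delays):
        out += np.roll(ch, -d)  # advance this channel by d samples
    return out

# two channels receive the same echo at different times; after delaying,
# the pulses align and reinforce each other
ch0 = np.zeros(20); ch0[5] = 1.0
ch1 = np.zeros(20); ch1[7] = 1.0
beam = delay_and_sum(np.array([ch0, ch1]), [0, 2])
```

Here `beam` peaks with amplitude 2 at sample 5, where the delayed channels add coherently.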
  • the B-mode processing unit 23 receives the echo signal from the transmission / reception unit 21, performs logarithmic amplification, envelope detection processing, and the like, and generates data in which the signal intensity is expressed by brightness. This data is transmitted to the scan converter 25, and is displayed on the monitor 14 as a B-mode image representing the intensity of the reflected wave in luminance.
  • the Doppler processing unit 24 frequency-analyzes velocity information from the echo signal received from the transmission/reception unit 21, extracts blood flow, tissue, and contrast agent echo components due to the Doppler effect, and obtains blood flow information such as average velocity, variance, and power at multiple points.
  • the image generation unit 25 converts a scanning line signal sequence of an ultrasonic scan into a scanning line signal sequence of a general video format typified by television (scan conversion), and generates an ultrasonic diagnostic image as a display image.
  • the image generation unit 25 executes various image processing other than scan conversion.
  • in addition to the microstructure extraction processing described later, the image generation unit 25 performs, for example, a method of generating an average-brightness image from a plurality of scan-converted image frames (smoothing), a method using a differential filter (edge enhancement), volume rendering using a 3D reconstruction algorithm (3D image reconstruction), a method using differences between images (difference calculation), and the like.
  • the data before entering the image generation unit 25 may be referred to as “raw data”.
  • the image memory (cine memory) 26 is a memory for storing an ultrasonic image corresponding to a plurality of frames immediately before freezing, for example. By continuously displaying the images stored in the image memory 26 (cine display), an ultrasonic moving image can be displayed.
  • the image synthesis unit 27 combines the image received from the image generation unit 25 or the image memory 26 with character information of various parameters, scales, and the like, and outputs the resulting video signal to the monitor 14.
  • the control processor 28 has a function as an information processing apparatus (computer) and controls the operation of the main body of the ultrasonic diagnostic apparatus.
  • the control processor 28 reads out from the internal storage unit 29 a dedicated program for realizing the microstructure extraction function and a control program for executing predetermined image generation and display, expands them on its own memory, and executes calculation and control related to the various processes.
  • the internal storage unit 29 stores a predetermined scan sequence, a dedicated program for realizing the microstructure extraction function according to each embodiment, a control program for executing image generation and display processing, diagnostic information (patient ID, doctor's findings, etc.), a diagnostic protocol, transmission/reception conditions, a CFAR processing control program, a body mark generation program, and other data groups. It is also used for storing images from the image memory 26 as required.
  • the data in the internal storage unit 29 can also be transferred to an external peripheral device via the interface unit 30.
  • the interface unit 30 is an interface for the input device 13, a network, and an external storage device (not shown). Data such as ultrasonic images and analysis results obtained by the apparatus can be transferred via the interface unit 30 to other devices over the network.
  • Microstructure extraction function: next, the microstructure extraction function of the ultrasonic diagnostic apparatus 1 will be described. Microstructures that are localized in one place, typified by microcalcifications, and continuous structures that have a three-dimensionally continuous form, typified by mammary glands, differ essentially in the form of their spatial distribution. This function focuses on this point: for example, in the diagnosis of the breast, liver, spleen, etc., it distinguishes the two based on the form of their spatial distribution and generates images in which microstructures are actively extracted (microstructure extraction images).
  • CFAR processing is adopted as a method for removing the speckle pattern from the B-mode image.
  • the spatial compound method, which smooths the speckle pattern by superimposing transmission and reception signals from different directions, and the similarity filter, which removes the speckle pattern using its statistical properties, can also be employed.
  • the term “CFAR processing” is used in the radar field.
  • in this specification, the term “CFAR” is used for convenience, because of its relevance, in order to make the description more concrete. Nonetheless, the technique is not bound by the methods used in the radar field or by those that strictly use statistics.
  • the processing using the microstructure extraction function is targeted for three-dimensional image data.
  • the three-dimensional image data means volume data having a plurality of two-dimensional images, or data consisting of a plurality of different two-dimensional images (not necessarily constituting complete volume data).
  • hereinafter, a microstructure extraction process using volume data will be described for the sake of concreteness.
  • FIG. 2 and FIG. 3 are diagrams showing an example of volume data to be subjected to this CFAR process.
  • in each figure, the white rectangles are normal pixels making up the ultrasound image, the black rectangle is the pixel of interest P among the pixels making up the ultrasound image, and the shaded rectangles are the pixels located in the vicinity of the pixel of interest P and used in the averaging process (1) described later (neighboring pixels).
  • the pattern of neighboring pixels as shown in each figure is called “kernel”.
  • the CFAR processing using a three-dimensionally defined kernel as in this embodiment is called “three-dimensional CFAR processing”.
  • the CFAR processing according to the present embodiment is executed, for example, according to the following procedures (1) to (3).
  • neighboring pixels are arranged in a cross shape in order to shorten the calculation processing time.
  • the arrangement of neighboring pixels is not limited to this.
  • the average value may also be obtained using neighboring pixels arranged over a wider range, for example, as shown in FIG. 4.
  • in the averaging process (1), the average brightness value is obtained. However, the present invention is not limited to this, and the maximum brightness value may be obtained instead.
  • that is, the pixel value is determined based also on the neighboring pixels in the Z-axis (depth) direction.
  • continuous structures such as mammary glands are distributed three-dimensionally including the depth direction.
  • microstructures typified by microcalcifications are distributed only in localized regions. Therefore, by adopting a three-dimensional kernel pattern that includes not only neighboring pixels on the same ultrasound image but also the depth direction, it is possible to select high-luminance pixels having three-dimensional continuity.
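  • a minimal numpy sketch of such a three-dimensional CFAR step is given below (the kernel offsets, threshold, and subtract-then-threshold rule are illustrative assumptions; the patent's procedures (1) to (3) are not reproduced exactly):

```python
import numpy as np

def cfar_3d(volume, offsets, threshold):
    """Subtract the mean of the kernel neighbours from each voxel and
    keep only the excess above a threshold. Voxels whose depth-direction
    neighbours are also bright (continuous structures) raise their own
    neighbourhood mean and are suppressed; isolated bright voxels
    (microstructures) survive."""
    acc = np.zeros_like(volume, dtype=float)
    for dz, dy, dx in offsets:
        acc += np.roll(volume, shift=(dz, dy, dx), axis=(0, 1, 2))
    diff = volume - acc / len(offsets)
    return np.where(diff > threshold, diff, 0.0)

# cross-shaped kernel extended into the depth (z) direction
OFFSETS = [(-2, 0, 0), (-1, 0, 0), (1, 0, 0), (2, 0, 0),
           (0, -2, 0), (0, 2, 0), (0, 0, -2), (0, 0, 2)]

vol = np.zeros((9, 9, 9))
vol[4, 4, 4] = 100.0   # isolated bright voxel (microcalcification-like)
vol[:, 2, 2] = 100.0   # bright line continuous in depth (mammary-gland-like)
extracted = cfar_3d(vol, OFFSETS, threshold=60.0)
```

With these assumed offsets, the isolated voxel at (4, 4, 4) is kept while the line continuous along z is suppressed, which is the selection behaviour the three-dimensional kernel is meant to provide.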
  • FIGS. 5A, 5B, and 5C are diagrams for explaining the effect of the microstructure extraction process.
  • in the normal B-mode image shown in FIG. 5A, the duct structure and microcalcification are depicted, but the visibility is low and they are difficult to see.
  • the image shown in FIG. 5B was obtained by two-dimensional CFAR processing using a two-dimensionally defined kernel. In this image, the speckle pattern is reduced, but part of the mammary gland structure remains in addition to the microcalcification, and the visibility is somewhat poor.
  • the image shown in FIG. 5C is an image (microstructure extraction image) obtained by the present microstructure extraction process. In this microstructure extraction image, compared with the images shown in FIGS. 5A and 5B, the microcalcified portion is extracted better. This is because three-dimensional CFAR processing can discriminate and remove mammary glands (continuous structures) that are continuous in the depth direction.
  • the CFAR processing is effective in extracting a signal whose luminance deviates from the speckle fluctuation.
  • An arithmetic technique with a similar effect is a high-pass filter (signal processing that extracts only high-frequency components).
  • a high-pass filter may be used instead of this CFAR processing, but the CFAR processing may be superior in speckle pattern reduction.
  • FIG. 6 is a flowchart showing the flow of the microstructure extraction process according to the present embodiment.
  • first, the original image data is received and stored in a predetermined memory (step S1, step S2).
  • the image generation unit 25 sets, in the three-dimensional image data, a kernel having a predetermined three-dimensional pattern with a pixel included in the target tomographic image as the pixel of interest, and executes three-dimensional CFAR processing (step S3).
  • that is, CFAR processing using the luminance of a plurality of cross sections, in other words three-dimensional spatial information, is performed, and a microstructure extraction image is generated based on the target tomographic image.
  • the generated microstructure extraction image is displayed on the monitor 14 via the image synthesis unit 27 and is automatically saved in the storage device 29 or the like (step S4).
  • the microstructure extraction image can be displayed in dual or triplex display together with, for example, the B-mode image before CFAR processing or the B-mode image after CFAR processing.
  • the cursor is arranged so as to correspond to the same position in each image.
  • as necessary, steps S1 to S4 are repeatedly executed (step S5).
  • with the configuration described above, the filtering performed on the ultrasonic tomographic image is extended three-dimensionally, so that the speckle pattern is removed using not only the image itself but also information in the direction substantially perpendicular to the image (the depth direction). Accordingly, it is possible to discriminate between continuous structures distributed three-dimensionally and localized microstructures, and to generate a microstructure extraction image in which the microstructures are extracted. By observing the microstructure extraction image, doctors and others can quickly find microstructures that are difficult to distinguish from speckle patterns and that appear only in a specific cross-sectional image.
  • a desired image among the B-mode image before speckle pattern removal, the B-mode image after speckle pattern removal, and the microstructure extraction image stored in the storage unit can be displayed in a predetermined form such as dual or triplex display.
  • the cursor is arranged so as to correspond to the same position in each image. Therefore, an observer such as a doctor can display a microstructure extraction image in a desired display form and at a desired timing according to the purpose, and quickly and easily identify and observe a microstructure with a plurality of types of images. can do.
  • the image generation unit 25 executes a process (micro structure extraction process) related to the micro structure extraction function according to the present embodiment.
  • the control processor 28 reads out a dedicated program for realizing the microstructure extraction function according to the present embodiment from the internal storage unit 29, develops it on its own memory, and executes predetermined arithmetic and control operations.
  • the microstructure extraction function according to this embodiment performs microstructure extraction using a process for removing speckle patterns and a depth calculation process for evaluating spatial continuity in the depth direction.
  • a process for removing speckle patterns is executed for each frame.
  • as the process for removing the speckle pattern, CFAR processing using a two-dimensional kernel defined on the same ultrasonic image, as shown in FIGS. 7A and 7B, is executed (two-dimensional CFAR processing).
  • however, the process is not bound to this; instead of two-dimensional CFAR processing, similarity filter processing, spatial compound processing, or the like may be used.
  • the depth calculation process is a process for determining the continuity in the depth direction of the structure (high luminance region) on the ultrasonic image.
  • the following method can be employed.
  • the continuous structure remaining on the ultrasound image after two-dimensional CFAR processing is larger than a microcalcification and is considered to have continuity in the depth direction. From this point of view, the continuous structure is expected to appear on each successive frame image after the two-dimensional CFAR processing, for example in the state shown in FIG. 8. Therefore, when a difference image is generated from continuous or adjacent frame images (for example, by subtracting one frame image from an adjacent frame image), only the discontinuous microstructures (for example, microcalcifications) can be extracted from the difference image.
  • the frames used for the difference need not be adjacent; the difference may be taken every n frames (where n is a natural number) as necessary.
  • the size of the microstructure depends on the individual. Therefore, it is preferable that the operator can arbitrarily select between which frames the difference image is generated (that is, the value of n) by a predetermined operation from the input device 13.
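  • the inter-frame difference described above can be sketched as follows (a minimal illustration; the frame spacing n and the use of an absolute difference are assumptions for this sketch):

```python
import numpy as np

def difference_images(frames, n=1):
    """Subtract frames n apart: structures that persist across frames
    (continuous structures) cancel out, while microstructures that
    appear in only one frame remain in the difference image."""
    frames = [f.astype(float) for f in frames]
    return [np.abs(frames[i + n] - frames[i]) for i in range(len(frames) - n)]

# a gland-like structure present in both frames cancels; a speck
# present in only the first frame survives
f1 = np.full((3, 3), 80.0); f1[0, 0] = 180.0
f2 = np.full((3, 3), 80.0)
diff = difference_images([f1, f2])[0]
```

Here `diff` is 100 at the speck position (0, 0) and zero where the structure was common to both frames.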
  • when the ultrasonic probe 12 is provided with an oscillation mechanism, information on the distance between the plurality of tomographic images that are automatically acquired can be obtained simultaneously. It is therefore also possible to identify tomographic images separated by a desired distance (for example, 2 mm on average) based on the obtained distances between tomographic images.
  • the order of the differential image processing and the CFAR processing may be reversed.
  • That is, a difference image is first generated from continuous or adjacent frame images, and CFAR processing is then performed on the obtained two-dimensional image to remove unnecessary tissue images, whereby the minute structures that exist discontinuously may be extracted.
  • FIG. 9 is a graph showing the change, with respect to the depth direction, of the pixel values at corresponding positions across the N images.
  • As shown by graph A, for a pixel corresponding to something other than a minute structure, the change of the pixel value in the depth direction is gradual.
  • On the other hand, for a pixel corresponding to a microstructure, as shown by graph B, there is a portion where the change of the pixel value becomes steep. Therefore, when high-pass filter processing is performed in the depth direction, the gradual changes are removed, so that only minute structures such as calcified portions can be extracted.
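A crude version of this depth-direction high-pass filtering can be sketched as follows. The patent does not specify the filter; a first difference along the slice axis is assumed here as the simplest possible high-pass:

```python
import numpy as np

def depth_highpass(stack):
    """stack: (N, H, W) pixel values at corresponding positions across
    N slices. A first difference along the slice (depth-stack) axis acts
    as a crude high-pass filter: a gradual profile like graph A is
    suppressed, an abrupt one like graph B survives."""
    hp = np.abs(np.diff(stack.astype(float), axis=0))
    return hp.max(axis=0)   # strongest transition per pixel position
```

A gradually changing tissue profile yields a small response, while a one-slice spike (a calcification candidate) yields a large one.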
  • Image processing may also be performed in which each pixel value is compared with the corresponding pixel values and the maximum value is adopted as the new pixel value.
  • This image processing can be expressed as follows for the pixel value P(x, y) at the coordinates (x, y).
  • A positional shift between the two two-dimensional images may be corrected using a motion vector calculated between them.
  • A technology that corrects image blurring using image frames that are continuous in the time direction (a method that divides one image into several regions and determines, for each region, the direction and amount of movement between frames from the correlation of image patterns, etc.) is already installed in commercially available video cameras. If the display position of the image frame is corrected using a motion vector calculated by such a method, blurring in the vertical and horizontal directions is reduced, and the minute structures can be separated from the other structures nearly ideally.
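The per-region correlation search mentioned here can be sketched as a brute-force block-matching estimate of the inter-frame shift (a single-region simplification with illustrative names; commercial stabilizers work per region and sub-pixel):

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Brute-force block matching: find the integer (dy, dx) that, when
    applied to `curr`, minimises the sum of absolute differences
    against `prev`."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(curr, (dy, dx), axis=(0, 1))
            err = float(np.abs(shifted.astype(float) - prev.astype(float)).sum())
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```

Applying the returned vector to the current frame before differencing reduces the vertical and horizontal blur described above.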
  • FIG. 10 is a flowchart showing the flow of the microstructure extraction process according to the present embodiment.
  • The image generation unit 25 receives three-dimensional image data composed of the N two-dimensional images to be processed and stores it in a predetermined memory (step S11).
  • the image generation unit 25 sets a kernel having a predetermined two-dimensional pattern for each two-dimensional image, and executes a two-dimensional CFAR process (step S12).
  • the image generation unit 25 performs depth calculation processing using each two-dimensional image to generate a microstructure extraction image (step S13).
  • the generated microstructure extraction image is displayed on the monitor 14 via the image composition unit 27 and is automatically saved in the storage device 29 or the like (step S14).
  • In this configuration, the depth calculation process is executed after the speckle pattern removal process is performed on each two-dimensional image. Therefore, it is possible to extract high-luminance regions in each two-dimensional image, extract microstructures based on the distribution of those high-luminance regions in the depth direction, and visualize them as a microstructure extraction image. As a result, the same effect as in the first embodiment can be realized.
  • In this embodiment, the depth calculation process is performed directly using the N two-dimensional images, without performing the speckle pattern removal process of the second embodiment.
  • FIG. 11 is a flowchart showing the flow of the microstructure extraction process according to the present embodiment.
  • the image generation unit 25 receives three-dimensional image data composed of N target two-dimensional images and stores it in a predetermined memory (step S21).
  • the image generation unit 25 executes depth calculation processing using each two-dimensional image to generate a microstructure extraction image (step S22).
  • The generated microstructure extraction image is displayed on the monitor 14 via the image composition unit 27 and is automatically stored in the storage device 29 or the like (step S23).
  • In this embodiment, depth calculation processing is executed using a plurality of two-dimensional images constituting three-dimensional image data. Therefore, it is possible to extract minute structures based on the distribution, in the depth direction, of the structures included in the three-dimensional image data, and to display these as a microstructure extraction image. As a result, the same effects as in the first and second embodiments can be realized.
  • the ultrasonic diagnostic apparatus 1 is provided with a position detection apparatus 15 as necessary.
  • This apparatus detects information related to the position of the ultrasonic probe 12 with respect to an imaging target (that is, a diagnostic site).
  • The information on the position of the ultrasonic probe 12 includes the absolute position information of the probe 12, its relative position information, the position information before the probe 12 is moved together with the moving speed and time, and other information for specifying the position of the ultrasonic probe 12 at the time of scanning.
  • FIG. 12 is a diagram showing an example of the position detection device 15.
  • the position detection device 15 includes a movable stage 150 and a drive unit 151.
  • the ultrasonic probe 12 can be installed on the movable stage 150 via a dedicated adapter.
  • the driving unit 151 moves the installed ultrasonic probe 12 along the movable stage 150 based on the control from the control processor 28.
  • the drive unit 151 includes a rotary encoder or the like inside, detects the position of the ultrasonic probe 12 on the movable stage 150, and sequentially transmits the detection result to the control processor 28.
  • The breast to be diagnosed is placed at a predetermined position immersed in the water in the water tank 17, and is fixed so as not to move during the examination. Further, the ultrasonic probe 12 and the position detection device 15 are arranged on the bottom surface side of the water tank 17.
  • the control processor 28 executes self-propelled ultrasonic scanning by executing ultrasonic transmission / reception while controlling the drive unit 151 so that the ultrasonic probe 12 moves at a predetermined speed.
  • The image from the probe 12 is sent to the apparatus body as in the first embodiment. Further, the position information acquired from the drive unit 151 is used in real time to generate information related to the probe position, which will be described later, and is also written and managed as incidental information for each frame.
  • When a bright spot that appears to be a fine structure (hereinafter simply referred to as a bright spot) is observed in an ultrasonic tomogram (B-mode image), it is difficult to judge whether it is actually a structure such as a microcalcification or part of a tissue structure such as the mammary gland. In particular, it is said that this judgment cannot be made from a single still image. However, the two differ, for example, in the following points.
  • The composition of a microcalcification is harder than that of living tissue, and in principle a larger ultrasonic reflection signal should be obtained from it.
  • Our results also show that the signal level of a bright spot due to microcalcification is somewhat higher than the maximum value of the surrounding speckle pattern. However, when the luminance is displayed on the monitor, this difference in signal level is difficult to recognize visually.
  • Microcalcifications are localized microstructures, while biological tissues such as mammary glands are continuous, three-dimensionally connected structures having a spatial distribution pattern; the two are essentially different. Therefore, if three-dimensional continuity in the depth direction is taken into account, the two can be expected to be distinguishable.
  • On the basis of this difference, a microstructure extraction image is generated.
  • The processing using the present microstructure extraction function targets image group data.
  • Image group data means volume data having a plurality of two-dimensional images, or data composed of a plurality of different two-dimensional images (not necessarily constituting complete volume data).
  • Image group data can be acquired by mechanically oscillating the ultrasonic probe 12 along a direction orthogonal to the transducer arrangement direction and ultrasonically scanning a three-dimensional region.
  • The same data can be obtained by ultrasonically scanning a three-dimensional region under electronic control using an ultrasonic probe 12 in which the ultrasonic transducer elements are arranged two-dimensionally.
  • Image group data can likewise be acquired by imaging with the self-propelled scanning device shown in FIG. 12, or by manually capturing a plurality of tomograms with an ultrasonic probe in which the ultrasonic transducer elements are arranged one-dimensionally (provided with a position sensor if necessary).
  • FIG. 13 is a diagram for explaining the micro structure extraction process.
  • First, the target image 31 and the reference image 32 are selected from the plurality of two-dimensional images included in the image group data.
  • the target image 31 is an image that is a target of the microstructure extraction process.
  • The reference image 32 is a tomographic image spatially different from the target image 31 (for example, an image k frames away from it), and is used in the microstructure extraction process.
  • These images are preferably cross sections taken in the vertical (depth) direction from the probe body, as in B-mode diagnosis.
  • FIG. 14A shows a schematic diagram of a target image
  • FIG. 14B shows a schematic diagram of a reference image.
  • A difference image is generated by subtracting the reference image from the target image.
  • Specifically, a representative value of the pixels existing in the reference region Ri set on the reference image is determined, and this representative value is subtracted from the value of the target pixel (xi, yi).
  • The reference region Ri is set on the reference image with an arbitrary size, such that it contains the pixel (the corresponding pixel) whose coordinates on the reference image are the same as those of the target pixel.
  • the representative value of the reference region Ri may be any value as long as it represents the characteristics of the reference region Ri. Specific examples include a maximum value, an average value, a median value, and the like. In this embodiment, the maximum value is adopted as the representative value.
  • The generation of the difference image can be expressed as the following equation (1), where Pt and Pr denote pixel values on the target image and the reference image, respectively, and Ds denotes the difference image:

    Ds(xi, yi) = Pt(xi, yi) − max{ Pr(x, y) : (x, y) ∈ Ri }   … (1)
  • The difference image generated by the microstructure extraction process has the continuous structures and the random speckle pattern removed, and appropriately displays the microstructures.
  • A continuous structure remaining on the two-dimensional ultrasonic images constituting the image group data is larger than a microcalcification and is considered to have continuity in the depth direction. From this point of view, when attention is paid to one point in the ultrasonic tomogram, such a structure is expected to show a gradual change in signal intensity (image brightness), as indicated by signal intensity A in FIG. 15. On the other hand, a microstructure is expected to be contained only in a specific image, as indicated by signal intensity B in FIG. 15.
  • Accordingly, only the discontinuous microstructures are extracted in the difference image.
  • For each pixel (xi, yi) on the target image, the maximum value of the pixels existing in the reference region Ri set on the reference image is determined, and the difference image is generated as the microstructure extraction image. Therefore, even when a spatial positional shift occurs between the target image and the reference image, the portion visualized at the target pixel on the target image will still exist within the reference region on the reference image. As a result, the extraction performance for microstructures can be improved.
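The maximum-of-reference-region subtraction described above can be sketched as follows (a hypothetical NumPy implementation; the function name, window radius r, border clamping, and clipping to non-negative values are illustrative assumptions):

```python
import numpy as np

def extract_microstructure(target, reference, r=1):
    """Per-pixel sketch of equation (1): subtract, from each target
    pixel, the MAXIMUM of the reference region Ri (a (2r+1) x (2r+1)
    window around the corresponding reference pixel), so that small
    positional shifts between the two slices are absorbed."""
    h, w = target.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            rep = float(reference[y0:y1, x0:x1].max())  # representative value
            out[y, x] = max(float(target[y, x]) - rep, 0.0)
    return out
```

Because the region maximum is used instead of a single corresponding pixel, a structure shifted by up to r pixels between the two slices still cancels.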
  • The selection of the reference image and the selection of the reference region size in the microstructure extraction process are not particularly limited; examples include the following.
  • For example, the reference image can be selected based on the size of the imaging target (in this case, the calcification site).
  • That is, the difference between images decreases as the distance between frames decreases; if the size of the microstructure exceeds the interval between the images, the microstructure appears in both images and the difference result is expected to become zero. To avoid this, the distance between the target image and the reference image is preferably set larger than the size of the imaging target.
  • The reference region size can be selected based on the magnitude of the positional shift expected between the target image and the reference image.
  • That is, the reference region is given a size that exceeds the positional shift expected between the target image and the reference image.
  • The reference image can also be selected based on the size of the speckle pattern.
  • That is, when the difference processing is performed, if the interval between the target image and the reference image exceeds the size of the speckle pattern, the difference result no longer becomes zero, and the speckle pattern is extracted together with the microstructure. To avoid this, it is preferable to select a reference image whose distance from the target image is smaller than the size of the speckle pattern.
  • The speckle pattern size depends on the frequency of the transmitted ultrasound. Therefore, it is more preferable to select the reference image also in accordance with the frequency of the transmitted ultrasonic wave.
  • The reference region size and the reference image can also be selected based on the size of structures other than the imaging target. In other words, when the difference processing is performed, if the interval between the target image and the reference image exceeds the size of a structure other than the imaging target, the difference result does not become zero and that structure is extracted together with the imaging target. In order to avoid this, it is preferable to select the reference image so that its distance from the target image is smaller than the size of structures other than the imaging target.
  • the position of the reference image and the reference region size can be set to arbitrary values by manual operation via the input device 13.
  • the position of the reference image can also be determined by controlling the speed of rocking or scanning in the depth direction of the ultrasonic tomographic plane.
  • the interval between the target image and the reference image and the reference area size are both about several millimeters.
  • the position of the reference image can be automatically determined by the following method.
  • First, the image generation unit 25 selects the frame Fi-1 as a reference image and performs the difference processing between it and the target image Fi.
  • the sum S1 of luminance values (pixel values) on the difference image obtained as a result is calculated.
  • the same processing is executed using the other frames Fi-2, Fi-3,... As reference images, and the sums S2, S3,.
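The computation of the residual sums S1, S2, ... can be sketched as follows. Note that the text does not spell out how the sums are then compared; returning the candidate with the smallest residual is one plausible reading and is an assumption of this sketch, as are the function and parameter names:

```python
import numpy as np

def choose_reference(frames, i, offsets=(1, 2, 3)):
    """Compute the residual sum S_k = sum |Fi - Fi-k| for each candidate
    reference Fi-k, then return the offset with the smallest residual
    (the selection rule itself is an assumption, not from the source)."""
    sums = {k: float(np.abs(frames[i].astype(float)
                            - frames[i - k].astype(float)).sum())
            for k in offsets}
    return min(sums, key=sums.get), sums
```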
  • The ultrasonic diagnostic apparatus 1 can also set two or more reference images 32. For example, as shown in FIG. 16, the difference value between the target image Fi and the reference image Fi+m and the difference value between the target image Fi and the reference image Fi-m are calculated. If there is a magnitude relation between the two difference values, it can be estimated that the change in luminance value has the form indicated by signal intensity C; in this case, the smaller of the two difference values is adopted. Even when two or more reference images 32 are set, the selection criteria for each reference image are as described above. In addition, in order to enable more preferable extraction of the fine structure, it is preferable that the reference images are selected symmetrically with respect to the target image.
  • The display form according to the present embodiment displays the difference image as a microstructure extraction image together with information indicating the ultrasonic probe position at the time the target image used to generate the difference image was obtained.
  • The information indicating the ultrasonic probe position may be any information that fulfils this purpose; a typical example is a body mark as shown in FIG. 17A.
  • Another example is a schematic diagram of the ultrasonic probe 12 set on the scanning trajectory.
  • Such information indicating the ultrasonic probe position can be generated based on, for example, the probe position information detected by the position detection device 15.
  • In the case of the body mark in FIG. 17A, the image composition unit 27 generates a body mark indicating the probe position under the control of the control processor 28, combines it with the difference image, and sends the result to the monitor 14. Accordingly, the difference image can be displayed together with the information indicating the ultrasonic probe position in the form shown in FIG. 17A. If necessary, based on the probe position information of all the two-dimensional images constituting the image group data, the scanning range of the ultrasonic probe 12 or the already-displayed area may be shown in color on the body mark as a "trajectory".
  • The display form according to the present embodiment performs MIP (Maximum Intensity Projection) processing using a plurality of difference images obtained by the microstructure extraction process (for example, the difference images corresponding to the image group data), and displays the resulting MIP image as a microstructure extraction image.
  • In this display form, difference images having a certain reliability are extracted from the image group data using quantitative analysis, and the MIP processing according to the second embodiment is executed using them. That is, a luminance curve is generated for each pixel of the difference images corresponding to the image group data, and the amount of temporal change over a certain period (for example, a frame interval) and its standard deviation are calculated using the luminance curve. A pixel whose standard deviation shows a significantly different value (for example, a standard deviation equal to or greater than a predetermined threshold) is highly likely to correspond to a microstructure. Therefore, by extracting the difference images having such pixels and performing MIP processing using these images, the extraction accuracy of the microstructure can be improved.
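The standard-deviation screening combined with MIP can be sketched as follows. The text screens whole difference images containing such pixels; the sketch below simplifies this to per-pixel masking, and the threshold value is an assumption:

```python
import numpy as np

def reliable_mip(diff_stack, sd_thresh=5.0):
    """diff_stack: (N, H, W) difference images. The standard deviation
    of each pixel's luminance curve across the stack flags candidate
    microstructures; the maximum intensity projection is kept only
    where the deviation is at or above the threshold."""
    sd = diff_stack.std(axis=0)
    mip = diff_stack.max(axis=0)
    return np.where(sd >= sd_thresh, mip, 0.0)
```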
  • The display form according to the present embodiment can display the B-mode image before the microstructure extraction process, the microstructure extraction image, and the MIP image obtained by MIP processing using the difference images, in any of a superimposed display, a dual display, and a triplex display.
  • For example, the B-mode image before speckle pattern removal and the new image after removal can be distinguished by superimposing them in different basic colors.
  • In parallel display such as dual display, and in each display form in which different types of images are displayed simultaneously, a cursor is arranged so as to point to the same position in each image. Therefore, an observer such as a doctor can display the microstructure extraction image in a desired display form at a desired timing according to the purpose, and can quickly and easily identify and observe a microstructure using the plurality of types of images.
  • FIG. 18 is a flowchart showing the flow of the microstructure extraction process according to the present embodiment.
  • the display mode according to the first embodiment is adopted in the example of FIG.
  • the parameter group necessary for the minute structure extraction processing includes the number of reference images, the distance from the target image, the region for smoothing processing (maximum value calculation), and the like.
  • image group data relating to the breast is acquired by volume scanning using a predetermined method, and stored in the memory (step S33).
  • The image generation unit 25 calculates the representative value of the reference region for each reference image (step S34), and executes the microstructure extraction process described above using these values, thereby generating a plurality of difference images corresponding to the image group data (step S35).
  • the obtained difference image is displayed on the monitor 14 together with a body mark having probe position information, for example, and is automatically saved (step S36).
  • The diagnostic apparatus repeatedly executes the microstructure extraction process until an image freeze or a command to end this imaging mode is issued.
  • According to the configuration described above, a discontinuous microstructure is extracted using information on the direction substantially orthogonal to the image (the depth direction) as well.
  • In addition, the maximum-value smoothing in the microstructure extraction process enables effective removal of components that cannot be removed by a mere difference between the target image and the reference image and that would otherwise remain due to changes in the speckle pattern or displacement of structures in the cross-sectional direction.
  • The upper part of FIG. 17C shows a normal B-mode image, and the lower part of FIG. 17C shows a microstructure extraction image.
  • In the normal B-mode image shown in the upper part of FIG. 17C, part of the tissue is imaged in addition to the microcalcification site, and many dot-like high-intensity sites are scattered. It is therefore impossible to identify with the naked eye which points correspond to the microcalcification site.
  • In the microstructure extraction image shown in the lower part of FIG. 17C, only the microcalcification site is extracted and displayed as dot-like high-brightness sites.
  • With this ultrasonic diagnostic apparatus, the frame used as the reference image and the size of the reference region used in the microstructure extraction process can be selected arbitrarily. Accordingly, by setting the reference image frame and the reference region size according to the purpose of the examination and individual differences, a suitable image of the microstructure corresponding to each situation can be obtained.
  • With this ultrasonic diagnostic apparatus, various display forms can be adopted: displaying the microstructure extraction image together with a body mark on which the probe position and scanning range at the time of target image acquisition are set; displaying, in a predetermined form, a MIP image generated using the difference images obtained by the microstructure extraction process; displaying the images before and after microstructure extraction in a predetermined form; and the like. Therefore, by observing the microstructure extraction image in a desired display form, or by comparing the microstructure extraction images in various display forms, a doctor or the like can find in a short time a microstructure that is visually difficult to distinguish from the speckle pattern and that appears only in a specific cross-sectional image.
  • In the present embodiment, the case where a plurality of difference images are used in generating the MIP image has been taken as an example. However, without being bound by this, the MIP image may be generated using normal images.
  • The ultrasonic diagnostic apparatus 1 performs the microstructure extraction process described in the first embodiment after performing a predetermined speckle reduction process (pre-stage speckle reduction process) on the image group data.
  • the pre-stage speckle reduction process may be any process as long as it aims to remove at least one of the continuous structure and speckle pattern (including random ones).
  • Specific examples include three-dimensional CFAR (Constant False Alarm Rate) processing using the image group data, two-dimensional CFAR processing on each two-dimensional image constituting the image group data, a spatial compound method that smooths the speckle pattern by superimposing transmission and reception signals from different directions, and a similarity filter method that removes the speckle pattern using its statistical properties.
  • FIG. 19 is a flowchart showing the flow of the microstructure extraction processing according to the present embodiment.
  • the necessary parameter group is read, and image group data related to the breast is acquired by scanning and stored in the memory.
  • The image generation unit 25 performs the pre-stage speckle reduction process on the image group data (step S33'), calculates the representative value of the reference region for each reference image (step S34), and executes the microstructure extraction process described above using these values, thereby generating a plurality of difference images corresponding to the image group data (step S35).
  • the obtained difference image is displayed on the monitor 14 together with a body mark having probe position information, for example, and is automatically saved (step S36).
  • In this embodiment, the ultrasonic diagnostic apparatus 1 executes the microstructure extraction process according to the above embodiments using a two-dimensional imaging probe having a one-dimensional array of elements, rather than an oscillating probe for three-dimensional imaging or a two-dimensional transducer probe for three-dimensional imaging.
  • the configuration of the ultrasonic diagnostic apparatus according to the present embodiment is substantially the same as that shown in FIG. 1 except that the ultrasonic probe 12 is a two-dimensional imaging probe. Note that, in order to make the following description more specific, the case of performing the microstructure extraction process according to the first embodiment is taken as an example.
  • FIG. 20 is a flowchart showing the flow of the microstructure extraction process according to the present embodiment.
  • the subject is scanned (step S41).
  • The operator acquires different tomographic images while gradually changing the scanned tomographic plane. These images are sequentially recorded in the internal memory (or an internal hard disk, etc.) (step S42).
  • A microstructure extraction image is generated and displayed (steps S43 and S44).
  • If necessary, the parameters for the microstructure extraction process are changed (steps S45 and S46), and the processing of steps S43 and S44 is repeated based on the new parameters.
  • According to the configuration described above, ultrasonic images are acquired while the scanning position is changed little by little using a two-dimensional imaging probe, and are stored in a memory. The stored ultrasonic images are then read out, and the microstructure extraction process is executed using them. Therefore, the microstructure extraction function can be realized even if the ultrasonic probe 12 cannot scan a three-dimensional region by electronic or mechanical control. Although the microstructure extraction image cannot be obtained in real time, there is the advantage that the information of the reference images on both sides of the target image frame can be used.
  • In this embodiment, an image processing apparatus that executes the microstructure extraction process according to the fourth embodiment or the fifth embodiment using image group data acquired in advance will be described.
  • Such an image processing apparatus is realized by installing a program (microstructure extraction program) for executing each process related to the microstructure extraction function in a computer such as a workstation and expanding it on a memory.
  • The microstructure extraction program can be stored and distributed on a recording medium such as a magnetic disk, an optical disk, or a semiconductor memory.
  • FIG. 21 is a block diagram showing a configuration of the image processing device 5 according to the seventh embodiment.
  • The image processing apparatus 5 includes an operation unit 51, a microstructure extraction processing dedicated device 52, a display unit 53, a control unit 55, a storage unit 57, an image processing unit 58, and a transmission/reception unit 59.
  • The operation unit 51 includes a trackball, various switches, a mouse, a keyboard, and the like for taking various instructions, conditions, and so on from the operator into the apparatus 5.
  • the display unit 53 displays an ultrasonic image (B-mode image, microstructure extraction image, etc.), an input screen for performing a predetermined operation, and the like in a predetermined form.
  • the control unit 55 dynamically or statically controls each unit constituting the image processing apparatus 5.
  • The control unit 55 expands the microstructure extraction program stored in the storage unit 57 on its own memory, and controls the display unit 53, the image processing unit 58, and the like according to the expanded program.
  • the storage unit 57 stores a micro structure extraction program.
  • the storage unit 57 stores image group data acquired by the transmission / reception unit 59 via a network, or image data acquired via a removable storage medium.
  • The image processing unit 58 executes the microstructure extraction process described above.
  • the transmission / reception unit 59 transmits / receives information including image data to / from an ultrasound diagnostic apparatus and a PACS (Picture Archiving and Communication System) server via a network.
  • the image processing apparatus 5 includes a micro structure extraction processing dedicated device 52.
  • This device is not indispensable for executing the microstructure extraction process in the image processing apparatus 5, but is intended to realize better operability when performing the process after the fact. Specific examples include those having the following configurations.
  • FIG. 22 is a diagram showing an example of the minute structure extraction processing dedicated device 52.
  • The microstructure extraction processing dedicated device 52 includes an ultrasonic-probe-shaped joystick 521, interpretation dedicated buttons 522, and a trackball 523, in addition to a keyboard 520 of the kind attached to a general personal computer.
  • The joystick 521 is a lever-type operation tool that can be moved at least back and forth, and controls the position of the frame to be displayed within the image data under diagnosis.
  • Moving-image playback, reverse playback, frame-by-frame advance, fast-forward playback, and the like are linked to the position of the joystick 521 moved by the operator.
  • The joystick 521 preferably has the same shape as an ultrasonic probe. When playing back image frames to find a microstructure needed for diagnosis, the operator usually cannot stop at the very frame of the discovery (human recognition takes time) and must go back several frames to reconfirm. Even though this reconfirmation is after-the-fact diagnosis on a computer, the probe-shaped joystick lets the operator stop the "movement of the probe" and change the "scanning direction" with the same feeling as when actually scanning the subject.
  • the interpretation dedicated button 522 is assigned various functions for efficiently performing the interpretation combined with the microstructure extraction process.
  • the image displayed on the monitor can be switched by inputting a button between the images before and after the microstructure extraction process.
  • a button for separately saving a desired image frame as a still image, and an operation such as inputting annotation characters and arrows in the image are instructed.
  • The trackball 523 operates a pointer on the monitor; needless to say, a mouse may be substituted for it.
  • When the first microstructure extraction process is executed after the fact, steps S34 to S36 shown in FIG. 18 are performed on the image group data stored in the storage unit 57. When the second microstructure extraction process is executed, the processes from steps S33 to S36 shown in the same figure are likewise performed on the image group data stored in the storage unit 57.
  • In this manner, the first or second microstructure extraction process can be executed after the fact at a terminal such as a medical workstation.
  • According to this configuration, even though the images are observed after imaging, the microstructure extraction image can be displayed with the same feeling as stopping the movement of the probe or changing the scanning direction while actually scanning the subject. As a result, suitable microstructure extraction images can be observed efficiently and with high operability, and the work burden on doctors and other staff can be reduced.
  • The ultrasonic probe 12 can be classified mainly into the following types according to its array transducer system.
  • Here, "D" represents "dimension."
  • Strictly speaking, the transducers of arrays (2) to (5) are all arranged in two dimensions; this naming, which differs from strict mathematical usage, is nevertheless the common expression in the field of ultrasonic diagnostic equipment.
  • Among these, the arrays capable of forming a thin-slice sound field are arrays (2) to (5).
  • Next, a method of acquiring images using the above arrays will be described.
  • First, a tomographic image is acquired by ultrasonic transmission and reception with the ultrasonic probe 12, in the same manner as in the normal B mode.
  • Next, a cross section different from that of the first tomographic image is scanned by electronic deflection or mechanical oscillation.
  • By repeating this, as shown in FIG. 23, a tomographic image group 40 having three-dimensional region information is acquired.
  • Although FIG. 23 exaggerates it, the actual distance between the tomographic images is very small.
  • The cross sections of the tomographic images are not strictly parallel, but they can be regarded as almost parallel when the fan angle is small.
  • Although the number of images acquired is five in FIG. 23, about 3 to 10 images are preferable.
  • Next, in the image composition unit 27 in the block diagram of the ultrasonic diagnostic apparatus, the acquired tomographic image group is composited into one image, and the result is displayed on the display unit as a single tomographic image.
  • The image composition unit 27 assigns to each pixel the maximum luminance found at the corresponding spatial position across all the tomographic images of the image group, so that one final tomographic image is composited.
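The maximum-luminance compositing described above can be sketched in a few lines of Python with NumPy. The function name and the array layout (slice axis first) are illustrative assumptions, not the patent's implementation.

```python
import numpy as np


def composite_max(slices: np.ndarray) -> np.ndarray:
    """Composite a group of near-parallel tomographic images into one
    frame by taking, per pixel, the maximum luminance across slices.

    slices: array of shape (n_slices, rows, cols), e.g. 3-10 B-mode frames.
    """
    if slices.ndim != 3:
        raise ValueError("expected an array of shape (n_slices, rows, cols)")
    return slices.max(axis=0)  # per-pixel maximum over the slice axis
```

Because each pixel keeps the brightest value seen across the thin neighboring slices, a highly reflective microstructure survives in the output even if it appears in only one slice.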
  • Because of the thin-slice sound field (that is, the thin slice thickness), the echo signal from a microstructure is detected with very high sensitivity, and oversight of microstructures can be reduced even when the ultrasonic probe 12 or the subject moves slightly.
  • Alternatively, the image composition unit 27 first performs luminance analysis on the pixels at the corresponding coordinate position on each tomographic plane, as shown in the figure. If a luminance that should be determined to be a singular point (hereinafter referred to as "singular luminance"; details will be described later) is detected, the luminance of that pixel is set as the representative luminance value. If no singular luminance is detected, either the average luminance of all the pixels is set as the representative luminance value of that pixel, or the luminance of one of the tomographic images is set as the representative luminance value.
  • FIGS. 25A and 25B are diagrams showing an example of this method, assuming five tomographic images.
  • FIG. 25A shows the case where a singular luminance is detected, and FIG. 25B shows the case where no singular luminance is detected.
  • As shown in FIG. 25A, when a singular luminance is detected, the luminance level of the representative luminance value, indicated by the dotted arrow in the figure, is matched to the singular luminance.
  • As shown in FIG. 25B, when no singular luminance is detected, the luminance level of the representative luminance value, indicated by the dotted arrow in the figure, is matched to the average luminance level of all the pixels.
  • The luminance level of a singular point is determined in advance. Normally, a grayscale image has 256 gradations; therefore, with 256 gradations (maximum level 255), for example, the luminance of a pixel whose level is 200 or higher is set as the singular luminance. If multiple singular luminances are found, the maximum among them is used.
  • Image processing that regenerates the new luminance from the singular luminance or the average value is performed on all or part of the tomographic image; as a result, a new tomographic image is reconstructed and displayed on the display unit.
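Under the 256-gradation example above, with 200 as the singular-luminance threshold, the representative-value rule (the maximum singular luminance if one exists, otherwise the average across slices) could be sketched as follows. The function name and the choice of the mean as the fallback are assumptions; the text also allows simply taking one of the tomographic images instead.

```python
import numpy as np


def representative_image(slices: np.ndarray, singular_level: int = 200) -> np.ndarray:
    """Per pixel: if any slice reaches the singular-luminance threshold,
    take the maximum such value; otherwise take the mean across slices.

    slices: uint8 array of shape (n_slices, rows, cols); the threshold 200
    follows the 256-gradation grayscale example in the text.
    """
    peak = slices.max(axis=0)               # brightest value per pixel position
    mean = slices.mean(axis=0)              # fallback when no singular luminance
    singular = peak >= singular_level       # pixels where a singular luminance exists
    return np.where(singular, peak, mean.round()).astype(np.uint8)
```

Only pixels that actually contain a bright singular point keep the peak value; everywhere else the averaging suppresses the speckle-driven maxima that plain maximum compositing would retain.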
  • With simple maximum-luminance composition, for the speckle pattern, which is an interference-fringe pattern, the maximum value corresponding to a speckle peak is always detected; for this reason, the depiction quality is impaired. The present method eliminates this problem and improves the contrast with microcalcifications.
  • When a region of interest (ROI) 41 is set on the diagnostic image, this embodiment can be applied only within the ROI, whereby the calculation processing time can be shortened.
  • The operator can change the size and position of the ROI from the input device.
  • In the region 42 outside the ROI, any one image of the obtained plurality of tomographic images (for example, the image acquired first or last) may be displayed as it is.
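Restricting the compositing to the ROI while passing one original frame through elsewhere could look roughly like this; the rectangular ROI representation, function name, and choice of which frame forms the background are illustrative assumptions.

```python
import numpy as np


def composite_in_roi(slices: np.ndarray, roi: tuple[int, int, int, int],
                     background_index: int = 0) -> np.ndarray:
    """Apply maximum-luminance compositing only inside a rectangular ROI;
    outside it, one of the original frames is displayed unchanged.

    slices: array of shape (n_slices, rows, cols).
    roi: (top, bottom, left, right) pixel indices, half-open ranges.
    """
    top, bottom, left, right = roi
    out = slices[background_index].copy()  # region outside ROI: one frame as-is
    out[top:bottom, left:right] = slices[:, top:bottom, left:right].max(axis=0)
    return out
```

Since the per-pixel analysis runs only over the ROI sub-array, the computation cost scales with the ROI area rather than the full frame.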
  • As described above, according to the present embodiment, an ultrasonic diagnostic apparatus capable of suitably observing microstructures, such as the microcalcifications that have been overlooked in breast cancer screening, can be realized.
  • Each function according to the present embodiment can also be realized by installing, in a computer such as a workstation, a program that executes the corresponding processing, and expanding the program in memory.
  • In that case, a program capable of causing the computer to execute the method can be stored in and distributed on a recording medium such as a magnetic disk (floppy (registered trademark) disk, hard disk, etc.), an optical disk (CD-ROM, DVD, etc.), or a semiconductor memory.
  • Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the embodiments. For example, some constituent elements may be deleted from all the constituent elements shown in an embodiment. Furthermore, constituent elements across different embodiments may be appropriately combined.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Image Processing (AREA)

Abstract

According to the present invention, a speckle pattern is removed by using uniform information in the direction substantially orthogonal to the plurality of ultrasonic images included in three-dimensional image data (the depth direction). For example, three-dimensional or two-dimensional CFAR (constant false alarm rate) processing and arithmetic processing in the depth direction are performed. With this processing, a continuous structure that is continuously distributed in three dimensions is discriminated from a localized microstructure, so that a microstructure extraction image is generated.
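The CFAR processing named in the abstract is, in its two-dimensional cell-averaging form, roughly the following sketch: a pixel is kept only when it stands sufficiently above the local background estimated from a training ring around it. The window sizes, the scale factor, and the function name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np


def cfar_2d(img: np.ndarray, guard: int = 1, train: int = 2,
            scale: float = 1.5) -> np.ndarray:
    """Minimal 2D cell-averaging CFAR sketch.

    A pixel survives only if it exceeds `scale` times the mean of the
    surrounding training ring; the guard cells immediately around the
    pixel are excluded from the average so a bright target does not
    inflate its own background estimate.
    """
    img = img.astype(float)
    r = guard + train
    out = np.zeros_like(img)
    for y in range(r, img.shape[0] - r):
        for x in range(r, img.shape[1] - r):
            window = img[y - r:y + r + 1, x - r:x + r + 1]
            guard_block = img[y - guard:y + guard + 1, x - guard:x + guard + 1]
            train_sum = window.sum() - guard_block.sum()
            n_train = window.size - guard_block.size
            if img[y, x] > scale * train_sum / n_train:
                out[y, x] = img[y, x]  # keep only locally prominent pixels
    return out
```

Because speckle and continuously distributed tissue raise the local background along with the pixel under test, they are suppressed, while an isolated bright microstructure passes the adaptive threshold.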
PCT/JP2007/057219 2006-03-31 2007-03-30 Dispositif de diagnostic a ultrasons et procede de commande WO2007114375A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN2007800045320A CN101378700B (zh) 2006-03-31 2007-03-30 超声波诊断装置以及超声波诊断装置控制方法
EP07740655.1A EP1982654B1 (fr) 2006-03-31 2007-03-30 Dispositif de diagnostic a ultrasons et procede de commande
US12/178,709 US8696575B2 (en) 2006-03-31 2008-07-24 Ultrasonic diagnostic apparatus and method of controlling the same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2006100225A JP5002181B2 (ja) 2006-03-31 2006-03-31 超音波診断装置及び超音波診断装置制御方法
JP2006-100225 2006-03-31
JP2006-147265 2006-05-26
JP2006147265A JP5165858B2 (ja) 2006-05-26 2006-05-26 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/178,709 Continuation US8696575B2 (en) 2006-03-31 2008-07-24 Ultrasonic diagnostic apparatus and method of controlling the same

Publications (1)

Publication Number Publication Date
WO2007114375A1 true WO2007114375A1 (fr) 2007-10-11

Family

ID=38563642

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/057219 WO2007114375A1 (fr) 2006-03-31 2007-03-30 Dispositif de diagnostic a ultrasons et procede de commande

Country Status (3)

Country Link
US (1) US8696575B2 (fr)
EP (1) EP1982654B1 (fr)
WO (1) WO2007114375A1 (fr)


Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8105239B2 (en) * 2006-02-06 2012-01-31 Maui Imaging, Inc. Method and apparatus to visualize the coronary arteries using ultrasound
US8086010B2 (en) * 2006-06-30 2011-12-27 Kabushiki Kaisha Toshiba Medical image diagnosis apparatus and the control method thereof
EP2088932B1 (fr) 2006-10-25 2020-04-08 Maui Imaging, Inc. Procédé et appareil de production d'images ultrasonores au moyen d'une pluralité d'orifices
US9282945B2 (en) * 2009-04-14 2016-03-15 Maui Imaging, Inc. Calibration of ultrasound probes
JP5666446B2 (ja) 2008-08-08 2015-02-12 マウイ イマギング,インコーポレーテッド マルチアパーチャ方式の医用超音波技術を用いた画像形成方法及びアドオンシステムの同期方法
US9186093B2 (en) * 2008-08-20 2015-11-17 The Regents Of The University Of California Compositions and methods for screening cardioactive drugs
JP5438985B2 (ja) * 2009-02-10 2014-03-12 株式会社東芝 超音波診断装置及び超音波診断装置の制御プログラム
JP5485373B2 (ja) 2009-04-14 2014-05-07 マウイ イマギング,インコーポレーテッド 複数開口の超音波アレイ位置合せ装置
KR102322776B1 (ko) 2010-02-18 2021-11-04 마우이 이미징, 인코포레이티드 초음파 이미지를 구성하는 방법 및 이를 위한 다중-개구 초음파 이미징 시스템
US9737281B2 (en) 2010-03-16 2017-08-22 Konica Minolta Medical & Graphic, Inc. Ultrasound diagnostic equipment
US8934685B2 (en) * 2010-09-21 2015-01-13 General Electric Company System and method for analyzing and visualizing local clinical features
WO2012051305A2 (fr) 2010-10-13 2012-04-19 Mau Imaging, Inc. Appareil interne de sonde à ouvertures multiples et systèmes de câbles
KR101906838B1 (ko) 2010-10-13 2018-10-11 마우이 이미징, 인코포레이티드 오목한 초음파 트랜스듀서들 및 3d 어레이들
KR20140040679A (ko) * 2010-11-15 2014-04-03 인디언 인스티튜트 오브 테크놀로지 카라그푸르 향상된 초음파 이미징 시스템의 스펙클 저감/억제를 위한 향상된 초음파 이미징 방법/기술
CN104105449B (zh) 2011-12-01 2018-07-17 毛伊图像公司 使用基于声脉冲和多孔多普勒超声的运动检测
EP2797515A4 (fr) 2011-12-29 2015-07-22 Maui Imaging Inc Imagerie par ultrasons en mode m des trajets arbitraires
JP5984243B2 (ja) * 2012-01-16 2016-09-06 東芝メディカルシステムズ株式会社 超音波診断装置、医用画像処理装置及びプログラム
JP5852597B2 (ja) * 2012-02-13 2016-02-03 富士フイルム株式会社 光音響画像化方法および装置
EP2816958B1 (fr) 2012-02-21 2020-03-25 Maui Imaging, Inc. Détermination de la raideur d'un matériau à l'aide d'ultrasons à multiples ouvertures
KR102103137B1 (ko) 2012-03-26 2020-04-22 마우이 이미징, 인코포레이티드 가중 인자들을 적용함으로써 초음파 이미지 품질을 향상시키는 시스템들 및 방법들
KR102176193B1 (ko) 2012-08-10 2020-11-09 마우이 이미징, 인코포레이티드 다중 어퍼처 초음파 프로브들의 교정
KR102176319B1 (ko) 2012-08-21 2020-11-09 마우이 이미징, 인코포레이티드 초음파 이미징 시스템 메모리 아키텍처
JP6292836B2 (ja) * 2012-12-28 2018-03-14 キヤノン株式会社 被検体情報取得装置、表示方法、プログラム、処理装置
WO2014160291A1 (fr) 2013-03-13 2014-10-02 Maui Imaging, Inc. Alignement de groupements de transducteurs à ultrasons et ensemble de sonde à ouvertures multiples
US9883848B2 (en) 2013-09-13 2018-02-06 Maui Imaging, Inc. Ultrasound imaging using apparent point-source transmit transducer
JP6501452B2 (ja) * 2014-04-04 2019-04-17 キヤノン株式会社 画像処理装置及びシステム、画像処理方法、並びにプログラム
JP6274001B2 (ja) * 2014-05-08 2018-02-07 コニカミノルタ株式会社 超音波画像処理方法及びそれを用いた超音波診断装置
US9770216B2 (en) 2014-07-02 2017-09-26 Covidien Lp System and method for navigating within the lung
JP6722656B2 (ja) 2014-08-18 2020-07-15 マウイ イマギング,インコーポレーテッド ネットワークベース超音波イメージングシステム
EP3037041B1 (fr) 2014-12-23 2022-12-14 Samsung Medison Co., Ltd. Procédé et appareil de génération de marqueur anatomique
US10417529B2 (en) 2015-09-15 2019-09-17 Samsung Electronics Co., Ltd. Learning combinations of homogenous feature arrangements
WO2017132517A1 (fr) 2016-01-27 2017-08-03 Maui Imaging, Inc. Imagerie par ultrasons avec sondes de réseaux épars
JP6734079B2 (ja) * 2016-03-11 2020-08-05 キヤノンメディカルシステムズ株式会社 医用診断装置、および医用解析プログラム
JP7010504B2 (ja) * 2017-10-02 2022-01-26 株式会社Lily MedTech 医用画像装置
KR20200017791A (ko) * 2018-08-09 2020-02-19 삼성메디슨 주식회사 초음파 진단 장치, 초음파 영상을 표시하는 방법, 및 컴퓨터 프로그램 제품
JP2022178316A (ja) * 2021-05-20 2022-12-02 コニカミノルタ株式会社 超音波診断装置及びプログラム
CN114463365B (zh) * 2022-04-12 2022-06-24 中国空气动力研究与发展中心计算空气动力研究所 红外弱小目标分割方法、设备及介质
CN115670510B (zh) * 2023-01-05 2023-03-17 深圳迈瑞动物医疗科技股份有限公司 一种超声成像设备和超声c图像的成像方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000279416A (ja) * 1998-12-09 2000-10-10 General Electric Co <Ge> 3次元イメージング方法及びシステム
JP2002102223A (ja) * 2000-10-03 2002-04-09 Mitani Sangyo Co Ltd 超音波断層画像における面座標検出方法ならびにシステムおよび同方法がプログラムされ記録された記録媒体
JP2003061964A (ja) * 2001-08-24 2003-03-04 Toshiba Corp 超音波診断装置
JP2004129773A (ja) * 2002-10-09 2004-04-30 Hitachi Medical Corp 超音波イメージング装置及び超音波信号処理方法
WO2004081864A2 (fr) * 2003-03-13 2004-09-23 Koninklijke Philips Electronics N.V. Systeme d'imagerie en trois dimensions et procede pour signaler un objet d'interet dans un volume de donnees
JP2005205199A (ja) * 2003-12-26 2005-08-04 Fuji Photo Film Co Ltd 超音波画像処理方法及び超音波画像処理装置、並びに、超音波画像処理プログラム

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4016751A (en) * 1973-09-13 1977-04-12 The Commonwealth Of Australia Care Of The Department Of Health Ultrasonic beam forming technique
JPS61189476A (ja) 1985-02-18 1986-08-23 Mitsubishi Electric Corp レ−ダ装置
JP2001238884A (ja) 2000-02-29 2001-09-04 Toshiba Corp 超音波診断装置及び超音波による組織性状の定量解析方法
US6620103B1 (en) * 2002-06-11 2003-09-16 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system for low flow rate contrast agents
JP4373698B2 (ja) 2003-04-25 2009-11-25 株式会社東芝 超音波診断装置及び超音波診断支援プログラム
US20050131295A1 (en) * 2003-12-11 2005-06-16 Koninklijke Philips Electronics N.V. Volumetric ultrasound imaging system using two-dimensional array transducer
US8021301B2 (en) * 2003-12-26 2011-09-20 Fujifilm Corporation Ultrasonic image processing apparatus, ultrasonic image processing method and ultrasonic image processing program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YAMAGUCHI T. ET AL.: "Extraction of Quantitative Three-Dimensional Information from Ultrasonic Volumetric Images of Cirrhotic Liver", JPN. J. APPL. PHYS., vol. 39, no. 5B, PART 1, May 2000 (2000-05-01), pages 3266 - 3269, XP003018340 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009153925A (ja) * 2007-12-27 2009-07-16 Toshiba Corp 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム
EP2135557A1 (fr) * 2008-06-18 2009-12-23 Kabushiki Kaisha Toshiba Appareil et logiciel de diagnose par ultrason
US9568598B2 (en) 2008-06-18 2017-02-14 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and program
JP2010094224A (ja) * 2008-10-15 2010-04-30 Toshiba Corp 超音波画像診断装置、画像処理装置及び超音波画像診断支援プログラム
CN110264461A (zh) * 2019-06-25 2019-09-20 南京工程学院 基于超声乳腺肿瘤图像的微小钙化点自动检测方法

Also Published As

Publication number Publication date
EP1982654B1 (fr) 2018-10-03
EP1982654A1 (fr) 2008-10-22
US8696575B2 (en) 2014-04-15
EP1982654A4 (fr) 2010-08-18
US20080319317A1 (en) 2008-12-25

Similar Documents

Publication Publication Date Title
WO2007114375A1 (fr) Dispositif de diagnostic a ultrasons et procede de commande
JP5002181B2 (ja) 超音波診断装置及び超音波診断装置制御方法
JP5395371B2 (ja) 超音波診断装置、超音波画像の取得方法及びプログラム
JP4921826B2 (ja) 超音波診断装置及びその制御方法
JP6222811B2 (ja) 超音波診断装置及び画像処理装置
JP5284123B2 (ja) 超音波診断装置および位置情報取得プログラム
JP5165858B2 (ja) 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム
JP6288996B2 (ja) 超音波診断装置及び超音波イメージングプログラム
WO2014081006A1 (fr) Dispositif de diagnostic échographique, dispositif de traitement d&#39;image, et procédé de traitement d&#39;image
EP1715360B1 (fr) Appareil de diagnostic à ultrasons et programme de traitement d&#39;images à ultrasons
JP5417048B2 (ja) 超音波診断装置、及び超音波診断プログラム
EP2253275A1 (fr) Appareil de diagnostic à ultrasons, appareil de traitement d&#39;images à ultrasons et procédé de traitement d&#39;images à ultrasons
JP5259175B2 (ja) 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム
JP4764209B2 (ja) 超音波信号解析装置、超音波信号解析方法、超音波解析プログラム、超音波診断装置、及び超音波診断装置の制御方法
JP2006314689A (ja) 超音波診断装置及び超音波診断装置制御プログラム
JP2011045708A (ja) 超音波診断装置、超音波画像処理装置、超音波診断装置制御プログラム及び超音波画像処理プログラム
JP2012176232A (ja) 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム
JP5606025B2 (ja) 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム
JP5366372B2 (ja) 超音波診断装置及び超音波画像データ生成プログラム
JP2012245092A (ja) 超音波診断装置
JP5060141B2 (ja) 超音波診断装置
JP5738822B2 (ja) 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム
JP2024093190A (ja) 超音波診断装置及び超音波診断方法
JP2008220662A (ja) 超音波診断装置及びその制御プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07740655

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2007740655

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 200780004532.0

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE