WO2004100795A1 - Ultrasonic image generation method - Google Patents
Ultrasonic image generation method
- Publication number
- WO2004100795A1 PCT/JP2004/007123 JP2004007123W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- ultrasonic
- body cavity
- reference position
- dimensional tomographic
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
- A61B8/5276—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- the present invention relates to an ultrasonic image generating method for generating an ultrasonic image based on an ultrasonic echo obtained by transmitting and receiving ultrasonic waves to and from a subject.
- ultrasonic diagnostic apparatuses are widely used in the medical field and the industrial field.
- An ultrasonic diagnostic apparatus performs non-invasive diagnosis inside a subject by transmitting and receiving ultrasound to and from the subject.
- the conventional ultrasonic diagnostic apparatus described above has a method of generating a three-dimensional ultrasonic image constructed from two-dimensional tomographic images, such as radial images (cross sections perpendicular to the scanning axis of the ultrasonic transducer) and linear images (cross sections parallel to the scanning axis of the ultrasonic transducer), based on ultrasonic echo data obtained by transmitting and receiving ultrasonic waves to and from a subject.
- as such a conventional ultrasonic image generation method, for example, Japanese Patent Application Laid-Open No. 2000-2504123 proposes a device that generates a three-dimensional image of the body cavity surface so that the ultrasonic image can easily be grasped stereoscopically.
- a problem here is the “pulsation” of the human body.
- an ultrasonic diagnostic apparatus combines radial scanning (scanning in the plane perpendicular to the scanning axis of the ultrasonic transducer) and linear scanning (scanning along the scanning axis of the ultrasonic transducer) into a helical scan, and thereby obtains two-dimensional tomographic images, namely radial images and linear images.
- the above-mentioned helical scan takes, for example, about 2 to 4 seconds.
- because of the pulsation occurring during those 2 to 4 seconds, the position of a region of interest is generally not the same at the start and at the end of the scan.
- this “displacement” mainly causes a problem in the linear image; if it is too large, a clear image may not be obtained even when the linear image is constructed.
- the ultrasonic image generation method described in the above-mentioned Japanese Patent Application Laid-Open No. 2000-2504123 has the problem that, when an image of an organ near a blood vessel is made three-dimensional, the organ moves under the influence of the pulsation and the image is distorted.
- although the ultrasonic image generation method described in the above-mentioned Japanese Patent Application Laid-Open No. 2000-316686 obtains a certain effect with regard to pulsation removal, the removal is not yet sufficient; a more effective ultrasonic image generation method is therefore required.
- as a conventional ultrasonic image generation method, for example, Japanese Patent No. 3316628 proposes stabilizing the ultrasonic image by finding the center of gravity of the body cavity.
- however, the method described in the above-mentioned Japanese Patent No. 3316628 only has the effect of stabilizing a two-dimensional tomographic image on a monitor; it has no effect if applied directly, with the aim of eliminating the influence of pulsation, to an ultrasonic image with a three-dimensional spread (that is, a three-dimensional image).
- the present invention has been made in view of the above circumstances, and an object of the present invention is to provide an ultrasonic image generation method capable of removing the influence of pulsation and obtaining an ultrasonic image with good image quality.
Disclosure of the invention
- An ultrasonic image generating method for generating an ultrasonic image based on ultrasonic echo data obtained by transmitting and receiving ultrasonic waves to and from a subject comprises: a reference position setting step of obtaining a reference position on each of a plurality of continuous two-dimensional tomographic images; a correction step of correcting the irregularity of the reference positions of the two-dimensional tomographic images obtained in the reference position setting step to obtain regular continuous two-dimensional tomographic images; and an ultrasonic image generating step of generating an ultrasonic image based on the regular continuous two-dimensional tomographic images corrected in the correction step.
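Taken together, the three claimed steps can be sketched as follows. This is a minimal illustration, not the patented algorithm: the reference position is taken here as the centroid of bright pixels, and the correction is a whole-pixel shift that moves each centroid to the image centre (the embodiments below use body-cavity centroids and per-ray deformation instead).

```python
import numpy as np

def reference_position(img):
    """Reference position setting step: centroid of bright pixels
    (a stand-in for the body-cavity centre of gravity used later)."""
    ys, xs = np.nonzero(img > img.mean())
    return float(ys.mean()), float(xs.mean())

def align_to_reference(img, ref):
    """Correction step: shift the image so its reference position lands
    at the image centre, removing frame-to-frame irregularity.
    np.roll wraps at the borders; real code would pad instead."""
    cy, cx = (img.shape[0] - 1) / 2.0, (img.shape[1] - 1) / 2.0
    dy, dx = int(round(cy - ref[0])), int(round(cx - ref[1]))
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def generate_volume(radial_images):
    """Ultrasonic image generating step: stack the corrected 2-D tomograms."""
    corrected = [align_to_reference(im, reference_position(im))
                 for im in radial_images]
    return np.stack(corrected, axis=0)
```

With two frames whose bright regions sit at different positions, the stacked volume has both regions centred, which is the property the claim relies on when building the three-dimensional image.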
- FIG. 1 is an overall configuration diagram of an ultrasonic diagnostic system according to a first embodiment of the present invention.
- FIG. 2 is a diagram showing a specific example of a two-dimensional tomographic image (radial image) Gr.
- FIG. 3 is a schematic diagram showing an ultrasonic image (three-dimensional image) Gs.
- FIG. 4 is a flowchart showing an ultrasonic image generation method in the ultrasonic diagnostic system of the first embodiment.
- FIG. 5 is a diagram showing an image when surface coordinates are extracted to determine a surface extraction point.
- FIG. 6 is a diagram showing an image when a polygon connecting the surface extraction points is generated to determine the center of gravity.
- FIG. 7 is a diagram showing an image when the position of the body cavity surface is shifted even if two radial images depict the same body cavity.
- FIG. 8 is a diagram illustrating an image when the positions of the two radial images are adjusted to match the centers of gravity.
- FIG. 9 is a diagram showing an image when the body cavity surface becomes smooth as a result of the smoothing.
- FIG. 10 is a view showing an image when the surface positions of the body cavity surface do not match before and after smoothing.
- FIG. 11 is a diagram showing a specific example of an ultrasound image (three-dimensional image) Gs.
- FIG. 12 is a diagram showing that the radial image Gr is formed by several hundred pieces of line data.
- FIG. 13 is an overall configuration diagram of an ultrasonic diagnostic system according to the second embodiment of the present invention.
- FIG. 14 is a schematic diagram showing a sound ray when scanning the surface position of the body cavity.
- FIG. 15 is a histogram showing the frequency with respect to luminance created by sampling a predetermined number of all sound rays.
- FIG. 16 is a first schematic diagram when a target is determined to be a living tissue with respect to a scan line of a sound ray.
- FIG. 17 is a second schematic diagram when a target is determined to be a living tissue with respect to a scan line of a sound ray.
- FIG. 18 is a schematic diagram for explaining the correction processing of the erroneous extraction.
- FIG. 19 is a first explanatory diagram showing a specific example of obtaining a body cavity center (X, Y) as a reference position.
- FIG. 20 is a second explanatory diagram showing a specific example of obtaining the body cavity center (X, Y) as the reference position.
- FIG. 21 is a schematic diagram for explaining a process of smoothing the position of the surface of the body cavity using the images before and after.
- FIG. 22 is a schematic diagram when performing a distance correction process so that the surface positions before and after smoothing coincide.
- FIG. 23 is a flowchart illustrating an ultrasonic image generation method in the ultrasonic diagnostic system according to the second embodiment.
- FIG. 24 is a flowchart showing a subroutine of FIG. 23 for determining the surface extraction points.
- FIG. 25 is an example of a first ultrasonic image obtained conventionally.
- FIG. 26 is a first example of an ultrasonic image processed by the flowchart of FIG. 23.
- FIG. 27 is an example of a second ultrasonic image obtained conventionally.
- FIG. 28 is an example of a second ultrasonic image processed by the flowchart of FIG. 23.
BEST MODE FOR CARRYING OUT THE INVENTION
- an ultrasonic diagnostic system 1 mainly includes an ultrasonic probe 2 having an ultrasonic vibrator 2a described later, a main unit (ultrasonic image processing device) 3 that processes the ultrasonic echo signals received by the ultrasonic probe 2 to generate ultrasonic images, and a monitor 4 that receives the output image signal from the main unit 3 and displays real-time ultrasonic images.
- the ultrasonic probe 2 includes an elongated and flexible insertion portion 11 that can be inserted into a body cavity, and a drive portion 12 to which the proximal end side of the insertion portion 11 is detachably connected.
- the insertion portion 11 has a built-in ultrasonic vibrator 2a for transmitting and receiving ultrasonic waves at the distal end 11a.
- the ultrasonic vibrator 2a is attached to the tip of the flexible shaft 21.
- the ultrasonic probe 2 has the tip 11a covered with an acoustic cap that transmits ultrasonic waves.
- the area around the ultrasonic transducer 2a is filled with an ultrasonic propagation medium (not shown) for transmitting (propagating) ultrasonic waves.
- a signal line (not shown) extends from the ultrasonic transducer 2a and is connected to an ultrasonic observation unit 31, described later, in the apparatus body 3 via the drive unit 12.
- the ultrasonic probe 2 drives a first motor (not shown) incorporated in the drive unit 12 so that the ultrasonic transducer 2a is driven to rotate to perform radial scanning. Further, the ultrasonic probe 2 drives a second motor (not shown) built in the driving unit 12 so that the flexible shaft 21 can be moved in the axial direction of the insertion unit 11 (for example, Z in the longitudinal direction). By moving back and forth in the axial direction), the ultrasonic transducer 2a moves back and forth and can perform linear scanning.
- the ultrasonic probe 2 drives the first motor and the second motor of the drive unit 12 synchronously so that they rotate simultaneously, whereby the ultrasonic transducer 2a transmits and receives ultrasonic waves in a spiral manner, enabling helical scanning of a three-dimensional area in the subject.
- thereby, the device main body 3 can obtain a large number of two-dimensional tomographic images having slightly different coordinate positions in the Z-axis direction, and can construct an ultrasonic image (three-dimensional image) from these two-dimensional tomographic images.
- the drive unit 12 is connected to the apparatus main body 3 by a cable 12a.
- the apparatus main body 3 includes an ultrasonic observation unit 31 that transmits and receives an ultrasonic signal to and from the ultrasonic transducer 2a and obtains ultrasonic echo data in a three-dimensional area, a position detection unit 32 that obtains the position data of the insertion portion distal end 11a of the ultrasonic probe 2 in the body cavity (for example, in the stomach 20), and an image processing unit 33 having a CPU that obtains ultrasonic image data based on the data of the ultrasonic observation unit 31, identifies the position of each radial scan plane by associating the position data from the position detection unit 32 with the two-dimensional tomographic images, and generates an ultrasonic image (three-dimensional image) based on an ultrasonic image generating method and an ultrasonic image generating program described later.
- the image processing unit 33 is connected, via an interface (not shown), to a mouse 34, which is a screen operation means capable of interactively operating images as shown in FIGS. 25 and 26 described later.
- the mouse 34 is used to perform screen operations on the images displayed on the display screen of the monitor 4.
- the image processing unit 33 has the function of a control means for moving a selected operation target together with the movement of the mouse 34, based on the selection of the operation target of the screen operation by the mouse 34.
- the image processing section 33 may be connected to a large-capacity external storage device (not shown) for recording image data and the like via an interface (not shown).
- the image processing unit 33 performs helical scanning with the ultrasonic transducer 2a at a predetermined pitch, performs coordinate conversion on the ultrasonic echoes obtained by the ultrasonic observation unit 31, and generates a plurality of two-dimensional tomographic images (hereinafter referred to as radial images) Gr, as shown in FIG. 2, whose cross sections are almost perpendicular to the axial direction of the insertion portion 11 (that is, the Z-axis direction).
- These generated radial images Gr correspond to, for example, the ultrasonic tomographic plane 2OA in the stomach 20 shown in FIG.
- the image processing unit 33 associates the plurality of radial images Gr, obtained continuously at the predetermined pitch, with the position data of the distal end 11a of the ultrasonic probe 2 detected by the position detection unit 32, and generates a pseudo ultrasonic image (three-dimensional image) Gs as shown in FIG. 3.
- however, the organ moves due to the pulsation, and the ultrasonic image (three-dimensional image) Gs is distorted.
- the influence of the pulsation is removed by the ultrasonic image generation method and the ultrasonic image generation program described below.
- the ultrasonic image generating method and the ultrasonic image generating program will be described with reference to FIGS. 5 to 11, based on the flowchart shown in FIG. 4.
- first, the ultrasonic probe 2 rotates the first motor and the second motor of the drive unit 12 synchronously and simultaneously, so that the ultrasonic vibrator 2a performs helical scanning at a predetermined pitch.
- the apparatus main body 3 inputs the ultrasonic echo signals in the three-dimensional region received by the ultrasonic transducer 2a to the ultrasonic observation unit 31.
- the ultrasonic observation unit 31 receives the ultrasonic echo signals in the three-dimensional region from the ultrasonic transducer 2a, converts them into coordinates, and generates a plurality of continuous radial images Gr.
- the image processing unit 33 sequentially obtains the plurality of radial images Gr by sequentially inputting the image data of the radial images Gr from the ultrasonic observation unit 31 (step S1).
- the position detector 32 acquires the position data of the distal end portion 11a of the ultrasonic probe 2 in the body cavity, and sequentially outputs the acquired position data to the image processor 33.
- the image processing unit 33 identifies the position of each radial scan plane by associating the position data from the position detection unit 32 with the continuous radial images Gr from the ultrasonic observation unit 31.
- next, the image processing unit 33 performs a reference position setting step of obtaining a reference position on each image for the plurality of continuous radial images Gr; specifically, the following steps S2 and S3 are performed.
- the image processing unit 33 extracts the surface coordinates of each image from the plurality of continuous radial images Gr and determines the surface extraction points (step S2).
- specifically, the image processing unit 33 generates search lines 40a from the center of the image, extracts the portions where the luminance changes sharply as surface coordinates, and determines the surface extraction points 41.
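The search along one line can be sketched as follows; the thresholds `jump` and `start` are illustrative assumptions, since the text does not give numeric values for the luminance-change test.

```python
import numpy as np

def surface_point(line, jump=50, start=5):
    """Scan one radial search line outward from the image centre and return
    the index of the first large luminance jump (water -> tissue boundary).
    `jump` (minimum brightness step) and `start` (samples skipped near the
    probe) are illustrative values, not taken from the patent."""
    diffs = np.diff(line.astype(np.int32))  # avoid uint8 wrap-around
    for i in range(start, len(diffs)):
        if diffs[i] >= jump:
            return i + 1   # first sample on the tissue side of the jump
    return None            # no boundary found on this line
```

Running this on every search line 40a yields one surface extraction point 41 per line, which is the input the polygon of FIG. 6 is built from.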
- the image processing unit 33 then obtains the center of gravity as a reference position on each of the continuous radial images Gr (step S3).
- the image processing unit 33 generates, for example, a polygon 42 connecting the surface extraction points 41 as shown in FIG. 6, and obtains a center of gravity 43.
- here, the center of gravity 43 of the polygon is obtained as the physical (area) center of gravity.
- a geometric center of gravity may be used as the center of gravity 43.
- alternatively, the image processing unit 33 may approximate a circle (not shown) in which the polygon 42 is almost inscribed, find the center point of that circle, and use it as the center of gravity.
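A polygon centroid of this kind can be computed with the shoelace formula; the sketch below shows the physical (area) centroid, which generally differs from the plain mean of the vertices (the geometric centre mentioned as an alternative).

```python
def polygon_centroid(pts):
    """Area ('physical') centroid of a closed polygon given as (x, y)
    vertices in order, via the shoelace formula. Assumes a simple
    (non-self-intersecting) polygon with non-zero area."""
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]          # wrap to close the polygon
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5                                # signed polygon area
    return cx / (6.0 * a), cy / (6.0 * a)
```

For a convex body-cavity outline the two centroids are close, but the area centroid is less sensitive to unevenly spaced surface extraction points, which may be why the text names it as the primary choice.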
- next, the image processing unit 33 performs a correction step of correcting the irregularity of the centers of gravity 43 of the radial images Gr obtained in the reference position setting step (S2, S3) to obtain regular continuous radial images Gr; specifically, the following steps S4 to S7 are performed.
- the plurality of continuous radial images Gr are displaced within the images, for example as shown in FIG. 7, due to the influence of the pulsation or of mechanical shake of the probe itself.
- in FIG. 7, the positions of the body cavity surfaces 51B and 52B are shifted between the radial image Gr51 and the radial image Gr52, even though they depict the same body cavity.
- the image processing unit 33 adjusts the positions of the plurality of radial images Gr by moving them so that the centers of gravity determined in S3 coincide on each image (centroid alignment) (step S4).
- specifically, as shown in FIG. 8, the image processing unit 33 adjusts the image positions by moving the radial image Gr51 and the radial image Gr52 in the vertical and horizontal directions so that the centers of gravity 51A and 52A coincide.
- the image processing unit 33 may also fix the image position of a radial image Gr51 specified in advance by the user and move only the image position of the radial image Gr52 in the vertical and horizontal directions, or vice versa; as a result, the image processing unit 33 can reduce large mechanical “blurring”.
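The alignment of step S4 can be sketched as a translation of one image so that its centre of gravity coincides with that of a fixed image. `np.roll` is used only to keep the example short (it wraps at the borders, which a real implementation would avoid), and `bright_centroid` is a stand-in for the body-cavity centroid of S3.

```python
import numpy as np

def bright_centroid(img):
    """Stand-in reference position: centroid of above-average pixels."""
    ys, xs = np.nonzero(img > img.mean())
    return ys.mean(), xs.mean()

def match_centroids(img_fixed, img_moving, centroid):
    """Shift img_moving so its centre of gravity coincides with that of
    img_fixed (which stays put, as when the user pins Gr51).
    `centroid` is any function returning a (row, col) reference position."""
    fy, fx = centroid(img_fixed)
    my, mx = centroid(img_moving)
    dy, dx = int(round(fy - my)), int(round(fx - mx))
    # np.roll keeps the array size; real code would pad instead of wrapping.
    return np.roll(np.roll(img_moving, dy, axis=0), dx, axis=1)
```

After this step all frames share one reference position, so the remaining mismatch between frames is local surface deformation, which steps S5 to S7 handle.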
- next, the image processing unit 33 performs a process of smoothing the position of the body cavity surface by smoothing the surface extraction points of a given radial image Gr using the images before and after it (step S5).
- the image processing unit 33 obtains the number of images to be referred to for smoothing by formula (1); the formula itself is not reproduced in this copy, but from the examples below it corresponds to 4 divided by the pitch in millimeters.
- here, the pitch is the distance between adjacent two-dimensional tomographic images (radial images Gr).
- for example, the number of images referred to for smoothing is 16 at a 0.25 mm pitch, and 4 at a 1 mm pitch.
- the images to be referred to are determined by formula (2) (not reproduced in this copy), where n is an index running from 0 to the number of reference images minus 1.
- the image processing unit 33 smoothes the surface extraction points 41 of a predetermined radial image sandwiched between these images by using the preceding and following images, and smoothes the position of the body cavity surface.
- as a result, as shown in FIG. 9, in the image 55 formed from the radial image Gr51 and the radial image Gr52, the body cavity surface 55B becomes smooth.
- reference numeral 55A denotes the center of gravity of the image 55.
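As a sketch, the smoothing of step S5 can be modelled as a moving average of the per-image surface distances along one fixed sound-ray direction. Because formula (1) is not reproduced in this copy, the window size `4 / pitch` is inferred from the two quoted examples (16 images at 0.25 mm pitch, 4 at 1 mm pitch) and should be treated as an assumption.

```python
def smooth_surface(distances, pitch_mm):
    """Smooth per-image surface distances (one value per radial image along
    a fixed sound-ray direction) with a moving average over neighbouring
    images. Window size 4 / pitch reproduces the counts quoted in the text;
    the window is clipped at the ends of the image sequence."""
    n_ref = max(1, int(round(4.0 / pitch_mm)))  # e.g. 16 at 0.25 mm, 4 at 1 mm
    half = n_ref // 2
    out = []
    for i in range(len(distances)):
        lo, hi = max(0, i - half), min(len(distances), i + half + 1)
        out.append(sum(distances[lo:hi]) / (hi - lo))
    return out
```

A pulsation spike in one frame is spread over its neighbours rather than removed outright, which matches the figure: the body cavity surface becomes smooth but individual frames no longer match their own extracted surfaces, motivating steps S6 and S7.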
- in the radial image Gr, however, the surface position of the body cavity surface 61 before smoothing (the position of the extracted body cavity surface) and the surface position of the smoothed body cavity surface 62 may not match.
- reference numeral 63 denotes the difference between the surface position of the body cavity surface 61 before smoothing (the position of the extracted body cavity surface) and the surface position of the smoothed body cavity surface 62.
- in this state, the side section 71 cannot be generated from each radial image Gr in accordance with the smoothed body cavity surface.
- therefore, the image shown in FIG. 10 needs to be deformed so that the surface position of the body cavity surface 61 before smoothing (the position of the extracted body cavity surface) and the surface position of the smoothed body cavity surface 62 match.
- the image processing unit 33 therefore calculates the difference between the surface position of the body cavity surface before smoothing (the extracted body cavity surface) and the surface position of the smoothed body cavity surface, and expands or contracts the given radial image Gr based on this difference (steps S6 and S7).
- specifically, the image processing unit 33 calculates the difference 63 between the position of the body cavity surface 61 before smoothing (the position of the extracted body cavity surface) and the position of the smoothed body cavity surface 62, and expands or contracts the image based on the calculated difference 63.
- thereby, the image processing unit 33 can generate radial images Gr that match the smoothed body cavity surface.
- then, the image processing unit 33 performs an ultrasonic image generation step (step S8) of associating the continuous radial images Gr corrected in the correction steps (S4 to S7) with the position data detected by the position detection unit 32 and generating an ultrasonic image (three-dimensional image) Gs.
- as a result, the image processing section 33 can generate an ultrasonic image (three-dimensional image) Gs having a smooth body cavity surface 72, as shown in FIG. 11.
- note that the radial image Gr, as a raw image, is formed of ultrasonic echo data, for example as shown in the image 80 of FIG. 12, and is represented by the echo intensity of several hundred pieces of sound ray data 81.
- the image processing unit 33 uses a digital scan converter (DSC) to convert the sound ray data from the polar coordinate system generated by the radial scan into image data in a rectangular coordinate system.
- at that time, each piece of sound ray data is flexibly expanded or contracted so that the position of the extracted body cavity surface is aligned with the polygon surface (the position of the smoothed body cavity surface).
- that is, the image processing unit 33 performs the expansion/contraction processing on the sound ray distance obtained from the coordinate conversion table of the sound ray data, taking into account the difference between the distances before and after smoothing (corresponding to the surface position difference 63 of S6 described above).
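The per-ray expansion/contraction can be sketched as a piecewise-linear resampling of each sound ray that moves the extracted surface sample onto the smoothed surface position; the interpolation scheme below is an illustrative assumption, not the patent's coordinate conversion table.

```python
import numpy as np

def stretch_ray(ray, extracted_idx, smoothed_idx):
    """Expand or contract one sound ray so that the extracted body-cavity
    surface (sample index extracted_idx) lands on the smoothed surface
    position smoothed_idx; samples are resampled by linear interpolation.
    Assumes 0 < smoothed_idx < len(ray) - 1 and extracted_idx < len(ray) - 1."""
    n = len(ray)
    out_pos = np.arange(n, dtype=float)
    src = np.empty(n)
    # The output segment [0, smoothed_idx] is drawn from [0, extracted_idx]
    # of the original ray; the outer segment is scaled linearly as well.
    inner = out_pos <= smoothed_idx
    src[inner] = out_pos[inner] * (extracted_idx / smoothed_idx)
    outer = ~inner
    src[outer] = extracted_idx + (out_pos[outer] - smoothed_idx) * (
        (n - 1 - extracted_idx) / (n - 1 - smoothed_idx))
    return np.interp(src, np.arange(n), ray.astype(float))
```

Applying this to every sound ray before scan conversion yields a radial image whose surface coincides with the smoothed polygon surface, the property required for the linear (side) sections to look continuous.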
- the ultrasonic image generation method and the ultrasonic image generation program according to the first embodiment can be used not only during an examination but also when reproducing data stored after the ultrasonic examination, for example for image review when deciding a future treatment policy, or for volume measurement.
- the ultrasonic image generating method and the ultrasonic image generating program in the first embodiment are described with a configuration in which the position detecting section 32 obtains the position data of the insertion section distal end 11a of the ultrasonic probe 2, but the present invention is not limited to this.
- even when ultrasonic echo data of a three-dimensional area is acquired manually in a pseudo manner and displayed as a 3D ultrasonic image (an oblique surface construction for displaying a body cavity surface), applying the ultrasonic image generating method and the ultrasonic image generating program of the present invention is effective in improving the image quality.
- as described above, the ultrasonic image generation method and the ultrasonic image generation program according to the first embodiment can acquire high-quality two-dimensional tomographic images and ultrasonic images (three-dimensional images) without image distortion, unaffected by pulsation or by mechanical vibration of the probe itself.
- the first embodiment is configured to generate the polygon 42 connecting the surface extraction points 41 and obtain the center of gravity 43 as the reference position on each image for the plurality of continuous radial images Gr.
- in contrast, the second embodiment converts the body cavity surface from polar coordinates to rectangular coordinates and finds the body cavity center, as the reference position on each image, based on the four points extracted at the 3, 6, 9, and 12 o'clock positions on the rectangular coordinates.
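One plausible reading of this four-point rule is to take the midpoint of the 3 and 9 o'clock x-coordinates and of the 6 and 12 o'clock y-coordinates; the exact combination used by the patent is not spelled out in this copy, so the helper below is an assumption (it also assumes x increases toward 3 o'clock and y toward 12 o'clock).

```python
def cavity_center(p3, p6, p9, p12):
    """Body-cavity centre from the four surface points extracted at the
    3, 6, 9 and 12 o'clock directions, given as (x, y) in rectangular
    coordinates. Midpoints of the horizontal and vertical pairs are one
    plausible reading of the text, not the patent's stated formula."""
    x = (p3[0] + p9[0]) / 2.0   # 3 and 9 o'clock bound the cavity horizontally
    y = (p6[1] + p12[1]) / 2.0  # 6 and 12 o'clock bound it vertically
    return x, y
```

Compared with the polygon centroid of the first embodiment, this needs only four surface extractions per frame, which fits the second embodiment's emphasis on fast per-sound-ray processing.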
- the rest of the configuration is the same as in the first embodiment, and a description thereof will be omitted; the same components are denoted by the same reference numerals.
- as shown in FIG. 13, the ultrasonic diagnostic system 1B of the second embodiment mainly includes an ultrasonic probe 2B having the ultrasonic vibrator 2a built therein, a main unit (ultrasonic image processing device) 3B that processes the ultrasonic echo signals received by the ultrasonic probe 2B to generate an ultrasonic image, and the monitor 4 that receives the output image signal from the main unit 3B and displays ultrasonic images in real time.
- the ultrasonic probe 2B comprises an elongated and flexible insertion portion 11 that can be inserted into a body cavity, and a drive portion 12 to which the proximal end side of the insertion portion 11 is detachably connected.
- the insertion section 11 has a built-in ultrasonic vibrator 2a for transmitting and receiving ultrasonic waves at the distal end 11a.
- the ultrasonic transducer 2a is attached to the tip of the flexible shaft 21.
- the ultrasonic probe 2B has a tip 11a covered with an acoustic cap that transmits ultrasonic waves.
- the area around the ultrasonic transducer 2a is filled with an ultrasonic wave propagation medium (not shown) for transmitting (propagating) ultrasonic waves.
- from the ultrasonic transducer 2a, a signal line (not shown) extends and is connected to the ultrasonic observation unit 31 in the apparatus main body 3B via the drive unit 12.
- the ultrasonic probe 2B drives the first motor (not shown) built in the drive unit 12 so that the ultrasonic transducer 2a is rotationally driven to perform radial scanning; further, the ultrasonic probe 2B drives the second motor (not shown) built in the drive unit 12 to move the flexible shaft 21 back and forth in the axial direction (longitudinal direction, for example the Z-axis direction) of the insertion portion 11, so that the ultrasonic transducer 2a moves back and forth and can perform linear scanning.
- the ultrasonic probe 2B synchronously drives the first motor and the second motor of the drive unit 12 to rotate simultaneously, whereby the ultrasonic transducer 2a transmits and receives ultrasonic waves in a spiral manner.
- This allows helical scanning of a three-dimensional area in the subject.
- thereby, the apparatus main body 3B can obtain a large number of two-dimensional tomographic images whose coordinate positions in the Z-axis direction differ slightly from each other, and can construct an ultrasonic image (three-dimensional image) from these two-dimensional tomographic images.
- the drive section 12 is connected to the apparatus main body 3B by the cable 12a.
- the apparatus main body 3B includes an ultrasonic observation section 31 that transmits and receives an ultrasonic signal to and from the ultrasonic transducer 2a and obtains ultrasonic echo data in a three-dimensional area, and an image processing section 33B having a CPU that obtains ultrasonic image data based on the obtained ultrasonic echo data and generates an ultrasonic image (three-dimensional image) based on an ultrasonic image generating method and an ultrasonic image generating program described later.
- the image processing unit 33B is connected, via an interface (not shown), to the mouse 34, which is a screen operation means capable of interactively operating images as shown in FIGS. 25 and 26 described later.
- the mouse 34 is used to perform screen operations, via a mouse cursor, on the image information displayed on the display screen of the monitor 4.
- the image processing unit 33B has the function of a control unit that moves a selected operation target together with the movement of the mouse 34, based on the selection of the operation target of the screen operation using the mouse 34; note that a large-capacity external storage device (not shown) for recording image data and the like may be connected to the image processing section 33B via an interface (not shown).
- the image processing unit 33B performs helical scanning with the ultrasonic transducer 2a at a predetermined pitch, performs coordinate transformation of the ultrasonic echo data obtained by the ultrasonic observation unit 31, and generates a plurality of continuous radial images Gr.
- These generated radial images Gr correspond to, for example, the ultrasonic tomographic plane 2OA in the stomach 20 shown in FIG.
- the image processing unit 33B generates a pseudo three-dimensional ultrasonic image Gs, for example as shown in the upper right of FIG. 25, from a plurality of radial images Gr continuously obtained in predetermined pitch units.
- the images displayed above the radial image Gr and below the ultrasonic image (three-dimensional image) Gs are linear images, generated based on the radial images Gr, of a cross section substantially parallel to the axial direction of the insertion section 11 (that is, the Z-axis direction).
- the ultrasound image (three-dimensional image) Gs is distorted as shown in Fig. 25.
- the influence of the pulsation is removed by the ultrasonic image generation method and the ultrasonic image generation program described below.
- the surface extraction process is a process for discriminating water such as an ultrasonic transmission medium or a body fluid from a living tissue.
- the radial image Gr is composed of, for example, several hundred lines of acoustic data called sound rays, and is represented by the echo luminance of these sound rays.
- each sound ray is searched outward from the probe center (ultrasonic transducer) in the peripheral direction for the point at which the data changes from water to living tissue. The following processing is used to improve the accuracy of this search.
- the original gradations 0 (low reflection) to 255 (strong reflection) are compressed to gradations 0 (low reflection) to 15 (strong reflection) for smoothing.
- the peak is determined from the differential value (slope) and the absolute value (height) of each point on the graph. Normally, the peaks on the histogram have the following two patterns.
- the threshold value is an intermediate value between these two peaks.
- when the threshold value exceeds 80 in the original gradations 0 to 255 (approximately 5 in the smoothed gradations 0 to 15), good discrimination between water and living tissue often cannot be made. Therefore, the upper limit of the threshold is set to 80; if the computed threshold exceeds this value, it is rounded down to 80.
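As an illustration, the thresholding just described can be sketched as follows. This is a hypothetical reconstruction, not the patented implementation: the bin layout, the peak-picking rule, and the fallback are assumptions; only the 0–255 to 0–15 compression, the midpoint rule, and the cap of 80 come from the text.

```python
def water_tissue_threshold(pixels, cap=80):
    """Estimate the water/tissue threshold from an echo-luminance histogram."""
    # Compress the original gradations 0-255 into 16 bins (0-15) to smooth.
    hist = [0] * 16
    for p in pixels:
        hist[p // 16] += 1
    # Peaks: sign change of the slope (differential value), ranked by height.
    peaks = [i for i in range(1, 15)
             if hist[i] >= hist[i - 1] and hist[i] > hist[i + 1]]
    if hist[0] > hist[1]:
        peaks.insert(0, 0)
    if hist[15] > hist[14]:
        peaks.append(15)
    peaks = sorted(peaks, key=lambda i: hist[i], reverse=True)[:2]
    if len(peaks) < 2:
        return cap  # cannot separate the two populations; fall back
    # Threshold midway between the two peaks, mapped back to 0-255.
    t = int((min(peaks) + max(peaks)) / 2 * 16)
    return min(t, cap)  # above 80 the water/tissue judgment degrades
```

With a water population near gradation 8 and a tissue population near 100, the midpoint falls at 48; with tissue near 230 the raw midpoint would exceed 80 and is rounded down to 80.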
- the most common method of removing such noise is frame correlation; however, in order to avoid smoothing the image itself as much as possible, the present embodiment instead measures and judges the thickness of the object.
- the noise that is actually erroneously recognized has an object thickness of 0.2 mm or less.
- a certain thickness is set in advance, and if the target object is equal to or thicker than the set value, it is determined to be living tissue. If the set value is too small, erroneous extraction increases; if it is too large, the layer structure of the living tissue is determined to be noise.
- therefore, when the thickness of the target is about 0.5 mm, as shown in FIG. 16, the target is determined to be living tissue.
- the above method is effective for very small noise. However, it is ineffective for noise in a broad sense, that is, noise that actually has a certain thickness, such as suspended matter in a body cavity.
- a luminance average over a certain thickness of the object is obtained, and if the obtained luminance average is equal to or larger than the threshold value, the object is determined to be living tissue.
- the thickness of the object over which the average brightness is determined is about 2 mm; when the average brightness over this thickness of about 2 mm is equal to or greater than the threshold, the object is determined to be living tissue.
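The surface search along one sound ray, combined with the two noise rejections above (minimum thickness and the 2 mm brightness average), might look like the following sketch. The sampling pitch `pitch_mm` and the function name are assumptions not given in the text.

```python
def find_surface_index(ray, threshold, pitch_mm=0.1,
                       min_thickness_mm=0.5, avg_span_mm=2.0):
    """Search one sound ray outward from the probe center for the first
    sample judged to be living tissue (the water-to-tissue transition)."""
    need = round(min_thickness_mm / pitch_mm)  # samples for ~0.5 mm
    span = round(avg_span_mm / pitch_mm)       # samples for ~2 mm
    for i in range(len(ray)):
        if ray[i] < threshold:
            continue                            # still water
        # Thickness check: the threshold must hold for ~0.5 mm,
        # which rejects speckle noise up to 0.2 mm thick.
        run = ray[i:i + need]
        if len(run) < need or any(v < threshold for v in run):
            continue
        # Brightness-average check over ~2 mm, which rejects thicker
        # suspended matter that is bright only near its surface.
        window = ray[i:i + span]
        if sum(window) / len(window) >= threshold:
            return i
    return None                                 # no surface on this ray
```

A thin bright blip fails the thickness check, and a thick but mostly dim floater fails the brightness-average check; only a sustained bright run is accepted as the body cavity surface.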
- next, the distance from the probe center (ultrasonic transducer) to the extracted point is compared with the corresponding distances of the sound rays before and after it, and correction processing for erroneous extraction is performed.
- the distance difference is calculated for the four points (two sound rays before and two after), and the validity of the target point is evaluated from the tendency.
- let Lt be the actual extraction point of sound ray N.
- suppose that, due to erroneous extraction, sound ray N determines the body cavity surface in front of the actual body cavity surface.
- the expected extraction point Lx of the sound ray N is calculated.
- L1: length of sound ray N−2
- L2: length of sound ray N−1
- L3: length of sound ray N+1
- L4: length of sound ray N+2
- the average of the above D12 and D34 indicates the tendency of the distance from the probe center (ultrasonic transducer) to the body cavity surface to increase or decrease.
- the value obtained by adding this tendency to the average surface distance is the expected extraction point of the target point.
- the mis-extraction point lies either before or after the body cavity surface position. The expected extraction point Lx is therefore compared with the actual extraction point Lt, and if the actual extraction point Lt is more than 3 mm away from the calculated expected extraction point Lx, the actual extraction point Lt is replaced with the expected extraction point Lx, thereby correcting the erroneous extraction. Although not shown, the correction processing of the erroneous extraction may also calculate the expected extraction point Lx of sound ray N based on the six sound rays N−3, N−2, N−1, N+1, N+2, and N+3.
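The correction can be sketched as follows. The exact formula for Lx is not reproduced in the text, so combining the average neighbour distance (L2+L3)/2 with the averaged tendency (D12+D34)/2, and the sign convention for D34, are interpretations; the 3 mm tolerance is from the text.

```python
def correct_extraction(lengths, n, tol_mm=3.0):
    """Correct an erroneously extracted surface point on sound ray N."""
    l1, l2 = lengths[n - 2], lengths[n - 1]  # L1, L2 in the text
    l3, l4 = lengths[n + 1], lengths[n + 2]  # L3, L4 in the text
    d12 = l2 - l1   # tendency approaching N from the near side
    d34 = l3 - l4   # tendency approaching N from the far side (assumed sign)
    # Expected extraction point: average neighbour distance plus tendency.
    lx = (l2 + l3) / 2 + (d12 + d34) / 2
    lt = lengths[n]  # actual extraction point Lt
    # Replace Lt with Lx only when it deviates by more than 3 mm.
    return lx if abs(lt - lx) > tol_mm else lt
```

For surface distances rising linearly through the neighbours, an outlier at ray N is pulled back onto the trend, while an in-trend point is left untouched.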
- the extracted body cavity surface is converted from polar coordinates (sound ray number and distance) to rectangular coordinates, and the body cavity center is determined from the surface distances (the distances from the probe center (ultrasonic transducer) to the extraction points) of the sound rays located clockwise at the 3, 6, 9, and 12 o'clock positions on the rectangular coordinates. Specifically, the center of the body cavity (X, Y) is
- the center of the body cavity (X, Y) is the position of the × mark, which is the center of the substantially circular shape.
- the center of the body cavity (X, Y) is the position of the × mark, which is the center of the substantially rhombic shape.
- the calculated body cavity center (X, Y) may exist outside the living tissue.
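The body cavity center formula itself is elided above; a natural reading, used here purely as an assumption, is that X is taken midway between the 3 and 9 o'clock surface points and Y midway between the 12 and 6 o'clock points.

```python
def body_cavity_center(surface_xy):
    """Body cavity center (X, Y) from four clock-position surface points.

    surface_xy maps clock position (3, 6, 9, 12) to the (x, y) surface
    point of the corresponding sound ray in rectangular coordinates.
    """
    x = (surface_xy[3][0] + surface_xy[9][0]) / 2   # midway 3 and 9 o'clock
    y = (surface_xy[12][1] + surface_xy[6][1]) / 2  # midway 12 and 6 o'clock
    return x, y
```

For a circular cavity of radius 5 centred at (2, 1), the four clock-position points recover the centre exactly; for irregular (rhombic) cavities the result is the centre of the bounding cross, which is why it can fall outside the tissue.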
- the positions of these radial images Gr are adjusted by moving each of the plurality of radial images Gr in the same manner as described in the first embodiment, so that the body cavity centers (X, Y) coincide (the body cavity centers are aligned).
- the number of radial images Gr (data) referred to for smoothing corresponds to 2 mm in the Z-axis direction (longitudinal direction): 8 planes at a 0.25 mm pitch. In the case of a 0.5 mm pitch, the surface position of the body cavity surface is smoothed by averaging 4 surfaces.
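The longitudinal smoothing can be sketched as a moving average over the planes inside the 2 mm window (8 planes at 0.25 mm pitch, 4 planes at 0.5 mm pitch). The window alignment around each plane is an assumption.

```python
def smooth_surface_z(surfaces, pitch_mm, window_mm=2.0):
    """Average one sound ray's surface distance over the radial images
    inside a 2 mm window along the Z axis."""
    n = max(1, round(window_mm / pitch_mm))  # planes in the window
    half = n // 2
    out = []
    for i in range(len(surfaces)):
        lo, hi = max(0, i - half), min(len(surfaces), i + half)
        window = surfaces[lo:hi] or [surfaces[i]]  # guard tiny windows
        out.append(sum(window) / len(window))
    return out
```

A single pulsation spike in the surface distance is spread across the window rather than left as a jagged step, which is exactly what makes the later stretch correction small and local.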
- in the same manner as described in the first embodiment, the difference between the surface position of the body cavity surface before smoothing (the extracted body cavity surface) and the surface position of the smoothed body cavity surface is calculated, and processing for expanding and contracting a predetermined radial image Gr is performed based on the calculated difference.
- the sound ray data is converted into a radial image, and distance correction processing is performed so that the surface position of the body cavity surface before smoothing (surface coordinates before smoothing) matches the surface position of the smoothed body cavity surface (surface coordinates after smoothing).
- the distance correction processing is performed according to the procedure described below.
- first, the distance D between the surface position of the body cavity surface before smoothing (surface coordinates before smoothing) and the surface position of the smoothed body cavity surface (surface coordinates after smoothing) is calculated.
- this distance D is determined from the intersection R between the n-th sound ray and the surface position of the body cavity surface before smoothing (surface coordinates before smoothing), and the intersection Q between the n-th sound ray and the surface position of the smoothed body cavity surface (surface coordinates after smoothing).
- taking the probe center (ultrasonic transducer) as the origin 0, and letting P1P2 be the line segment of the smoothed surface position (surface coordinates after smoothing) that contains the intersection point Q with the straight line 0Pn, the intersection Q (x, y) can be obtained from the following equation.
- the difference D can then be calculated from the intersection Q and the intersection R.
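The intersection Q of the sound-ray line through Pn with the surface segment P1P2 (the equation itself is not reproduced above) amounts to an ordinary line-line intersection; this sketch solves it with Cramer's rule.

```python
def intersect_ray_segment(pn, p1, p2):
    """Intersection Q of the sound-ray line through the origin 0 and Pn
    with the smoothed-surface segment P1P2 (Cramer's rule)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    denom = pn[0] * dy - pn[1] * dx      # cross product of Pn and P2-P1
    if denom == 0:
        return None                      # ray parallel to the segment
    t = (p1[0] * dy - p1[1] * dx) / denom
    return (t * pn[0], t * pn[1])        # Q = t * Pn along the sound ray
```

The distance D then follows as the Euclidean distance between Q and the before-smoothing intersection R on the same sound ray; since both lie on the ray through the origin, D is simply the difference of their distances from the probe center.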
- the sound ray data point that determines an arbitrary point P (x, y) on the radial image is obtained by the following formula.
- L: the distance from the origin 0 to the body cavity surface
- D: the distance difference between the surface position of the body cavity surface before smoothing and the surface position of the smoothed body cavity surface
- by this, the surface position of the body cavity surface before smoothing (surface coordinates before smoothing) can be made to match the surface position of the smoothed body cavity surface (surface coordinates after smoothing). That is, processing for expanding and contracting a predetermined radial image Gr can be performed.
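The expand/contract step can be sketched as a linear resampling of each sound ray so that the surface originally at distance L lands at L + D. The formula over L and D is not reproduced above, so this uniform scaling is an assumption.

```python
def stretch_ray(samples, l_before, l_after):
    """Resample one sound ray so that the surface originally at distance
    l_before (surface before smoothing) lands at l_after (smoothed surface)."""
    scale = l_before / l_after           # inverse of the stretch factor
    out = []
    for i in range(len(samples)):
        src = i * scale                  # where output sample i comes from
        j = int(src)
        if j + 1 >= len(samples):
            out.append(samples[-1])      # clamp beyond the last sample
        else:
            frac = src - j               # linear interpolation weight
            out.append(samples[j] * (1 - frac) + samples[j + 1] * frac)
    return out
```

Doubling the surface distance (l_before=2, l_after=4) moves the sample originally at index 2 out to index 4; with equal distances the ray passes through unchanged.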
- in the ultrasonic probe 2B, the first motor and the second motor of the drive unit 12 are synchronized and simultaneously rotationally driven, so that the ultrasonic transducer 2a performs a helical scan in predetermined pitch units.
- the ultrasonic echo signal in the three-dimensional region received by the ultrasonic transducer 2a is input to the ultrasonic observation unit 31.
- the ultrasonic observation unit 31 receives the ultrasonic echo signals in the three-dimensional region from the ultrasonic transducer 2a, performs coordinate conversion on them, and generates a plurality of continuous radial images Gr.
- the image processing unit 33B sequentially obtains the image data of the radial images Gr from the ultrasonic observation unit 31, thereby acquiring a plurality of continuous radial images Gr (step S11).
- the image processing unit 33B discriminates water, such as an ultrasonic transmission medium or a body fluid, from living tissue by the surface extraction processing described with reference to FIGS. 15 to 17, and generates the radial images Gr.
- the image processing unit 33B performs the following steps S12 and S13 as a reference position setting step for obtaining a reference position on each image of the plurality of continuous radial images Gr.
- the image processing unit 33B extracts the surface coordinates of each image from the plurality of continuous radial images Gr, and determines a surface extraction point (step S12).
- the image processing unit 33B determines the surface extraction points by the erroneous extraction correction processing described in FIG. 18 based on the flow chart shown in FIG.
- first, the two sound rays before and the two after the target sound ray N (N−2, N−1, N+1, N+2) are specified (step S21).
- next, the difference D12 between sound ray N−2 and sound ray N−1 and the difference D34 between sound ray N+1 and sound ray N+2 are calculated (step S22), and the expected extraction point Lx of the target sound ray N is calculated (step S23).
- then, if the actual extraction point is more than 3 mm away from the expected extraction point Lx, Lx is substituted for the length of sound ray N, and the surface extraction point is determined (step S24).
- the image processing unit 33B can extract the surface coordinates of each of the plurality of continuous radial images Gr.
- the image processing unit 33B calculates the center of the body cavity as a reference position on each image from the body cavity surface coordinates obtained by the above-described erroneous extraction correction process for a plurality of continuous radial images Gr (step S13).
- the image processing unit 33B determines the reference position by the body cavity center calculation process described with reference to FIG. 19 and FIG.
- as described above, the image processing unit 33B calculates the body cavity center from the surface distances (the distances from the probe center (ultrasonic transducer) to the extraction points) of the sound rays arranged clockwise at the 3, 6, 9, and 12 o'clock positions when the polar coordinates (sound ray number and distance) are converted into rectangular coordinates. Since the body cavity center is calculated from only the four points at the 3, 6, 9, and 12 o'clock positions, the calculation process is far simpler than generating a polygon connecting the surface extraction points and finding its center of gravity as the reference position, as in the first embodiment. Accordingly, the calculation time is short, the reference position can be calculated quickly, and a speed-up over the first embodiment is achieved.
- the image processing unit 33B moves the plurality of radial images Gr and adjusts the positions of these images in the same manner as described in the first embodiment, so that, for each image, the body cavity center (X, Y) obtained by the above processing is made to coincide (the body cavity centers are aligned).
- the image processing unit 33 B performs smoothing in the Z-axis direction (longitudinal direction) on the surface extraction point of a predetermined radial image Gr sandwiched between these images using the preceding and following images. Then, a process for smoothing the position of the body cavity surface is performed (step S15).
- after the smoothing, in the same manner as described in the first embodiment, the image processing unit 33B calculates the difference between the surface position of the body cavity surface before smoothing (the position of the extracted body cavity surface) and the surface position of the smoothed body cavity surface, and performs a process of expanding and contracting a predetermined radial image Gr based on the calculated difference.
- the image processing unit 33B performs distance correction processing according to the procedure described earlier, so that the surface position of the body cavity surface before smoothing (surface coordinates before smoothing) matches the surface position of the smoothed body cavity surface (surface coordinates after smoothing).
- using the radial images Gr subjected to the distance correction processing, it is possible to construct a linear image with reduced pulsation.
- in the radial image Gr and the linear image shown in FIGS. 26 and 28, unlike those shown in FIGS. 25 and 27, the influence of pulsation (the jagged portion of the body cavity surface) disappears, and the body cavity surface becomes clear.
- the image processing unit 33B obtains the continuous radial images Gr corrected in the correction steps (S14 to S17), and performs an ultrasonic image generation step (step S18) for generating an ultrasonic image (three-dimensional image) Gs based on these continuous radial images Gr.
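The overall flow S11 to S18 can be summarized as a pipeline; the per-step callables below are placeholders for the processing described in the text, not an API defined by the patent.

```python
def generate_volume(images, steps):
    """Pipeline sketch of the flow S11-S18 over continuous radial images Gr."""
    surfaces = [steps['extract'](im) for im in images]   # S12: surface points
    centers = [steps['center'](s) for s in surfaces]     # S13: cavity centers
    aligned = [steps['align'](im, c)                     # S14: align centers
               for im, c in zip(images, centers)]
    smoothed = steps['smooth'](surfaces)                 # S15: Z-axis smoothing
    corrected = [steps['stretch'](im, s0, s1)            # S16-S17: expand/contract
                 for im, s0, s1 in zip(aligned, surfaces, smoothed)]
    return steps['render'](corrected)                    # S18: build 3-D Gs
```

Structuring the flow this way makes each correction independently replaceable, which mirrors how the two embodiments in the patent differ only in the reference-position step while sharing the rest of the chain.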
- thereby, the image processing unit 33B generates the ultrasonic image (three-dimensional image) Gs shown in FIGS. 26 and 28, whose body cavity surface is smoother than that of the ultrasonic image (three-dimensional image) Gs shown in FIGS. 25 and 27.
- FIGS. 25 and 27 are examples of conventionally obtained ultrasonic images
- FIGS. 26 and 28 are examples of ultrasonic images processed by the flowchart of FIG.
- FIGS. 27 and 28 are images obtained by removing the multiple echo portions from FIGS. 25 and 26.
- needless to say, the ultrasonic image generation method and the ultrasonic image generation program of the second embodiment can be used not only during an examination but also, as in the first embodiment, when reproducing data stored after the ultrasonic examination, for image review for treatment planning and for volume measurement.
- the ultrasonic image generating method and the ultrasonic image generating program according to the second embodiment are less susceptible to the influence of pulsation than the first embodiment, and can acquire two-dimensional tomographic images and ultrasonic images (three-dimensional images) of high image quality without image distortion.
- the present invention is not limited to the above-described embodiments, and various modifications can be made without departing from the spirit of the invention. Embodiments configured by partially combining the above-described embodiments also belong to the present invention.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04733985.8A EP1625829B8 (en) | 2003-05-19 | 2004-05-19 | Ultrasonic image generating method |
US11/280,751 US8162835B2 (en) | 2003-05-19 | 2005-11-16 | Ultrasonic image generating method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-140823 | 2003-05-19 | ||
JP2003140823 | 2003-05-19 | ||
JP2003387281A JP4468677B2 (ja) | 2003-05-19 | 2003-11-17 | 超音波画像生成方法及び超音波画像生成プログラム |
JP2003-387281 | 2003-11-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/280,751 Continuation US8162835B2 (en) | 2003-05-19 | 2005-11-16 | Ultrasonic image generating method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004100795A1 true WO2004100795A1 (ja) | 2004-11-25 |
Family
ID=33455513
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/007123 WO2004100795A1 (ja) | 2003-05-19 | 2004-05-19 | 超音波画像生成方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8162835B2 (ja) |
EP (1) | EP1625829B8 (ja) |
JP (1) | JP4468677B2 (ja) |
WO (1) | WO2004100795A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06285066A (ja) * | 1993-04-01 | 1994-10-11 | Toshiba Corp | 超音波診断装置 |
JPH06285065A (ja) * | 1993-03-31 | 1994-10-11 | Toshiba Corp | 超音波ドプラ診断装置 |
JPH08332187A (ja) * | 1995-06-08 | 1996-12-17 | Aloka Co Ltd | 超音波画像処理装置 |
JPH10192A (ja) * | 1996-04-15 | 1998-01-06 | Olympus Optical Co Ltd | 超音波画像診断装置 |
JPH10248844A (ja) * | 1997-03-12 | 1998-09-22 | Olympus Optical Co Ltd | 超音波画像診断装置 |
JP2000316864A (ja) * | 1999-05-11 | 2000-11-21 | Olympus Optical Co Ltd | 超音波診断装置 |
JP2002204790A (ja) * | 2000-11-04 | 2002-07-23 | Koninkl Philips Electronics Nv | 画像を重ね合わす方法及び装置 |
JP2002526226A (ja) * | 1998-10-01 | 2002-08-20 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 空間的に合成された超音波診断画像のブレを補正する方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5622174A (en) * | 1992-10-02 | 1997-04-22 | Kabushiki Kaisha Toshiba | Ultrasonic diagnosis apparatus and image displaying system |
JP2785636B2 (ja) * | 1993-02-25 | 1998-08-13 | 株式会社エス.エス.ビー | 生体組織多次元可視装置 |
JP3413379B2 (ja) | 1993-08-05 | 2003-06-03 | オリンパス光学工業株式会社 | 超音波画像診断装置 |
JP3316268B2 (ja) | 1993-08-30 | 2002-08-19 | アロカ株式会社 | 管腔内超音波診断装置 |
US5830145A (en) * | 1996-09-20 | 1998-11-03 | Cardiovascular Imaging Systems, Inc. | Enhanced accuracy of three-dimensional intraluminal ultrasound (ILUS) image reconstruction |
US6106466A (en) * | 1997-04-24 | 2000-08-22 | University Of Washington | Automated delineation of heart contours from images using reconstruction-based modeling |
US6095976A (en) * | 1997-06-19 | 2000-08-01 | Medinol Ltd. | Method for enhancing an image derived from reflected ultrasound signals produced by an ultrasound transmitter and detector inserted in a bodily lumen |
US5885218A (en) * | 1997-11-07 | 1999-03-23 | Scimed Life Systems, Inc. | Method and apparatus for spatial filtering in an intravascular ultrasound imaging system |
US6210328B1 (en) * | 1998-10-01 | 2001-04-03 | Atl Ultrasound | Ultrasonic diagnostic imaging system with variable spatial compounding |
JP2000316865A (ja) | 1999-05-12 | 2000-11-21 | Olympus Optical Co Ltd | 超音波画像診断装置 |
2003
- 2003-11-17 JP JP2003387281A patent/JP4468677B2/ja not_active Expired - Fee Related
2004
- 2004-05-19 WO PCT/JP2004/007123 patent/WO2004100795A1/ja active Application Filing
- 2004-05-19 EP EP04733985.8A patent/EP1625829B8/en not_active Expired - Lifetime
2005
- 2005-11-16 US US11/280,751 patent/US8162835B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
US20060079772A1 (en) | 2006-04-13 |
JP2005000638A (ja) | 2005-01-06 |
JP4468677B2 (ja) | 2010-05-26 |
EP1625829B1 (en) | 2019-02-27 |
US8162835B2 (en) | 2012-04-24 |
EP1625829B8 (en) | 2019-04-17 |
EP1625829A1 (en) | 2006-02-15 |
EP1625829A4 (en) | 2010-08-04 |