WO2007135884A1 - Ultrasonic diagnostic apparatus and ultrasonic diagnostic method - Google Patents
Ultrasonic diagnostic apparatus and ultrasonic diagnostic method
- Publication number
- WO2007135884A1 (PCT application PCT/JP2007/059848)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0883—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/486—Diagnostic techniques involving arbitrary m-mode
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30048—Heart; Cardiac
Definitions
- The present invention relates to an ultrasonic diagnostic apparatus and an ultrasonic diagnostic method, and more particularly to an ultrasonic diagnostic apparatus and an ultrasonic diagnostic method that can accurately depict a boundary (contour) of a target tissue such as an organ.
- Boundary (contour) information of an organ of interest or the like is useful diagnostic information. For example, drawing the boundary (contour) of the left ventricle, obtaining the area of the region surrounded by that boundary, or estimating the volume of the left ventricle from the boundary is considered useful for diagnosis.
- Patent Document 1: JP-A-8-206117.
- speckle noise is mixed in the ultrasonic image.
- This speckle noise is believed to arise when waves scattered by groups of reflectors in living tissue, which are sufficiently small compared with the ultrasonic wavelength, interfere with one another in various phases. (For a known technique concerning speckle noise, see Patent Document 2.)
- Patent Document 2: JP-A-9-94248.
- In the conventional method, the boundary (contour) of the organ is defined by the gradient, so the blur caused by speckle noise is not taken into account. As a result, the extracted boundary is shifted by the blur width from the high-pixel-value region toward the low-pixel-value region. For example, when extracting the boundary (contour) of the left ventricle of the heart, the heart wall is extracted inward by the blur width, so that the left ventricle appears smaller than its actual size.
- According to one aspect of the present invention, there is provided an ultrasonic diagnostic apparatus including an ultrasonic probe that transmits and receives ultrasonic waves to and from a subject, an image generation unit that is connected to the ultrasonic probe and generates an ultrasonic image from the ultrasonic signal obtained by the ultrasonic probe, a control unit that is connected to and controls the ultrasonic probe and the image generation unit, and a display unit that is connected to the image generation unit and the control unit and displays, under the control of the control unit, the ultrasonic image generated by the image generation unit, the apparatus further comprising: selection means for selecting a part at which the position of a boundary of an organ of the subject is to be detected; boundary extraction filter setting means for setting a boundary extraction filter consisting of two regions spaced apart by a predetermined interval on the ultrasonic image; and boundary position detecting means for detecting the position of the boundary by analyzing the pixel data within the boundary extraction filter set by the boundary extraction filter setting means in the vicinity of the part selected by the selection means, wherein the boundary position detected by the boundary position detecting means is displayed on the display unit under the control of the control unit.
- According to another aspect of the present invention, there is provided an ultrasonic diagnostic method capable of extracting the position of an organ boundary appearing on an ultrasonic image, the method including: (3) a step of setting, as a boundary extraction filter, two regions having an interval according to the blur width calculated in step (2); (4) a step of extracting the position of the boundary by obtaining the position and/or inclination at which the boundary strength becomes maximum or equal to or greater than a predetermined value; and (5) a step of calculating, based on the obtained boundary position, the area of the region surrounded by the boundary or the volume of the organ represented by the region surrounded by the boundary.
- An object of the present invention is to extract a contour with high accuracy, taking into account the blur width appearing on the image, in an ultrasonic diagnostic apparatus that extracts an organ boundary (contour) using an ultrasonic image.
- FIG. 1 is a block diagram showing the overall configuration of an ultrasound diagnostic apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing the inside of the control unit 3.
- FIG. 3 is a diagram illustrating an example of image data to be subjected to boundary extraction processing according to the first embodiment.
- FIG. 5 is a flowchart of boundary extraction processing according to the first embodiment.
- FIG. 6 is a diagram showing how the generated speckle differs depending on the type of ultrasonic probe used.
- FIG. 8 is a diagram explaining how the boundary between the two regions 28-1 and 28-2 is blurred by speckle appearing on the image.
- FIG. 9 is a diagram showing a boundary extraction filter created for accurately extracting a boundary formed in a closed region.
- FIG. 10 is a diagram illustrating the statistical values used to calculate the boundary strength based on the degree of separation.
- FIG. 11 is a diagram showing a boundary extraction filter in which the gap region sandwiched between the two regions is bent according to a bend in the assumed boundary.
- FIG. 13 is a flowchart of boundary extraction processing according to the second embodiment.
- FIG. 14 is a diagram showing an example of changing the interval between filter regions without changing the shape of two filter regions.
- FIG. 15 is a diagram showing an example of changing the shape of two filter regions.
- Embodiment 1: An ultrasonic diagnostic apparatus according to an embodiment of the present invention will be described with reference to the drawings.
- FIG. 1 is a block diagram showing an overall configuration of an ultrasonic diagnostic apparatus according to Embodiment 1 of the present invention.
- As shown in FIG. 1, the ultrasonic diagnostic apparatus includes: an ultrasonic probe 1 that transmits and receives ultrasonic waves; an image generation unit 2 that is connected to the ultrasonic probe 1 and generates an image based on the ultrasonic signal received by the ultrasonic probe; a control unit 3, such as a CPU (Central Processing Unit), that is connected to each component of the apparatus and controls the operation and arithmetic processing of each component; an operation unit 4 that is connected to each component and allows an operator such as a medical staff member to operate the apparatus through input devices (keyboard, mouse, trackball, touch panel, etc.); a storage unit 5 that is connected to each component and stores image data, programs, and the like; and a display unit 6, such as a CRT or liquid crystal display device, that is connected to each component and displays images, measurement results, and the like.
- The ultrasonic probe 1 transmits and receives ultrasonic waves to and from the living body of a subject. Examples include a linear type in which transducers are arranged in a straight line, a sector type that changes the angle of the beam by driving the transducers with a time difference, and a convex type that scans along a curved transducer array.
- the ultrasonic probe 1 converts the ultrasonic wave (ultrasonic echo) reflected and returned from the living body of the subject into an electrical signal and sends it to the image generation unit 2.
- the image generation unit 2 generates a B-mode image using the signal received by the ultrasonic probe 1 and converted into an electrical signal as an input signal.
- the input signal is converted into a B-mode image through a phasing adder, a logarithmic amplifier, an envelope detector, an A / D converter, and a scan converter in the image generation unit 2.
- The control unit 3 operates by loading and executing a control program for the ultrasonic diagnostic apparatus stored in the storage unit 5 or the like.
- the control unit 3 gives an operation instruction to each component of the ultrasonic diagnostic apparatus, and performs timing control and arithmetic processing.
- The operation unit 4 comprises input devices such as a keyboard, mouse, trackball, and touch panel on the ultrasonic diagnostic apparatus, and is used by a diagnostician such as a medical staff member to adjust image quality, give instructions for measurement, input information, and so on.
- the storage unit 5 is a device that stores image data, a control program, and the like, and is a hard disk, a general-purpose memory, a frame memory, or the like.
- the image data stored in the storage unit 5 is an acquired B-mode image or an image format file that can be displayed on a general PC.
- the display unit 6 is a CRT, a liquid crystal display device, or the like that displays image data, measurement values, and an image obtained by graphing the measurement values on a screen.
- FIG. 2 shows the inside of the control unit 3 according to the first embodiment of the present invention.
- The control unit 3 includes: boundary extraction part designating means 7 for designating at which part on the ultrasonic image the boundary of the organ is to be extracted; boundary extraction calculation means 8 that extracts the boundary of the organ in the vicinity of the part designated by the boundary extraction part designating means 7; and organ measuring means 9 for calculating, based on the boundary extracted by the boundary extraction calculation means 8, various physical quantities of the organ, that is, distances such as the organ size, areas on the organ image, estimated values of the organ volume, and the like.
- The boundary extraction part designating means 7 is a means by which the operator designates, via the input device, the vicinity of the boundary extraction target on the image displayed on the screen of the display unit 6. Alternatively, the boundary extraction part designating means 7 may determine the extraction part automatically by performing signal processing on the pixel values of the acquired image data.
- The boundary extraction calculation means 8 calculates the image blur width at the portion of the image designated by the boundary extraction part designating means 7 and extracts the boundary using a filter that takes this blur width into account. For this purpose it includes image blur width calculating means 10, filter shape creating/deforming means 11, boundary strength calculating means 12, and boundary position detecting means 13.
- the image blur width calculating means 10 is a means for calculating the speckle size using the pixel values in the vicinity of the boundary extraction part designated by the boundary extraction part designation means 7.
- As a method for calculating the speckle size, for example, a gray-level co-occurrence matrix or an autocorrelation function is used.
- the filter shape creation / deformation means 11 creates and transforms the boundary extraction filter.
- The filter shape creation/deformation means 11 creates a filter consisting of two regions spaced apart by a distance based on the speckle size calculated by the image blur width calculation means 10. The filter shape creation/deformation means 11 can also deform the shape of the filter according to the assumed shape of the boundary.
- The boundary strength calculation means 12 moves the boundary extraction filter created by the filter shape creation/deformation means 11 to arbitrary positions and/or inclinations and, at each position and/or inclination, calculates the boundary strength using the pixel values in the two regions, for example by calculating the degree of separation described later.
- The boundary position detection means 13 detects the position and/or inclination of the boundary extraction filter at which the boundary strength calculated by the boundary strength calculation means 12 becomes maximum or equal to or greater than a predetermined value while the position and/or inclination of the filter is scanned. Based on the detected position of the boundary extraction filter, the coordinate value of the target boundary position can be obtained.
- the organ measuring means 9 calculates various physical quantities related to the organ from which the boundary is extracted, for example, distance, area, volume, and the like, using the coordinate value of the extracted boundary position.
- For example, the organ measuring means 9 calculates, with high accuracy, a physical quantity such as the size of a tumor in the affected area that is the target region.
- FIG. 3 shows an example of image data to be subjected to boundary extraction processing in the first embodiment.
- In FIG. 3, 14 is the image data, 15 is the beam direction, 16 is the beam depth, 17-1 to 17-3 are speckles, and 18 is the boundary position obtained by the conventional method.
- FIG. 4 shows a comparison between the boundary position 18 calculated by the conventional method and the true boundary position 19. According to this, it can be seen that there is a large error between the boundary position 18 calculated by the conventional method and the true boundary position 19.
- the ultrasonic probe 1 and the image generation unit 2 acquire image data 14 obtained by imaging a patient's organ and the like, and start boundary extraction for a target region such as the organ.
- the ultrasonic diagnostic apparatus selects and inputs a part to be subjected to boundary extraction manually or automatically by the boundary extraction part designating means 7.
- Next, the ultrasonic diagnostic apparatus calculates the blur width on the image at the boundary extraction site specified in step 20 by means of the image blur width calculation means 10. Specifically, texture analysis is performed on the pixel value data in the vicinity of the boundary extraction region, the speckle appearing on the image is approximated by an ellipse, and half of the width of the approximated ellipse (the length of its major or minor axis) is calculated as the blur width.
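- As a rough illustration of this step (not the patent's implementation), the following sketch estimates the speckle size from a 2-D autocorrelation of a region of interest and returns half of it as the blur width; the function name, the FFT-based autocorrelation, and the half-maximum width criterion are assumptions made for the example.

```python
# Hedged sketch: speckle-size (blur-width) estimation from a region of interest.
import numpy as np

def estimate_blur_width(roi: np.ndarray, axis: int = 0) -> float:
    """Return the blur width in pixels along `axis` (0 = depth, 1 = lateral)."""
    x = roi - roi.mean()
    # 2-D autocorrelation via the Wiener-Khinchin theorem, normalized to 1 at zero lag
    acf = np.fft.ifft2(np.abs(np.fft.fft2(x)) ** 2).real
    acf = np.fft.fftshift(acf) / acf.max()
    center = np.array(acf.shape) // 2
    # profile through the zero-lag point along the requested direction
    profile = acf[center[0], :] if axis == 1 else acf[:, center[1]]
    # full width at half maximum of the central lobe ~ speckle size along this axis
    # (assumes the central lobe dominates the profile)
    above = np.where(profile >= 0.5)[0]
    fwhm = above.max() - above.min() + 1
    return fwhm / 2.0  # half of the speckle width, as in step 21
```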
- FIG. 6 shows what kind of speckle is generated depending on the type of the ultrasonic probe to be used.
- FIG. 6(a) shows an example of a linear ultrasonic probe 25-1. When the speckle 17 is approximated by an ellipse, its major and minor axes coincide with the horizontal and vertical directions on the screen. Therefore, when extracting the speckle size by ellipse approximation, the blur width is calculated with the major and minor axes aligned with the horizontal and vertical directions on the screen.
- FIG. 6(b) shows an example of the sector-type ultrasonic probe 25-2. When the speckle 17 is approximated by an ellipse, one of its major and minor axes is slanted on the screen, and that direction is the direction in which ultrasonic waves are transmitted and received. Therefore, when extracting the speckle size by ellipse approximation, the blur width is calculated with either the major or the minor axis aligned with that oblique direction, that is, the direction of ultrasonic transmission and reception.
- Next, the filter shape creation/deformation means 11 creates a boundary extraction filter having two regions, as shown at 26-1 and 26-2 in FIGS. 7(a) and (b).
- The boundary extraction filters (27-1 and 27-2) created in this step each consist of the two regions 26-1 and 26-2, and the distance 29 between the two opposing sides 28-1 and 28-2 is set equal to the blur width obtained in step 21.
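- A minimal sketch of such a two-region filter, assuming rectangular regions stacked across the gap (region sizes and names are illustrative; the curved and ring-shaped variants of FIGS. 9 and 11 are not covered), could look like this:

```python
# Hedged sketch: two rectangular filter regions separated by a gap equal to the blur width.
import numpy as np

def make_boundary_filter(region_h: int, region_w: int, blur_width: int):
    """Return boolean masks (region1, region2) defined on a common window."""
    total_h = 2 * region_h + blur_width
    region1 = np.zeros((total_h, region_w), dtype=bool)
    region2 = np.zeros((total_h, region_w), dtype=bool)
    region1[:region_h, :] = True               # upper region (corresponds to 26-1)
    region2[region_h + blur_width:, :] = True  # lower region (corresponds to 26-2)
    return region1, region2
```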
- FIG. 8 (a) shows two regions and a true boundary position 19 located between them.
- Fig. 8 (b) shows one of the speckles 17 appearing on the image, and 17-4 is the width of the speckle in the horizontal direction of the drawing.
- Fig. 8 (c) shows the profile of the ultrasound image and shows the pixel distribution on the line segment that crosses the true boundary position. According to this, it can be seen that a region (30-2) having a high pixel value protrudes to the left side of the drawing because the speckle has a width in the horizontal direction of the drawing.
- The lower parts of FIGS. 7(a) and (b) show pixel-value profiles: in FIG. 7(a) the pixel value is higher on the right side of the drawing, and in FIG. 7(b) it is higher on the left side, each profile being blurred in the same way as shown in FIG. 8(c). The distance between the opposing sides 28-1 and 28-2 of the two regions is set equal to the blur width of this profile. This blur width is the one obtained in step 21 and is set, for example, to 1/2 of the speckle width in FIG. 8(b).
- FIG. 9 shows a boundary extraction filter created for accurately extracting a boundary formed in a closed region. More specifically, as shown in the pixel-value profile in the lower part of FIG. 9, it is used to extract the boundary position between the surrounding high-pixel-value regions (31-1 and 31-2) and the low-pixel-value region (31-3) enclosed by them.
- In this case, a boundary extraction filter is created in which the inner circular region 32-1 and the outer ring-shaped region 32-2 are separated from each other by a gap 33 whose width equals the blur width calculated in step 21.
- Next, the boundary strength calculation means 12 calculates the boundary strength by scanning the boundary extraction filter created in step 22 within the image.
- Here, the boundary strength calculation will be described for the case of using the degree of separation, which is the ratio of the between-class variance to the total variance of the two regions.
- The between-class variance and total variance of the two regions are as described in the IEICE Transactions, Vol. J63-D, No. 4, pp. 349-356: the between-class variance is obtained from the mean pixel values of the two regions, whereas the total variance is obtained using the pixel data of the two regions as they are.
- FIG. 10 is a diagram for explaining the statistical values used to calculate the boundary strength based on the degree of separation, and shows the two regions 26-1 and 26-2 and the blur width 29.
- The numbers of pixels in regions 26-1 and 26-2 are N1 and N2, respectively.
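- Equations (1) to (3) referred to below are not reproduced in this text; as an assumption, a standard formulation of the degree of separation consistent with the description above (between-class variance divided by the total variance of the pooled pixel values, with means μ1 and μ2 of the two regions and overall mean μ) is:

```latex
% Hedged reconstruction, not the patent's equations (1)-(3) verbatim.
\sigma_B^2 = \frac{N_1(\mu_1-\mu)^2 + N_2(\mu_2-\mu)^2}{N_1+N_2} \quad \text{(between-class variance)}
\qquad
\sigma_T^2 = \frac{1}{N_1+N_2}\sum_{i=1}^{N_1+N_2}(x_i-\mu)^2 \quad \text{(total variance)}
\qquad
\eta = \frac{\sigma_B^2}{\sigma_T^2}, \qquad 0 \le \eta \le 1
```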
- The boundary extraction filter consisting of regions 26-1 and 26-2 shown in FIG. 7 is scanned over the image by moving it to different positions and/or inclinations, and the degree of separation (boundary strength) is calculated at each position using equations (1) to (3). A distribution of the boundary strength as shown in FIG. 8(d) is thereby obtained, and the position where the boundary strength is maximum is detected as the true boundary position 19.
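- As a hedged sketch of this scan (the names and the restriction to vertical sliding are assumptions for the example; the patent also allows changing the inclination), the degree of separation can be computed and maximized as follows, using the region masks from the earlier sketch:

```python
# Hedged sketch: degree of separation and a simple vertical scan of the filter.
import numpy as np

def separability(pix1: np.ndarray, pix2: np.ndarray) -> float:
    """Between-class variance divided by total variance of the pooled pixel values."""
    pooled = np.concatenate([pix1.ravel(), pix2.ravel()])
    total_var = pooled.var()
    if total_var == 0.0:
        return 0.0
    n1, n2 = pix1.size, pix2.size
    mu = pooled.mean()
    between = (n1 * (pix1.mean() - mu) ** 2 + n2 * (pix2.mean() - mu) ** 2) / (n1 + n2)
    return between / total_var  # degree of separation, in [0, 1]

def scan_boundary(image: np.ndarray, region1: np.ndarray, region2: np.ndarray, rows):
    """Slide the filter window down the image; return the row of peak boundary strength."""
    win_h, win_w = region1.shape
    strengths = []
    for r in rows:
        window = image[r:r + win_h, :win_w]
        strengths.append(separability(window[region1], window[region2]))
    return rows[int(np.argmax(strengths))], strengths
```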
- Although the boundary extraction filters shown in FIGS. 7(a) and (b) have the same shape, the direction of the region with the higher pixel value differs between FIGS. 7(a) and 7(b), and the reference point used for boundary detection is changed accordingly. For example, in the case of FIG. 7(a), the pixel value is higher on the right side of the drawing, so in the region 26-2 set on the higher-pixel-value side of the two regions constituting the boundary extraction filter, the position of the side facing the other region (26-1) is extracted as the reference point (33) for detecting the boundary position; in the case of FIG. 7(b), the corresponding side of the region on the higher-pixel-value side is used in the same manner. Likewise, in FIG. 9, the position of the inner edge of the region with the higher pixel value (the outer ring-shaped region 32-2) is extracted as the reference point (35) for detecting the true boundary position 19.
- The obtained boundary position (boundary information) is displayed on the display unit 6, and the organ measuring means 9 calculates a physical quantity such as the size of the target region using the boundary information of the region whose boundary has been extracted; the calculated value is then displayed and recorded.
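- The patent does not specify how such physical quantities are computed from the boundary coordinates; one common choice, shown here purely as an assumption, is the shoelace formula for the area enclosed by the detected contour points:

```python
# Hedged sketch: area enclosed by a closed boundary given as coordinate arrays.
import numpy as np

def enclosed_area(xs: np.ndarray, ys: np.ndarray) -> float:
    """Area of the closed polygon (x_i, y_i); coordinates in physical units (e.g. mm)."""
    return 0.5 * abs(np.dot(xs, np.roll(ys, -1)) - np.dot(ys, np.roll(xs, -1)))
```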
- As described above, the blur width is calculated from the ultrasonic image, the shape and size of the boundary extraction filter are set based on the blur width, and the boundary position of the target region is extracted; therefore, the boundary of the target region can be extracted with high accuracy. More specifically, the boundary extraction filter used in the above embodiment consists of two regions, and the width of the gap sandwiched between the two regions is equal to the blur width calculated from the ultrasonic image (that is, a distance corresponding to, for example, half of the speckle size appearing on the ultrasonic image), so the true boundary position can be extracted with high accuracy in consideration of the blur width.
- In the conventional method, boundary extraction is performed on the assumption that the pixel value changes stepwise at the boundary, so the boundary position detection accuracy deteriorated as the boundary appearing on the actual image became dull due to the influence of speckle or the like.
- In contrast, in the present invention the boundary is extracted on the assumption that the boundary is dulled by the influence of speckle or the like, so the boundary position can always be extracted with the same accuracy as when the boundary is not dulled on the image. Then, by using the calculated coordinates of the true boundary position, a physical quantity such as the size of the target region can be calculated accurately, and an accurate ultrasonic diagnosis can be performed using that value.
- It may also be desirable to change the distance between the two regions constituting the boundary extraction filter according to the image acquisition depth.
- It is also considered desirable to set the boundary extraction filter so as to reflect the boundary shape of the target part.
- A boundary extraction filter with the shape shown in FIG. 7 or FIG. 9 may be used; however, if the assumed boundary is bent or if a specific part of the heart is to be extracted, a boundary extraction filter with the shape shown in FIG. 11 or FIG. 12 may be used instead. In FIG. 11, 36-1 and 36-2 are the two regions constituting the boundary extraction filter, and the gap region sandwiched between the two regions is bent according to the bend in the assumed boundary.
- FIG. 12(a) shows a four-chamber cross-sectional image of the heart; each of the filters indicated by 37-1 to 37-5 is for suitably extracting the boundary position at the corresponding location 38-1 to 38-5 of the four-chamber view.
- FIG. 12(b) shows a short-axis image of the heart; each of the filters indicated by 39-1 to 39-2 is for suitably extracting the boundary position at the corresponding location 40-1 to 40-2 of the short-axis image.
- In this way, when measuring the volume of a heart cavity, it is possible to prevent the contour from being extracted inward because of blur and thus to prevent underestimation of the cavity volume.
- Embodiment 2 of the present invention will be described with reference to FIGS. 13 to 15.
- In this embodiment, the boundary position is extracted by a search that uses the boundary extraction filter according to the present invention.
- For example, if the speckle distribution in the target area of the acquired image is not uniform, the statistical properties of the speckle are not correctly reflected in the gray-level co-occurrence matrix or the autocorrelation function, and the blur width cannot be calculated correctly. In such a case, the boundary position can be searched for according to the flowchart shown in FIG. 13. Each step of the flowchart of FIG. 13 will be described below in order. (Step 40)
- the ultrasonic diagnostic apparatus designates the boundary extraction part manually or automatically by the boundary extraction part designation means 7.
- Next, the filter shape creation/deformation means 11 creates a boundary extraction filter having an appropriate initial shape composed of two regions.
- Next, the boundary strength is calculated sequentially by the method described in step 23 to obtain its distribution.
- FIG. 14 is a diagram illustrating an example of changing the interval between the filter regions without changing the shape of the two filter regions.
- In FIG. 14, for example, the filter region interval is changed from the configuration shown at 44-1 and 44-2 to that shown at 45-1 and 45-2.
- FIG. 15 is a diagram illustrating an example in which the shapes of the two filter regions are changed.
- The boundary strength is calculated while changing any of the filter region interval, shape, position, and inclination of the boundary extraction filter, and the filter region interval, shape, and position that maximize the boundary strength are obtained. The boundary strength becomes maximum when the filter region interval matches the blur width in the target area of the ultrasonic image and the filter shape matches the shape of the boundary to be extracted. Therefore, the true boundary position can be detected by scanning the image while changing any of the filter region interval, shape, and position, and searching for the combination at which the boundary strength is maximized, as sketched below.
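- A minimal sketch of this search, under the assumption that only the filter-region interval and the vertical position are varied (it reuses the illustrative make_boundary_filter and scan_boundary functions from the earlier sketches), is:

```python
# Hedged sketch: joint search over filter-region interval (gap) and position.
def search_boundary(image, region_h, region_w, gaps, rows):
    best = (-1.0, None, None)                      # (strength, gap, row)
    for gap in gaps:                               # candidate filter-region intervals
        r1, r2 = make_boundary_filter(region_h, region_w, gap)
        row, strengths = scan_boundary(image, r1, r2, rows)
        strength = max(strengths)
        if strength > best[0]:
            best = (strength, gap, row)
    return best  # the best gap approximates the blur width; the best row, the boundary
```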
- the present invention is not limited to the above-described embodiments, and various modifications can be made without departing from the spirit of the present invention.
- For example, the present invention is not limited to an ultrasonic diagnostic apparatus and can also be applied to image measurement performed off-line on an electronic computer such as a personal computer.
- The boundary extraction filter may consist of two rectangular regions or of circular regions; the size of the two regions is arbitrary, and each region may consist of a small number of pixels. In the above description, half of the width (major- or minor-axis length) of the ellipse approximating the speckle obtained in step 21 was used as the blur width when creating the boundary extraction filter in step 22; however, a distance other than half may be used as the blur width depending on the nature of the speckle.
- The parameter used as the boundary strength is not limited to the degree of separation specified by equations (1) to (3); needless to say, it may be any index that represents how much the image data contained in the two regions differ, and an index based on another calculation method may also be used.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/301,350 US8300909B2 (en) | 2006-05-19 | 2007-05-14 | Ultrasonographic device and ultrasonographic method |
CN2007800182328A CN101448461B (zh) | 2006-05-19 | 2007-05-14 | 超声波诊断装置及边界提取方法 |
EP07743283.9A EP2047803A4 (en) | 2006-05-19 | 2007-05-14 | ULTRASONIC DEVICE AND METHOD |
JP2008516604A JP4879263B2 (ja) | 2006-05-19 | 2007-05-14 | 超音波診断装置及び超音波診断方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006-139865 | 2006-05-19 | ||
JP2006139865 | 2006-05-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007135884A1 true WO2007135884A1 (ja) | 2007-11-29 |
Family
ID=38723197
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/059848 WO2007135884A1 (ja) | 2006-05-19 | 2007-05-14 | 超音波診断装置及び超音波診断方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8300909B2 (ja) |
EP (1) | EP2047803A4 (ja) |
JP (1) | JP4879263B2 (ja) |
CN (1) | CN101448461B (ja) |
WO (1) | WO2007135884A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009125592A (ja) * | 2007-11-20 | 2009-06-11 | Medison Co Ltd | 適応的フィルタを用いて3次元超音波映像を形成する超音波映像装置及び方法 |
JP2010063495A (ja) * | 2008-09-08 | 2010-03-25 | Aloka Co Ltd | 超音波データ処理装置 |
WO2015198757A1 (ja) * | 2014-06-24 | 2015-12-30 | オリンパス株式会社 | 画像処理装置、内視鏡システム及び画像処理方法 |
KR20170041879A (ko) * | 2014-10-21 | 2017-04-17 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 검출영역을 선택하는 방법 및 장치 및 탄성 검출 시스템 |
KR20170042677A (ko) * | 2014-10-21 | 2017-04-19 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 간 경계 식별방법 및 시스템 |
WO2018110089A1 (ja) * | 2016-12-15 | 2018-06-21 | オムロン株式会社 | スジ状領域検出装置、スジ状領域検出方法、プログラム |
JP2020003234A (ja) * | 2018-06-25 | 2020-01-09 | 株式会社島津製作所 | 変位量測定装置、変位量測定方法および変位量測定プログラム |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014155825A1 (ja) * | 2013-03-29 | 2014-10-02 | 日立アロカメディカル株式会社 | 医療用診断装置およびその計測方法 |
US10579879B2 (en) | 2016-08-10 | 2020-03-03 | Vivint, Inc. | Sonic sensing |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08206117A (ja) | 1994-05-27 | 1996-08-13 | Fujitsu Ltd | 超音波診断装置 |
JPH0994248A (ja) | 1995-09-29 | 1997-04-08 | Hitachi Medical Corp | 超音波診断装置におけるスペックルノイズ判定方法及びスペックルノイズ判定除去回路を備えた超音波診断装置 |
JP2005205199A (ja) * | 2003-12-26 | 2005-08-04 | Fuji Photo Film Co Ltd | 超音波画像処理方法及び超音波画像処理装置、並びに、超音波画像処理プログラム |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5457754A (en) * | 1990-08-02 | 1995-10-10 | University Of Cincinnati | Method for automatic contour extraction of a cardiac image |
US8021301B2 (en) * | 2003-12-26 | 2011-09-20 | Fujifilm Corporation | Ultrasonic image processing apparatus, ultrasonic image processing method and ultrasonic image processing program |
US8031978B2 (en) * | 2004-06-30 | 2011-10-04 | Hitachi Aloka Medical, Ltd. | Method and apparatus of image processing to detect edges |
-
2007
- 2007-05-14 CN CN2007800182328A patent/CN101448461B/zh active Active
- 2007-05-14 US US12/301,350 patent/US8300909B2/en active Active
- 2007-05-14 EP EP07743283.9A patent/EP2047803A4/en not_active Withdrawn
- 2007-05-14 JP JP2008516604A patent/JP4879263B2/ja active Active
- 2007-05-14 WO PCT/JP2007/059848 patent/WO2007135884A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08206117A (ja) | 1994-05-27 | 1996-08-13 | Fujitsu Ltd | 超音波診断装置 |
JPH0994248A (ja) | 1995-09-29 | 1997-04-08 | Hitachi Medical Corp | 超音波診断装置におけるスペックルノイズ判定方法及びスペックルノイズ判定除去回路を備えた超音波診断装置 |
JP2005205199A (ja) * | 2003-12-26 | 2005-08-04 | Fuji Photo Film Co Ltd | 超音波画像処理方法及び超音波画像処理装置、並びに、超音波画像処理プログラム |
Non-Patent Citations (8)
Title |
---|
B.J. OOSTERVELD ET AL.: "TEXTURE OF B-MODE ECHOGRAMS: 3-D SIMULATIONS AND EXPERIMENTS OF THE EFFECTS OF DIFFRACTION AND SCATTERER DENSITY", ULTRASONIC IMAGING, vol. 7, 1985, pages 142 - 160 |
ITO M. ET AL.: "Choonpa Gazo no Kyokai Kyocho o Mokuteki to Shita Tekioteki Morphology Kahen Kazo Yoso no Seigyo", IMAGE LAB., vol. 15, no. 6, 1 June 2004 (2004-06-01), pages 9 - 13, XP008088167 * |
TRANSACTIONS OF THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS OF JAPAN (IEICE), vol. J63-D, no. 4, pages 349 - 356 |
NAGANO T. ET AL.: "Morphology Ensan o Mochiita Kahen Burokku Ho ni Yoru Iyo Choonpa Gazo no Ryoiki Bunkatsu", THE TRANSACTIONS OF THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS, vol. J84-A, no. 12, December 2001 (2001-12-01), pages 1444 - 1451, XP008088170 * |
O. BASSET ET AL.: "TEXTURE ANALYSIS OF ULTRASONIC IMAGES OF THE PROSTATE BY MEANS OF CO-OCCURRENCE MATRICES", ULTRASONIC IMAGING, vol. 15, 1993, pages 218 - 237 |
See also references of EP2047803A4 |
YAMAUCHI M. ET AL.: "Doteki Rinkaku Model ni Yoru Choonpa Shinsashitsu Yoseki Keisoku Ho", IEICE TECHNICAL REPORT, vol. 102, no. 137, 14 June 2002 (2002-06-14), pages 29 - 32, XP008088169 * |
ZAMA T. ET AL.: "Hanbetsu Kijun ni Motozuku Level Set Ho o Mochiita Kyobu MR Gazo no Ryoiki Chushutsu", IEICE TECHNICAL REPORT, vol. 106, no. 343, 6 November 2006 (2006-11-06), pages 49 - 53, XP008088168 * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009125592A (ja) * | 2007-11-20 | 2009-06-11 | Medison Co Ltd | 適応的フィルタを用いて3次元超音波映像を形成する超音波映像装置及び方法 |
JP2010063495A (ja) * | 2008-09-08 | 2010-03-25 | Aloka Co Ltd | 超音波データ処理装置 |
WO2015198757A1 (ja) * | 2014-06-24 | 2015-12-30 | オリンパス株式会社 | 画像処理装置、内視鏡システム及び画像処理方法 |
JP2016009984A (ja) * | 2014-06-24 | 2016-01-18 | オリンパス株式会社 | 画像処理装置、内視鏡システム及び画像処理方法 |
US10360474B2 (en) | 2014-06-24 | 2019-07-23 | Olympus Corporation | Image processing device, endoscope system, and image processing method |
US10354390B2 (en) | 2014-10-21 | 2019-07-16 | Wuxi Hisky Medical Technologies Co., Ltd. | Liver boundary identification method and system |
KR20180058228A (ko) * | 2014-10-21 | 2018-05-31 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 간 경계 식별방법 및 시스템 |
KR101894212B1 (ko) | 2014-10-21 | 2018-08-31 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 간 경계 식별방법 및 시스템 |
KR101913976B1 (ko) * | 2014-10-21 | 2018-10-31 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 검출영역을 선택하는 방법 및 장치 및 탄성 검출 시스템 |
KR101913977B1 (ko) | 2014-10-21 | 2018-10-31 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 간 경계 식별방법 및 시스템 |
KR20170042677A (ko) * | 2014-10-21 | 2017-04-19 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 간 경계 식별방법 및 시스템 |
KR20170041879A (ko) * | 2014-10-21 | 2017-04-17 | 우시 히스키 메디칼 테크놀로지스 컴퍼니., 리미티드. | 검출영역을 선택하는 방법 및 장치 및 탄성 검출 시스템 |
US10748291B2 (en) | 2014-10-21 | 2020-08-18 | Wuxi Hisky Medical Technologies Co., Ltd. | Liver boundary identification method and system |
US10925582B2 (en) | 2014-10-21 | 2021-02-23 | Wuxi Hisky Medical Technologies Co., Ltd. | Method and device for selecting detection area, and elasticity detection system |
WO2018110089A1 (ja) * | 2016-12-15 | 2018-06-21 | オムロン株式会社 | スジ状領域検出装置、スジ状領域検出方法、プログラム |
JP2018097717A (ja) * | 2016-12-15 | 2018-06-21 | オムロン株式会社 | スジ状領域検出装置およびスジ状領域検出方法 |
US10846869B2 (en) | 2016-12-15 | 2020-11-24 | Omron Corporation | Streak-like region detecting device, streak-like region detecting method, and program |
JP2020003234A (ja) * | 2018-06-25 | 2020-01-09 | 株式会社島津製作所 | 変位量測定装置、変位量測定方法および変位量測定プログラム |
Also Published As
Publication number | Publication date |
---|---|
US8300909B2 (en) | 2012-10-30 |
US20090163812A1 (en) | 2009-06-25 |
EP2047803A1 (en) | 2009-04-15 |
JP4879263B2 (ja) | 2012-02-22 |
JPWO2007135884A1 (ja) | 2009-10-01 |
CN101448461B (zh) | 2011-04-06 |
CN101448461A (zh) | 2009-06-03 |
EP2047803A4 (en) | 2014-02-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200780018232.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07743283 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008516604 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007743283 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12301350 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |