WO2011052602A1 - Ultrasonic imaging device, ultrasonic imaging method and program for ultrasonic imaging - Google Patents
- Publication number
- WO2011052602A1 (PCT/JP2010/068988)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- region
- similarity
- distribution
- interest
- imaging apparatus
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/085—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- the present invention relates to an ultrasonic imaging method and an ultrasonic imaging apparatus that can clearly identify a tissue boundary when imaging a living body using ultrasonic waves.
- In a known method, the elastic modulus distribution of a tissue is estimated from the amount of change in small regions of a diagnostic moving image (B-mode image), and the hardness is displayed after conversion into a color map.
- However, the acoustic impedance and elastic modulus of a lesion may not differ greatly from those of the surrounding tissue, in which case the boundary with other tissues cannot be grasped.
- Patent Document 2 proposes a technique that makes it possible to identify a tissue boundary whose acoustic impedance and elastic modulus do not differ significantly from the surroundings, by creating a scalar-field image directly from the motion vectors of a diagnostic moving image.
- An object of the present invention is to provide an ultrasonic imaging apparatus capable of discriminating a noise region where an echo signal is weak.
- To achieve this object, the ultrasonic imaging apparatus of the present invention includes a transmission unit that transmits ultrasonic waves toward a target, a reception unit that receives ultrasonic waves returning from the target, and a processing unit that processes the reception signal of the reception unit to generate an image of two or more frames.
- the processing unit sets a region of interest at a predetermined position or a position received from the operator by using one frame of the generated two or more frames as a reference frame.
- Another frame is set as a comparison frame, a search region wider than the region of interest is set at a predetermined position or a position received from the operator, and a plurality of candidate regions, which are candidates for the movement destination of the region of interest, are set within the search region.
- the similarity between the image characteristic values in the region of interest and the candidate region is calculated for each candidate region, and the similarity distribution over the entire search region is obtained. This makes it possible to determine whether the region of interest is a noise region based on the similarity distribution.
- The processing unit obtains a statistic that compares the minimum similarity with the similarity distribution as a whole, and determines the reliability of the region of interest based on the statistic. Specifically, for example, the processing unit can determine the reliability of the region of interest by calculating the above-described statistic from the minimum value, average value, and standard deviation of the similarity, and comparing the calculated statistic with a threshold value.
- The processing unit can generate a vector connecting the position corresponding to the region of interest in the comparison frame to the position of the candidate region with the minimum similarity, and replaces the vector of any region of interest determined to have low reliability with zero or a predetermined vector. An erroneous vector can thereby be removed and the accuracy of the vector field improved.
- the processing unit can calculate the average, minimum value, and standard deviation of the similarity for the similarity distribution, and use the degree of separation obtained by dividing the difference between the average and the minimum value by the standard deviation as the statistic. Further, for example, it is also possible to calculate the average and standard deviation of the similarity for the similarity distribution, and use the coefficient of variation obtained by dividing the standard deviation by the average as the statistic.
- The threshold value to be compared with the above statistic can be obtained as follows. For example, a plurality of regions of interest are set, the statistic is obtained for each, and a histogram showing the frequency of the obtained statistic values is generated; the median or average value of the histogram is used as the threshold, or, when the histogram has a plurality of peaks, the statistic at the minimum of the valley between the peaks is used as the threshold value.
- the smoothing process uses, for example, a method in which a filter having a predetermined size is set in the similarity distribution and the process of smoothing the distribution in the filter is repeated while moving the filter by a predetermined amount.
- the size of the filter can be determined as follows. A vector connecting the position corresponding to the region of interest in the comparison frame and the position of the candidate region having the minimum similarity in the similarity distribution before the smoothing process is generated in advance for a plurality of regions of interest. Of the generated vectors, the maximum vector length is set as the filter size.
- The processing unit may include: first processing means for obtaining a similarity distribution for a region of interest set near a tumor boundary in the living body, generating a similarity-distribution image having the similarity as the image characteristic value, and setting one-dimensional regions of a predetermined length in a plurality of different directions centered on the position corresponding to the region of interest on the similarity-distribution image; second processing means for calculating the sum of similarities in the one-dimensional region for each set direction; third processing means for calculating the ratio between the similarity sum in the direction in which the sum is minimum and the similarity sum of the one-dimensional region in the orthogonal direction; and fourth processing means for determining the degree of tumor invasion based on the ratio.
- When the ratio calculated by the third processing means is smaller than a predetermined value set in advance, the target pixel is determined to be a point constituting the boundary line, so that the boundary line can be obtained.
- According to another aspect, the following ultrasonic imaging method is provided: an ultrasonic wave is transmitted toward a target; a reception signal obtained by receiving the ultrasonic waves returning from the target is processed to generate an image of two or more frames; a reference frame and a comparison frame are selected from the image; a region of interest is set in the reference frame; a search region wider than the region of interest is set in the comparison frame; a plurality of candidate regions that are candidates for the movement destination of the region of interest are set in the search region; the similarity of the image characteristic values between the region of interest and each candidate region is calculated for each candidate region; and the similarity distribution over the entire search region is obtained.
- According to another aspect, the following ultrasound imaging program is provided, comprising: a first step of selecting a reference frame and a comparison frame from two or more frames of ultrasound images; a second step of setting a region of interest in the reference frame; and a third step of setting a search region wider than the region of interest in the comparison frame.
- According to the present invention, it is possible to determine whether the region of interest is a noisy region based on the similarity distribution. As a result, the generation of erroneous vectors is suppressed, and highly accurate vector estimation is possible even in the penetration-limit region. The accuracy of the scalar-field image converted from the estimated motion-vector field is improved, and more appropriate boundary detection becomes possible.
- FIG. 1 is a block diagram showing an example system configuration of an ultrasound imaging apparatus according to Embodiment 1.
- FIG. 2 is a flowchart showing the image-generation processing procedure of the ultrasonic imaging apparatus according to the first embodiment; FIG. 3 is a flowchart showing details of the block matching process of step 24 of FIG. 2; FIG. 4 is a diagram explaining the block matching process of step 24 of FIG. 2 using a two-layer phantom.
- FIGS. 5A and 5B are diagrams showing examples of a B-mode image and a motion-vector distribution image produced in the first embodiment.
- FIG. 6A is a diagram showing an example of an SAD distribution image obtained by setting the ROI at position (3) in FIG. 5B, FIG. 6B shows an SAD distribution image obtained by setting the ROI at position (5) in FIG. 5B, FIG. 6C is a histogram of the SAD values shown in FIG. 6A, and FIG. 6D is a histogram of the SAD values shown in FIG. 6B.
- FIG. 7 is an explanatory diagram showing the definition of the degree of separation on the histogram of SAD values in the first embodiment.
- FIG. 8 is a flowchart showing details of the process of step 25 of FIG. 2 when the degree of separation is used.
- FIG. 10 is a flowchart showing details of the process of step 25 of FIG. 2 when the coefficient of variation is used.
- FIG. 12 is a diagram showing an example of a vector distribution image from which erroneous vectors have been removed in step 25 of FIG. 2; FIG. 13 is a flowchart of processing for removing noise when calculating the SAD distribution according to the second embodiment.
- A flowchart illustrating processing for obtaining the degree of infiltration according to the third embodiment.
- (A)-(h) Explanatory drawing which shows the area
- FIG. (c) is a diagram showing an image obtained by applying a Laplacian filter to the SAD value distribution shown in FIG. (a), and FIG. (d) is a diagram showing an image obtained by applying the Laplacian filter to the SAD value distribution shown in FIG. (b).
- FIG. 1 shows a system configuration of the ultrasonic imaging apparatus of the present embodiment.
- This apparatus has an ultrasonic boundary detection function.
- This apparatus includes an ultrasonic probe (probe) 1, a user interface 2, a transmission beamformer 3, a control system 4, a transmission/reception changeover switch 5, a reception beamformer 6, an envelope detection unit 7, a scan converter 8, a processing unit 10, a parameter setting unit 11, a combining unit 12, and a display unit 13.
- the ultrasonic probe 1 in which ultrasonic elements are arranged one-dimensionally transmits an ultrasonic beam (ultrasonic pulse) to a living body and receives an echo signal (received signal) reflected from the living body.
- a transmission signal having a delay time adjusted to the transmission focus is output by the transmission beamformer 3 and sent to the ultrasonic probe 1 via the transmission / reception changeover switch 5.
- The ultrasonic beam reflected or scattered in the living body and returning to the ultrasonic probe 1 is converted into an electric signal by the probe and sent as a reception signal to the receiving beamformer 6 via the transmission/reception changeover switch 5.
- The receive beamformer 6 is a complex beamformer that mixes two received signals 90 degrees out of phase; under the control of the control system 4, it performs dynamic focusing, adjusting the delay time according to the reception timing, and outputs real-part and imaginary-part RF signals.
- the RF signal is detected by the envelope detector 7 and then converted into a video signal, which is input to the scan converter 8 and converted into image data (B-mode image data).
- the ultrasonic boundary detection process is realized by the processing unit 10.
- the processing unit 10 includes a CPU 10a and a memory 10b.
- When the CPU 10a executes a program stored in advance in the memory 10b, the following processing is performed to detect the boundary of the subject tissue: based on the image data of two or more frames output from the scan converter 8, the processing unit 10 first creates a motion-vector field, and next converts the generated motion-vector field into a scalar field. The original image data and the corresponding motion-vector field or scalar field are combined by the combining unit 12 and then displayed on the display unit 13.
- the parameter setting unit 11 performs parameter setting for signal processing in the processing unit 10 and selection setting of a display image in the synthesis unit 12. These parameters are input from the user interface 2 by an operator (device operator).
- As the parameters for signal processing, for example, the setting of a region of interest on a desired frame m and the setting of a search region on a frame m+Δ different from frame m can be received from the operator.
- As the selection setting of the display image, for example, whether the original image and the vector-field image (or scalar image) are combined into one image and displayed, or whether two or more moving images are displayed side by side, can be received from the operator.
- FIG. 2 shows a flowchart of an example of boundary detection processing and image processing in the processing unit 10 and the combining unit 12 of the present invention.
- the processing unit 10 first acquires a measurement signal from the scan converter 8 and performs normal signal processing to create a B-mode moving image (steps 21 and 22).
- two frames of a desired frame and a frame having a different time are extracted from the B-mode moving image (step 23).
- the desired frame and the next frame are extracted.
- a motion vector field is calculated from the two frames (step 24).
- the motion vector field is calculated based on the block matching method.
- a noise removal process is performed on the calculated motion vector field (step 25), and the noise-removed motion vector field is converted to a scalar field (step 26).
- In step 27, the processing for one image is completed.
- FIG. 3 is a flowchart showing detailed processing in step 24, and FIG. 4 is a diagram for explaining block matching processing in step 24.
- The block matching process for calculating the motion-vector field in step 24 will now be described with reference to FIGS. 3 and 4, assuming, for example, Δ = 1 frame.
- In frame m, the processing unit 10 sets a region of interest (ROI; reference block) 31 having a predetermined number of pixels N, as shown in FIG. 4 (step 51).
- The luminance distribution of the pixels included in the ROI 31 is represented as Pm(i0, j0), where i0 and j0 indicate the position of a pixel in the ROI 31.
- Next, in frame m+Δ, the processing unit 10 sets a search area 32 of a predetermined size at and around the position corresponding to the ROI 31 of frame m (step 52).
- Here, the setting of the ROI 31 is described for a configuration in which the processing unit 10 sequentially sets the ROI 31 over the entire image of frame m and sets a search area 32 of a predetermined size centered on the ROI 31. Alternatively, the processing unit 10 can set a ROI 31 of predetermined position and size together with a search region 32 of predetermined size in its vicinity, or can set the region of interest (ROI) and search region at positions received from the operator via the parameter setting unit 11.
- the search area 32 is divided into a plurality of movement candidate areas 33 having the same size as the ROI 31.
- the processing unit 10 calculates the movement candidate area 33 having the highest similarity to the luminance of the ROI 31 and selects it as the movement destination area.
- As an index representing the degree of similarity, a sum of absolute differences, a mean square error, a cross-correlation value, or the like can be used.
- A case where the sum of absolute differences is used is described below as an example.
- The luminance distribution of the pixels included in the movement candidate area 33 in the search area 32 is represented as Pm+Δ(i, j), where i and j indicate the position of a pixel in the movement candidate area 33.
- The processing unit 10 calculates the sum of absolute differences (SAD) between the luminance distribution Pm(i0, j0) of the ROI 31 and the luminance distribution Pm+Δ(i, j) of the movement candidate region 33 (step 53).
- SAD is defined by the following equation (1), where the sum is taken over the N corresponding pixel pairs of the two blocks: SAD = Σ |Pm+Δ(i, j) − Pm(i0, j0)| ... (1)
- the processing unit 10 obtains the SAD value with the ROI 31 for all the movement candidate areas 33 in the search area 32, and determines the movement candidate area 33 having the smallest SAD value in the obtained SAD distribution as the movement destination area. Then, a motion vector connecting the position of the ROI 31 and the position of the movement candidate area 33 with the minimum SAD value is determined (step 54).
- the processing unit 10 determines the motion vector for the entire image of the frame m by repeating the above process while moving the ROI 31 over the entire image of the frame m (step 55).
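The block matching of steps 51 to 54 can be illustrated with a minimal sketch. This is not the patented implementation, only an exhaustive-search illustration in Python/NumPy; the 30-pixel block and 50-pixel search region follow the sizes given later for FIG. 6, and the function and variable names are invented for the example.

```python
import numpy as np

def sad(roi, candidate):
    """Sum of absolute differences between two equally sized blocks (equation (1))."""
    return np.abs(candidate.astype(float) - roi.astype(float)).sum()

def block_match(frame_m, frame_next, top, left, block=30, search=50):
    """Find the motion vector of one ROI by exhaustive search.

    frame_m / frame_next : reference frame m and comparison frame m+delta
    (top, left)          : ROI position in frame m
    block, search        : block and search-region sizes in pixels
    Returns ((dy, dx), min_sad) for the candidate block with the minimum SAD.
    """
    roi = frame_m[top:top + block, left:left + block]
    margin = (search - block) // 2        # search region centred on the ROI
    best, best_vec = np.inf, (0, 0)
    for dy in range(-margin, margin + 1):          # 21 x 21 candidates for 30/50
        for dx in range(-margin, margin + 1):
            y, x = top + dy, left + dx
            s = sad(roi, frame_next[y:y + block, x:x + block])
            if s < best:
                best, best_vec = s, (dy, dx)
    return best_vec, best

# Demo: the comparison frame is the reference frame shifted 3 pixels to the right,
# so the minimum-SAD candidate should lie at (dy, dx) = (0, 3).
rng = np.random.default_rng(0)
frame_m = rng.integers(0, 256, size=(100, 100)).astype(float)
frame_next = np.roll(frame_m, 3, axis=1)
vec, best_sad = block_match(frame_m, frame_next, top=20, left=20)
```

Repeating this over every ROI position in frame m yields the motion-vector field of step 55.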
- FIGS. 5A and 5B are examples of B-mode images and motion vector distribution images obtained by the above processing.
- the B-mode image in FIG. 5A is obtained by superposing the gel base material phantoms 41 and 42 in two layers and fixing the ultrasonic probe to the upper phantom 41 and moving it laterally.
- In FIG. 5B, the region corresponding to the upper phantom 41, to which the ultrasonic probe 1 is fixed, is relatively stationary, while the lower phantom 42 shows a vector field indicating lateral movement (horizontal arrows).
- In the lower part of the image, however, the arrows point obliquely upward, their direction is not constant, and the motion vectors are disturbed. This phenomenon is caused by a decrease in the S/N ratio (SNR) of detection sensitivity with increasing distance from the probe 1, and indicates the penetration limit; that is, erroneous vectors are generated in the low-SNR region far from the probe 1.
- FIGS. 6A and 6B are diagrams (SAD distribution diagrams) showing, as densities, the SAD values of the respective movement candidate areas 33 of the search area 32 of the next frame, obtained by setting the ROI 31 at positions (3) and (5) in FIG. 5B.
- FIGS. 6A and 6B each show a search area 32 as a whole, and the search area 32 is divided into 21 ⁇ 21 movement candidate areas 33.
- the block size of the movement candidate area 33 is 30 ⁇ 30 pixels, the search area 32 is 50 ⁇ 50 pixels, and the movement candidate area 33 is moved pixel by pixel within the search area 32.
- the search area 32 is set so that the position of the movement candidate area 33 corresponding to the position of the ROI 31 is located at the center of the search area 32.
- In FIG. 6A, the SAD value of the movement candidate region 33 at a position shifted to the right of the center of the search region 32 is minimum; therefore, a rightward lateral vector is determined in step 24 for position (3). As can be seen from FIG. 5B, position (3) is slightly below the boundary between the two-layer phantoms 41 and 42 (on the phantom 42 side), and a horizontal vector is displayed there.
- Also in FIG. 6A, the SAD values distributed around the minimum-SAD movement candidate region 33 form a region of small values (a valley of SAD values) extending horizontally, that is, in the direction along the boundary between the two-layer phantoms 41 and 42. This suggests that the boundary can be detected directly from the SAD distribution of FIG. 6A, without creating the entire motion-vector field by moving the ROI 31 over the whole of frame m.
- The SAD value distribution of FIG. 6B was obtained for position (5), in the region of FIG. 5B where the motion vectors are disturbed. The small SAD values spread uniformly over a wide area, so the minimum-SAD movement candidate region 33 is governed by noise, and the resulting motion vector points upward and is easily disturbed. It can also be seen that the region of small SAD values (the valley of SAD values) that should form around the minimum-SAD movement candidate region 33 is buried in the overall noise fluctuation and cannot be recognized.
- In contrast, in the SAD distribution of a reliable ROI, the SAD value of the minimum-SAD movement candidate region 33 is clearly smaller than that of the surrounding regions. In the present embodiment, therefore, a noisy ROI 31 is identified using the phenomenon that, as shown in FIG. 6B, the minimum-SAD movement candidate region 33 cannot be clearly recognized in the SAD distribution image of a search area 32 whose ROI 31 is set at a noisy position such as (5), and a process for removing the corresponding motion vector is performed.
- Specifically, in step 25 of FIG. 2, the processing unit 10 creates, from the SAD distribution of the search region 32 obtained for a ROI 31 set at a given position (FIGS. 6A and 6B), a histogram showing the distribution of SAD values, as shown in FIGS. 6C and 6D.
- For the reliable position (3), the SAD minimum value is sufficiently separated from the frequently occurring SAD values in the histogram, as shown in FIG. 6C; that is, the minimum SAD value and the range of high-frequency SAD values are well separated.
- In contrast, the histogram obtained when the ROI 31 is set at position (5) of FIG. 5B, shown in FIG. 6D, shows little frequency difference across the SAD values: the distribution is broad, the SAD minimum falls within the range of high-frequency SAD values, and the separation between the minimum and the high-frequency range is insufficient.
- From the degree of this separation, the reliability (degree of freedom from noise) of the signal of the ROI 31 corresponding to the search region 32, and hence the reliability of the motion vector determined in that search area, can be judged.
- a region with low reliability can be determined as a low SNR region, and the reliability of the corresponding motion vector can also be determined.
- an index is used to determine the degree of separation between the SAD minimum value and the frequent SAD value in the histogram of the SAD value distribution.
- FIG. 7 shows the concept of definition of the degree of separation.
- The degree of separation is a value corresponding to the distance between the distribution average of the histogram and the minimum value, and is defined by the following equation (2): degree of separation = (average − minimum) / standard deviation ... (2). In equation (2), normalization by the standard deviation avoids the influence of differences in the spread of the distribution.
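The statistic of equation (2) can be sketched directly from the definition, with the SAD values of all movement candidate regions in one search area as input. This is only an illustrative Python/NumPy helper, not the patented implementation; the function name is invented.

```python
import numpy as np

def separation_degree(sad_values):
    """Degree of separation (equation (2)): how far the SAD minimum lies below
    the distribution average, normalised by the standard deviation."""
    sad = np.asarray(sad_values, dtype=float)
    return (sad.mean() - sad.min()) / sad.std()
```

A reliable search area, whose minimum is an outlier well below the bulk of the SAD values, gives a large degree of separation; a broad, noise-dominated distribution whose minimum sits inside the bulk gives a small one.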
- FIG. 8 shows a flow of processing for discriminating a noisy region when the degree of separation is used as an index and removing a motion vector.
- This process specifically shows the process of step 25 in FIG. 2, and is performed for all ROIs 31 set in frame m in step 51 of FIG.
- In this process, the processing unit 10 first selects the target ROI 31 (step 81), and, using the SAD values of all the movement candidate regions 33 in the search region 32 set and calculated in steps 52 and 53 of FIG. 3 for that ROI, calculates the mean, minimum, and standard deviation of the SAD values by statistical processing (step 82). The degree of separation defined by the above equation (2) is then obtained (step 83). This is repeated for all ROIs 31 (step 84).
- For each ROI 31 whose degree of separation is smaller than a predetermined value, the motion vector obtained in step 54 of FIG. 3 is replaced with 0 (step 85).
- a region with low reliability can be determined from the motion vector image, and the vector (erroneous vector) can be removed.
- As the predetermined value, a preset threshold or the median of the distribution of separation degrees obtained for all ROIs 31 in step 84 can be used.
- Alternatively, when a histogram of the obtained separation degrees is generated and forms a plurality of frequency peaks, the separation degree at the valley position between the peak on the low-separation side and the peak on the high-separation side can also be used as the threshold value.
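The valley-based threshold choice above can be sketched as follows. This is a hypothetical helper, not part of the patent: the bin count, the naive local-maximum peak detection, and the median fallback for a unimodal histogram are all assumptions made for the example.

```python
import numpy as np

def valley_threshold(stats, bins=32):
    """Pick a threshold at the lowest-frequency valley between the outermost
    histogram peaks of the per-ROI statistics; fall back to the median when
    the histogram is unimodal (assumed fallback, bins=32 assumed)."""
    hist, edges = np.histogram(stats, bins=bins)
    # crude local maxima of the interior histogram bins
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1] and hist[i] > 0]
    if len(peaks) < 2:                       # unimodal: use the median instead
        return float(np.median(stats))
    lo, hi = peaks[0], peaks[-1]
    valley = lo + int(np.argmin(hist[lo:hi + 1]))
    # return the centre of the lowest-frequency bin between the peaks
    return float(0.5 * (edges[valley] + edges[valley + 1]))

# Demo: a clearly bimodal set of statistics (two groups around 2 and 8)
bimodal = np.array([1] * 10 + [2] * 30 + [3] * 10 +
                   [7] * 10 + [8] * 30 + [9] * 10, dtype=float)
threshold = valley_threshold(bimodal)
```

A production implementation would typically smooth the histogram before peak-picking, but the sketch shows the idea of taking the valley minimum as the reliability threshold.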
- FIG. 9 shows an image of the degree of separation obtained for all ROIs 31 in step 83.
- 33 ⁇ 51 ROIs 31 are set in the frame m, and the separation degree of each ROI 31 is indicated by the density.
- the degree of separation is low in the low SNR region below the frame m, and it can be seen that the degree of separation reflects the reliability of motion vector estimation.
- In the above description, the degree of separation is used, but another index can be used to judge how well the SAD minimum value is separated from the frequently occurring SAD values.
- For example, a coefficient of variation can be used. The coefficient of variation is defined by the following equation (3): coefficient of variation = standard deviation / average ... (3). It is a statistic obtained by normalizing the standard deviation by the average, and indicates the magnitude of the variation of the distribution (that is, how difficult the minimum value is to separate).
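Equation (3) and the subsequent vector-removal rule (a coefficient of variation above the threshold means low reliability) can be sketched as below. This is an illustrative Python/NumPy sketch; the helper names and the dictionary representation of the vector field are invented for the example.

```python
import numpy as np

def coefficient_of_variation(sad_values):
    """Coefficient of variation (equation (3)): standard deviation / average."""
    sad = np.asarray(sad_values, dtype=float)
    return sad.std() / sad.mean()

def zero_low_reliability(vectors, cv_map, threshold):
    """Replace the motion vector of each ROI whose coefficient of variation
    exceeds the threshold with the zero vector (stationary state, step 85)."""
    return {roi: ((0, 0) if cv_map[roi] > threshold else v)
            for roi, v in vectors.items()}
```

A flat SAD distribution (all candidates equally good) has zero variation, while a broad noisy one has a large coefficient; the threshold can again be a preset value or the median over all ROIs.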
- FIG. 10 shows a flow of processing for removing a vector in a noisy area when the coefficient of variation is used as an index.
- The target ROI 31 is selected in the same manner as in the processing flow of FIG. 8 (step 81), and, using the SAD values of all movement candidate regions 33 in the search area 32 set and calculated in steps 52 and 53 of FIG. 3, the average and standard deviation of the SAD values are calculated by statistical processing (step 101).
- the coefficient of variation defined by the above equation (3) is obtained (step 102).
- This is repeated for all ROIs 31 (step 84). For each ROI 31 whose coefficient of variation obtained in step 102 is larger than a predetermined value, the motion vector obtained in step 54 of FIG. 3 is replaced with 0 (step 85).
- a predetermined threshold value or the median value of the distribution of variation coefficients obtained for all ROIs 31 in step 84 can be used.
- a histogram of the obtained variation coefficient is generated and the frequency exhibits two peaks, it is also effective to adopt the minimum value of the valley between the two peaks as the predetermined value.
- FIG. 11 shows, as an image, the coefficient of variation obtained for all ROIs 31 in step 102, with the coefficient of variation of each ROI 31 indicated by its density.
- the coefficient of variation increases in the low SNR region at the bottom of the frame m, and it can be seen that the coefficient of variation reflects the reliability of motion vector estimation.
- FIGS. 12A and 12B show examples of motion vector distribution images before and after removing an erroneous vector.
- FIG. 12A shows the same motion-vector field before erroneous-vector removal as FIG. 5B.
- In FIG. 12B, the coefficient-of-variation distribution is obtained from the SAD distribution by the processing of FIG. 10; with the median as the threshold, each ROI whose coefficient of variation exceeds it is judged to have low reliability, and its motion vector is set to 0 (stationary state).
- the lower region where the motion vector is disturbed is clearly removed and replaced with a stationary state.
- the lower region can be determined as a penetration region (that is, a region with low reliability) where an ultrasonic echo cannot be accurately obtained.
- Thereafter, the motion-vector distribution is converted into a scalar distribution in steps 26 and 27 of FIG. 2, and the distribution image (or the B-mode image) is synthesized and displayed.
- the motion vector in the region with low reliability is removed to make it stationary.
- the present invention is not limited to this processing method.
- For example, instead of setting the motion vector to a stationary state, a processing method that retains, as it is, the motion vector previously obtained for the same region can be used.
- FIG. 13 is a processing flow for removing noise during the calculation of the SAD distribution of the second embodiment.
- noise removal processing (steps 132 and 133) is added to the SAD calculation processing of steps 51 to 54 of FIG. 3 of the first embodiment.
- FIGS. 14A, 14B, and 14C show examples of the SAD distribution at each processing stage in FIG. 13.
- the processing unit 10 first performs the same processing as steps 51 to 53 in FIG. 3 of the first embodiment to obtain the SAD value distribution.
- An example of the obtained SAD distribution image is shown in FIG. 14A.
- Here, the ROI 31 is at position (4) in FIG. 5B; although the phantom 42 actually moves relatively in the horizontal direction, the SAD value of an upper movement candidate area 33 is lowered by noise, and if the motion vector were determined as is, erroneous vector detection would occur. To avoid such erroneous detection, the processing unit 10 applies a smoothing process (low-pass filter, LPF, processing) to the SAD distribution image obtained in steps 51 to 53 (step 132).
- a filter having a predetermined size is applied to the SAD distribution image, and the high frequency component of the SAD distribution in the filter is cut and smoothed. This process is repeated while moving the filter by a predetermined amount.
- By smoothing the SAD value distribution image, the change in the SAD value due to the movement of the phantom 42, which is steep, is removed, whereas the change in the SAD value due to noise, which is gradual, remains and can thus be extracted.
- The SAD value distribution image obtained by the smoothing process is shown in FIG. 14B.
- In step 133, the difference between the original SAD value distribution obtained in step 53 and the smoothed SAD value distribution obtained in step 132 is calculated.
- As a result, the SAD value distribution due to the movement of the phantom is obtained, with the fluctuation of the SAD value due to noise removed.
- The obtained distribution is shown in FIG. 14C.
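Steps 132 and 133 can be sketched as follows. This is an illustrative Python sketch under the assumption of a simple moving-average LPF; the names are hypothetical. The smoothed image retains the gradual, noise-induced component, so subtracting it from the original leaves the steep, motion-induced variation:

```python
import numpy as np

def denoise_sad(sad, k=5):
    """Sketch of steps 132-133: smooth the SAD distribution with a k x k
    moving-average LPF (the smoothed image keeps the gradual noise-induced
    component), then subtract the smoothed image from the original so that
    only the steep, motion-induced variation remains."""
    pad = k // 2
    padded = np.pad(sad, pad, mode="edge")   # replicate edges under the filter
    smooth = np.zeros_like(sad, dtype=float)
    for i in range(sad.shape[0]):
        for j in range(sad.shape[1]):
            smooth[i, j] = padded[i:i + k, j:j + k].mean()
    return sad - smooth, smooth
```

On such a difference distribution, the minimum marks the motion-induced dip even when a gradual noise trend dominates the raw SAD values.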
- Then, step 54 of FIG. 3 is performed: the movement candidate area 33 having the smallest SAD value is determined as the movement destination, and the motion vector is determined. After the motion vector is determined, a motion vector distribution image is generated by the process of step 56 in FIG. 3 of the first embodiment. Further, as in step 25 of FIG. 2 of the first embodiment, it is possible to additionally perform processing such as removing low-reliability vectors from the motion vector distribution.
- Since the motion vector can be determined from an SAD value distribution in which the SAD value fluctuation due to noise has been removed by the processing of the second embodiment, the reliability of the motion vector can be improved.
- In the above description, an LPF is used, but the present invention is not limited to this. When the spatial frequency of the distribution of SAD values due to the movement of the subject (phantom) is high (that is, the shape is more complicated), it is also effective to apply a band-pass filter.
- The size of one side of the filter used in the filtering of step 132 can be determined as follows: for the SAD distribution before smoothing, a motion vector field is created in advance by performing step 54 of FIG. 3, and the maximum vector length in that field is set as the size of one side of the filter.
- Next, as a third embodiment, a processing method will be described that directly determines a tissue boundary using the SAD value distribution obtained in step 53 of FIG. 3 of the first embodiment, and that determines the degree of infiltration of a living tumor into normal tissue.
- In the third embodiment, the movement candidate area 33 of the search area 32 is simply referred to as the area 33, and the ROI 31 is also referred to as the target pixel.
- A region along a tissue boundary of the subject appears with high brightness in the B-mode image, and the tissue similarity along the boundary is high. The SAD therefore has the characteristic that the region 33 lying along the tissue boundary takes a smaller value than the region 33 lying along the direction orthogonal to the boundary. On the other hand, as invasion by a living tumor progresses, the boundary becomes unclear, so the SAD value of the region 33 along the boundary increases. The degree of infiltration is determined by exploiting this.
- FIG. 15 shows a processing flow of the processing unit 10 of the third embodiment.
- FIGS. 16A to 16H show eight patterns of the target direction and, correspondingly, the regions 33 selected on the SAD value distribution image.
- First, the processing unit 10 sets a pixel of interest (ROI) 31 at the boundary position of the tissue to be investigated in a desired frame m of the B-mode image, sets a search area 32 in frame m + Δ, and obtains the SAD value distribution of the search area 32 (step 151).
- The frame selection and the SAD value distribution calculation are performed in the same manner as in steps 21 to 23 of FIG. 2 and steps 51 to 53 of FIG. 3.
- Next, an area 33 that passes through the center of the search area 32 and lies along a predetermined target direction (horizontal direction) 151 is selected (step 63), and the sum of the SAD values of the selected area 33 is obtained (step 64).
- Similarly, the region 33 lying along the direction (vertical direction) 152 orthogonal to the target direction 151 is selected, and the sum of the SAD values of the selected region 33 is likewise obtained.
- Steps 63 and 64 are repeated until all eight patterns shown in FIGS. 16A to 16H have been processed (step 62).
- Specifically, the sum of the SAD values of the region 33 lying along a predetermined target direction (a direction inclined about 30° counterclockwise with respect to the horizontal direction) 151 is obtained, and the sum of the SAD values of the region 33 lying along the orthogonal direction 152 is obtained. The same is done for target directions 151 of about 45°, about 60°, 90°, about 120°, about 135°, and about 150° counterclockwise with respect to the horizontal direction, and for their orthogonal directions 152.
- Next, among the calculated sums for the target directions 151, the target direction 151 that minimizes the sum of the SAD values is selected (step 65).
- The selected target direction 151 corresponds to the direction of the tissue boundary. The boundary can thus be detected directly without obtaining a motion vector.
- a direction 152 orthogonal to the selected target direction 151 is selected (step 66).
- Then, the ratio of the sum of the SAD values in the selected target direction 151 to the SAD sum in the direction 152 orthogonal to it (SAD value sum in the target direction / SAD value sum in the orthogonal direction) is calculated (step 67).
- This ratio indicates the degree of infiltration. When the degree of infiltration is low and the boundary is clear, the SAD sum in the boundary direction (the selected target direction 151) is small and the SAD sum in the orthogonal direction 152 is large, so a small value is obtained as the ratio. As the degree of infiltration increases and the boundary becomes unclear, the SAD sum in the boundary direction (the selected target direction 151) increases, so the ratio increases. Therefore, the degree of infiltration can be evaluated using the ratio as a parameter. Specifically, for example, the degree of infiltration is determined by comparing the ratio with a plurality of predetermined reference values, and the determination result is displayed.
- Furthermore, when the ratio is smaller than a predetermined value, the target pixel (ROI 31) can be identified as a point constituting a boundary line, and the boundary can be displayed.
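Steps 63 to 67 can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; only four of the eight directions are used for brevity, and all names are assumptions:

```python
import numpy as np

def infiltration_ratio(sad, length=5):
    """Sketch of steps 63-67: sum the SAD values along one-dimensional lines
    through the centre of the search region in several directions, pick the
    direction with the smallest sum (the boundary direction), and return
    (sum along that direction) / (sum along the orthogonal direction)."""
    c = sad.shape[0] // 2
    half = length // 2
    # direction vectors (dy, dx); the direction at index i+2 is orthogonal to i
    dirs = [(0, 1), (1, 1), (1, 0), (1, -1)]
    sums = []
    for dy, dx in dirs:
        s = sum(sad[c + k * dy, c + k * dx] for k in range(-half, half + 1))
        sums.append(s)
    i = int(np.argmin(sums))
    orth = (i + 2) % 4
    return sums[i] / sums[orth], dirs[i]
```

A small ratio then corresponds to a clear boundary along the returned direction, and a ratio approaching 1 to an unclear, infiltrated boundary.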
- In other words, the processing of the third embodiment acts as a direction-dependent filter, that is, a filter having the function of determining the direction in which the one-dimensional density change is smallest within the filter range (search region 32) of the pixel being processed.
- For ease of illustration, FIGS. 16A to 16H show region selection patterns in the target direction 151 and the orthogonal direction 152 for a search region 32 composed of 5 × 5 regions 33. In practice, the region selection patterns are set according to the number of regions 33 in the search region 32.
- FIGS. 17A and 17B show the SAD distributions of the search region 32 obtained by setting the ROI 31 at positions (1) and (2) in FIG. 5B, respectively.
- the position (1) is a position inside the phantom 41 that is relatively stationary with respect to the probe 1.
- the position (2) is located in the vicinity of the boundary between the phantom 41 and the phantom 42 moving laterally relative thereto.
- the processing for obtaining the SAD distribution at the positions (1) and (2) is performed in the same manner as steps 21 to 23 in FIG. 2 and steps 51 to 53 in FIG.
- When a Laplacian filter, which performs spatial second-order differentiation, is applied to the obtained SAD distribution images (FIGS. 17A and 17B), the portions where the fluctuation of the SAD value is large are strongly emphasized by the edge-enhancement effect, and the images of FIGS. 17C and 17D are obtained, respectively.
- Thus, the boundary can be extracted directly by applying Laplacian processing to the SAD distribution. Accordingly, steps 54 and 55 of FIG. 3, in which the motion vector of the first embodiment is determined and imaged, and steps 25 and 26 of FIG. 2, in which the motion vector is denoised and converted into a scalar distribution to estimate the boundary, can be omitted, so the amount of processing can be greatly reduced.
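The Laplacian processing of this embodiment might be sketched as follows. This is an illustrative Python sketch; the standard 4-neighbour Laplacian kernel is an assumption, as the document does not specify the kernel:

```python
import numpy as np

def laplacian(sad):
    """Sketch of the fourth embodiment: apply a 4-neighbour Laplacian
    (spatial second-order difference) to the SAD distribution so that
    positions where the SAD value changes sharply -- i.e. boundaries --
    are emphasised without computing any motion vector."""
    out = np.zeros_like(sad, dtype=float)
    out[1:-1, 1:-1] = (sad[:-2, 1:-1] + sad[2:, 1:-1]      # up + down
                       + sad[1:-1, :-2] + sad[1:-1, 2:]    # left + right
                       - 4.0 * sad[1:-1, 1:-1])
    return out
```

On a flat SAD region the output is zero, while a step in the SAD values produces a paired positive/negative response straddling the boundary.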
- The present invention can be applied to medical ultrasonic diagnostic and treatment apparatuses, and to apparatuses in general that measure distortion and displacement using waves, including ultrasonic waves.
- 1: Ultrasonic probe (probe), 2: User interface, 3: Transmission beamformer, 4: Control system, 5: Transmission/reception changeover switch, 6: Reception beamformer, 7: Envelope detector, 8: Scan converter, 10: Processing unit, 10a: CPU, 10b: Memory, 11: Parameter setting unit, 12: Synthesis unit, 13: Display unit.
Abstract
Description
(Embodiment 1)
An ultrasonic imaging apparatus according to an embodiment of the present invention will be described below. FIG. 1 shows the system configuration of the ultrasonic imaging apparatus of the present embodiment. This apparatus has an ultrasonic boundary detection function. As shown in FIG. 1, the apparatus comprises an ultrasonic probe (probe) 1, a user interface 2, a transmission beamformer 3, a control system 4, a transmission/reception changeover switch 5, a reception beamformer 6, an envelope detector 7, a scan converter 8, a processing unit 10, a parameter setting unit 11, a synthesis unit 12, and a display unit 13.
(Embodiment 2)
In the first embodiment, after the SAD distribution is obtained, whether the ROI 31 lies in a low-SNR region is determined from the SAD distribution, and if so, the motion vector is removed or otherwise suppressed. In the present embodiment, by contrast, noise is removed while the SAD distribution is being calculated, and a highly reliable motion vector is obtained using the resulting SAD distribution, which has high detection sensitivity after noise removal.
(Embodiment 3)
Next, as a third embodiment, a processing method will be described that directly determines a tissue boundary using the SAD value distribution obtained in step 53 of FIG. 3 of the first embodiment, and that determines the degree of infiltration of a living tumor into normal tissue. In the third embodiment, the movement candidate area 33 of the search area 32 is simply referred to as the area 33, and the ROI 31 is also referred to as the target pixel.
(Embodiment 4)
As a fourth embodiment, another method will be described for directly detecting a boundary from the SAD value distribution of the search region 32 without obtaining a motion vector. Here, a Laplacian filter that performs enhancement processing corresponding to second-order differentiation is applied.
Claims (14)
- 1. An ultrasonic imaging apparatus comprising: a transmission unit that transmits ultrasonic waves toward a target; a reception unit that receives ultrasonic waves arriving from the target; and a processing unit that processes the reception signal of the reception unit to generate images of two or more frames, wherein the processing unit takes one of the generated frames as a reference frame and sets a region of interest therein at a predetermined position or at a position received from an operator, takes another frame as a comparison frame and sets therein, at a predetermined position or at a position received from an operator, a search region wider than the region of interest, sets within the search region a plurality of candidate regions that are candidates for the movement destination of the region of interest, and calculates, for each candidate region, the similarity between the image characteristic values in the region of interest and those in the candidate region, thereby obtaining a distribution of the similarity over the entire search region.
- 2. The ultrasonic imaging apparatus according to claim 1, wherein the processing unit obtains a statistic that compares the minimum value of the similarity with the similarity values as a whole in the similarity distribution, and determines the reliability of the region of interest from the statistic.
- 3. The ultrasonic imaging apparatus according to claim 2, wherein the processing unit obtains the statistic using the minimum value, average value, and standard deviation of the similarity, and determines the reliability of the region of interest by comparing the obtained statistic with a threshold value.
- 4. The ultrasonic imaging apparatus according to claim 3, wherein the processing unit generates a vector connecting the position corresponding to the region of interest in the comparison frame with the position of the candidate region having the minimum similarity, and, for a region of interest determined to have low reliability, replaces the vector with zero or with a predetermined vector.
- 5. The ultrasonic imaging apparatus according to claim 2, wherein the processing unit calculates the average, minimum value, and standard deviation of the similarity for the similarity distribution, and obtains as the statistic a degree of separation given by dividing the difference between the average and the minimum value by the standard deviation.
- 6. The ultrasonic imaging apparatus according to claim 2, wherein the processing unit calculates the average and standard deviation of the similarity for the similarity distribution, and obtains as the statistic a coefficient of variation given by dividing the standard deviation by the average.
- 7. The ultrasonic imaging apparatus according to claim 3, wherein the processing unit sets a plurality of regions of interest, obtains the statistic for each of them, and obtains a histogram distribution indicating the frequency of the obtained statistic values, and uses as the threshold the median or average value of the histogram distribution or, when the histogram distribution exhibits a plurality of peak shapes, the statistic at the minimum of the valley between the peaks.
- 8. The ultrasonic imaging apparatus according to claim 1, wherein the processing unit generates a smoothed similarity distribution by smoothing the similarity distribution, and obtains a difference similarity distribution by subtracting the smoothed similarity distribution from the similarity distribution before the smoothing process.
- 9. The ultrasonic imaging apparatus according to claim 8, wherein the smoothing process sets a filter of a predetermined size on the similarity distribution and repeats, while moving the filter by a predetermined amount, a process of smoothing the distribution within the filter, and wherein the processing unit generates, for a plurality of regions of interest, vectors connecting the position corresponding to the region of interest in the comparison frame with the position of the candidate region having the minimum similarity in the similarity distribution before smoothing, and sets the maximum vector length among the generated vectors as the size of the filter.
- 10. The ultrasonic imaging apparatus according to claim 1, wherein the processing unit creates a contour-enhanced distribution by filtering the similarity distribution with a Laplacian filter, and obtains a boundary of the target by extracting continuous contour lines in the contour-enhanced distribution.
- 11. The ultrasonic imaging apparatus according to claim 1, wherein the processing unit comprises: first processing means for obtaining the similarity distribution for a region of interest set in the vicinity of a living-tumor boundary, generating a similarity distribution image having the similarity as an image characteristic value, and setting one-dimensional regions of a predetermined length in a plurality of different directions around the position corresponding to the region of interest on the similarity distribution image; second processing means for calculating, for each set direction, the sum of the similarities within the one-dimensional region; third processing means for calculating the ratio between the similarity sum in the direction in which the similarity sum is minimum and the similarity sum of the one-dimensional region in the direction orthogonal thereto; and fourth processing means for determining the degree of tumor infiltration based on the ratio.
- 12. The ultrasonic imaging apparatus according to claim 1, wherein, when the ratio calculated by the third processing means is smaller than a predetermined constant value, the processing unit determines that the region of interest is a point constituting a boundary line.
- 13. An ultrasonic imaging method comprising: transmitting ultrasonic waves toward a target and processing reception signals obtained by receiving ultrasonic waves arriving from the target to generate images of two or more frames; selecting a reference frame and a comparison frame from the images; setting a region of interest in the reference frame; setting, in the comparison frame, a search region wider than the region of interest and setting within the search region a plurality of candidate regions that are candidates for the movement destination of the region of interest; and calculating, for each candidate region, the similarity between the image characteristic values in the region of interest and those in the candidate region, thereby obtaining a distribution of the similarity over the entire search region.
- 14. A program for ultrasonic imaging that causes a computer to execute: a first step of selecting a reference frame and a comparison frame from ultrasonic images of two or more frames; a second step of setting a region of interest in the reference frame; a third step of setting, in the comparison frame, a search region wider than the region of interest and setting within the search region a plurality of candidate regions that are candidates for the movement destination of the region of interest; and a fourth step of calculating, for each candidate region, the similarity between the image characteristic values in the region of interest and those in the candidate region, and obtaining a distribution of the similarity over the entire search region.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080046798.3A CN102596050B (en) | 2009-10-27 | 2010-10-26 | Ultrasonic imaging device and ultrasonic imaging method |
EP10826734A EP2494924A1 (en) | 2009-10-27 | 2010-10-26 | Ultrasonic imaging device, ultrasonic imaging method and program for ultrasonic imaging |
US13/503,858 US8867813B2 (en) | 2009-10-27 | 2010-10-26 | Ultrasonic imaging device, ultrasonic imaging method and program for ultrasonic imaging |
JP2011538440A JP5587332B2 (en) | 2009-10-27 | 2010-10-26 | Ultrasonic imaging apparatus and program for ultrasonic imaging |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-246734 | 2009-10-27 | ||
JP2009246734 | 2009-10-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011052602A1 true WO2011052602A1 (en) | 2011-05-05 |
Family
ID=43922027
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/068988 WO2011052602A1 (en) | 2009-10-27 | 2010-10-26 | Ultrasonic imaging device, ultrasonic imaging method and program for ultrasonic imaging |
Country Status (5)
Country | Link |
---|---|
US (1) | US8867813B2 (en) |
EP (1) | EP2494924A1 (en) |
JP (1) | JP5587332B2 (en) |
CN (1) | CN102596050B (en) |
WO (1) | WO2011052602A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013061664A1 (en) * | 2011-10-28 | 2013-05-02 | 日立アロカメディカル株式会社 | Ultrasound imaging apparatus, ultrasound imaging method and ultrasound imaging program |
US20130170721A1 (en) * | 2011-12-29 | 2013-07-04 | Samsung Electronics Co., Ltd. | Method and apparatus for processing ultrasound image |
CN103536305A (en) * | 2012-07-11 | 2014-01-29 | 通用电气公司 | Systems and methods for performing image type recognition |
KR20140063440A (en) * | 2012-11-15 | 2014-05-27 | 톰슨 라이센싱 | Method for superpixel life cycle management |
WO2014080833A1 (en) * | 2012-11-21 | 2014-05-30 | 株式会社東芝 | Ultrasonic diagnostic device, image processing device, and image processing method |
WO2014103501A1 (en) * | 2012-12-28 | 2014-07-03 | 興和株式会社 | Image processing device, image processing method, image processing program, and recording medium storing said program |
JP2018000261A (en) * | 2016-06-27 | 2018-01-11 | 株式会社日立製作所 | Ultrasonic imaging apparatus, ultrasonic imaging method, and coupling state evaluation apparatus |
KR101819028B1 (en) * | 2011-07-11 | 2018-01-16 | 삼성전자주식회사 | Method and apparatus for processing a ultrasound image |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103093418B (en) * | 2013-02-21 | 2015-08-26 | 深圳市晶日盛科技有限公司 | A kind of digital image scaling method of improvement |
WO2014155272A1 (en) * | 2013-03-28 | 2014-10-02 | Koninklijke Philips N.V. | Real-time quality control for acquisition of 3d ultrasound images |
JP5651258B1 (en) * | 2014-02-27 | 2015-01-07 | 日立アロカメディカル株式会社 | Ultrasonic diagnostic apparatus and program |
US9686470B2 (en) * | 2014-05-30 | 2017-06-20 | Apple Inc. | Scene stability detection |
WO2017211757A1 (en) * | 2016-06-10 | 2017-12-14 | Koninklijke Philips N.V. | Using reflected shear waves for monitoring lesion growth in thermal ablations |
CN108225496B (en) * | 2016-12-15 | 2020-02-21 | 重庆川仪自动化股份有限公司 | Radar level meter echo signal automatic testing device, method and system |
US11611773B2 (en) * | 2018-04-06 | 2023-03-21 | Qatar Foundation For Education, Science And Community Development | System of video steganalysis and a method for the detection of covert communications |
KR102661955B1 (en) * | 2018-12-12 | 2024-04-29 | 삼성전자주식회사 | Method and apparatus of processing image |
CN109875607A (en) * | 2019-01-29 | 2019-06-14 | 中国科学院苏州生物医学工程技术研究所 | Infiltrate tissue testing method, apparatus and system |
JP6579727B1 (en) * | 2019-02-04 | 2019-09-25 | 株式会社Qoncept | Moving object detection device, moving object detection method, and moving object detection program |
CN113556979A (en) * | 2019-03-19 | 2021-10-26 | 奥林巴斯株式会社 | Ultrasonic observation device, method for operating ultrasonic observation device, and program for operating ultrasonic observation device |
CN116823829B (en) * | 2023-08-29 | 2024-01-09 | 深圳微创心算子医疗科技有限公司 | Medical image analysis method, medical image analysis device, computer equipment and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002222410A (en) * | 2001-01-25 | 2002-08-09 | Hitachi Medical Corp | Image diagnostic device |
JP2004057275A (en) * | 2002-07-25 | 2004-02-26 | Hitachi Medical Corp | Image diagnostic device |
JP2004121834A (en) * | 2002-09-12 | 2004-04-22 | Hitachi Medical Corp | Movement tracing method for biological tissue, image diagnosing system using the tracing method, and movement tracing program for biological tissue |
JP2004129773A (en) * | 2002-10-09 | 2004-04-30 | Hitachi Medical Corp | Ultrasonic imaging device and ultrasonic signal processing method |
JP2004135929A (en) | 2002-10-18 | 2004-05-13 | Hitachi Medical Corp | Ultrasonic diagnostic apparatus |
JP2004351050A (en) * | 2003-05-30 | 2004-12-16 | Hitachi Medical Corp | Method of tracking movement of organic tissue in diagnostic image and image diagnostic apparatus using the method |
JP2008079792A (en) | 2006-09-27 | 2008-04-10 | Hitachi Ltd | Ultrasonic diagnostic apparatus |
WO2010052868A1 (en) * | 2008-11-10 | 2010-05-14 | 株式会社日立メディコ | Ultrasonic image processing method and device, and ultrasonic image processing program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5002181B2 (en) * | 2006-03-31 | 2012-08-15 | 株式会社東芝 | Ultrasonic diagnostic apparatus and ultrasonic diagnostic apparatus control method |
-
2010
- 2010-10-26 CN CN201080046798.3A patent/CN102596050B/en not_active Expired - Fee Related
- 2010-10-26 JP JP2011538440A patent/JP5587332B2/en not_active Expired - Fee Related
- 2010-10-26 EP EP10826734A patent/EP2494924A1/en not_active Withdrawn
- 2010-10-26 WO PCT/JP2010/068988 patent/WO2011052602A1/en active Application Filing
- 2010-10-26 US US13/503,858 patent/US8867813B2/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002222410A (en) * | 2001-01-25 | 2002-08-09 | Hitachi Medical Corp | Image diagnostic device |
JP2004057275A (en) * | 2002-07-25 | 2004-02-26 | Hitachi Medical Corp | Image diagnostic device |
JP2004121834A (en) * | 2002-09-12 | 2004-04-22 | Hitachi Medical Corp | Movement tracing method for biological tissue, image diagnosing system using the tracing method, and movement tracing program for biological tissue |
JP2004129773A (en) * | 2002-10-09 | 2004-04-30 | Hitachi Medical Corp | Ultrasonic imaging device and ultrasonic signal processing method |
JP2004135929A (en) | 2002-10-18 | 2004-05-13 | Hitachi Medical Corp | Ultrasonic diagnostic apparatus |
JP2004351050A (en) * | 2003-05-30 | 2004-12-16 | Hitachi Medical Corp | Method of tracking movement of organic tissue in diagnostic image and image diagnostic apparatus using the method |
JP2008079792A (en) | 2006-09-27 | 2008-04-10 | Hitachi Ltd | Ultrasonic diagnostic apparatus |
WO2010052868A1 (en) * | 2008-11-10 | 2010-05-14 | 株式会社日立メディコ | Ultrasonic image processing method and device, and ultrasonic image processing program |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101819028B1 (en) * | 2011-07-11 | 2018-01-16 | 삼성전자주식회사 | Method and apparatus for processing a ultrasound image |
WO2013061664A1 (en) * | 2011-10-28 | 2013-05-02 | 日立アロカメディカル株式会社 | Ultrasound imaging apparatus, ultrasound imaging method and ultrasound imaging program |
JPWO2013061664A1 (en) * | 2011-10-28 | 2015-04-02 | 日立アロカメディカル株式会社 | Ultrasonic imaging apparatus, ultrasonic imaging method, and ultrasonic imaging program |
CN103906473A (en) * | 2011-10-28 | 2014-07-02 | 日立阿洛卡医疗株式会社 | Ultrasound imaging apparatus, ultrasound imaging method and ultrasound imaging program |
US20130170721A1 (en) * | 2011-12-29 | 2013-07-04 | Samsung Electronics Co., Ltd. | Method and apparatus for processing ultrasound image |
KR20130077406A (en) * | 2011-12-29 | 2013-07-09 | 삼성전자주식회사 | Apparatus and method for processing ultrasound image |
KR101881924B1 (en) * | 2011-12-29 | 2018-08-27 | 삼성전자주식회사 | Apparatus and Method for processing ultrasound image |
US9129186B2 (en) * | 2011-12-29 | 2015-09-08 | Samsung Electronics Co., Ltd. | Method and apparatus for processing ultrasound image |
CN103536305A (en) * | 2012-07-11 | 2014-01-29 | 通用电气公司 | Systems and methods for performing image type recognition |
JP2014099178A (en) * | 2012-11-15 | 2014-05-29 | Thomson Licensing | Method for superpixel life cycle management |
KR102128121B1 (en) * | 2012-11-15 | 2020-06-29 | 인터디지털 브이씨 홀딩스 인코포레이티드 | Method for superpixel life cycle management |
KR20140063440A (en) * | 2012-11-15 | 2014-05-27 | 톰슨 라이센싱 | Method for superpixel life cycle management |
WO2014080833A1 (en) * | 2012-11-21 | 2014-05-30 | 株式会社東芝 | Ultrasonic diagnostic device, image processing device, and image processing method |
US10376236B2 (en) | 2012-11-21 | 2019-08-13 | Canon Medical Systems Corporation | Ultrasound diagnostic apparatus, image processing apparatus, and image processing method |
WO2014103501A1 (en) * | 2012-12-28 | 2014-07-03 | 興和株式会社 | Image processing device, image processing method, image processing program, and recording medium storing said program |
JPWO2014103501A1 (en) * | 2012-12-28 | 2017-01-12 | 興和株式会社 | Image processing apparatus, image processing method, image processing program, and recording medium storing the program |
JP2018000261A (en) * | 2016-06-27 | 2018-01-11 | 株式会社日立製作所 | Ultrasonic imaging apparatus, ultrasonic imaging method, and coupling state evaluation apparatus |
Also Published As
Publication number | Publication date |
---|---|
JPWO2011052602A1 (en) | 2013-03-21 |
EP2494924A1 (en) | 2012-09-05 |
US20120224759A1 (en) | 2012-09-06 |
JP5587332B2 (en) | 2014-09-10 |
CN102596050A (en) | 2012-07-18 |
CN102596050B (en) | 2014-08-13 |
US8867813B2 (en) | 2014-10-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5587332B2 (en) | Ultrasonic imaging apparatus and program for ultrasonic imaging | |
JP5498299B2 (en) | System and method for providing 2D CT images corresponding to 2D ultrasound images | |
JP5579527B2 (en) | System and method for providing 2D CT images corresponding to 2D ultrasound images | |
JP6935020B2 (en) | Systems and methods for identifying features of ultrasound images | |
US7864998B2 (en) | Apparatus and method for processing an ultrasound spectrum image | |
US11013495B2 (en) | Method and apparatus for registering medical images | |
US10136875B2 (en) | Ultrasonic diagnostic apparatus and ultrasonic diagnostic method | |
JP5813779B2 (en) | Ultrasonic imaging apparatus, ultrasonic imaging method, and ultrasonic imaging program | |
JP2008079792A (en) | Ultrasonic diagnostic apparatus | |
CN108209966B (en) | Parameter adjusting method and device of ultrasonic imaging equipment | |
JP2011152416A (en) | Ultrasonic system and method for processing three-dimensional ultrasonic screen image and measuring size of concerned object | |
JP2005193017A (en) | Method and system for classifying diseased part of mamma | |
US20240050062A1 (en) | Analyzing apparatus and analyzing method | |
US9110156B2 (en) | Apparatus and system for measuring velocity of ultrasound signal | |
JP6385702B2 (en) | Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing program | |
KR101627319B1 (en) | medical image processor and method thereof for medical diagnosis | |
JP6731369B2 (en) | Ultrasonic diagnostic device and program | |
JP4634872B2 (en) | Ultrasonic diagnostic equipment | |
JP2005318921A (en) | Ultrasonic diagnostic equipment | |
JP2004242836A (en) | Ultrasonic diagnostic apparatus and method for image processing in ultrasonic diagnostic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201080046798.3 Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10826734 Country of ref document: EP Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2011538440 Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 13503858 Country of ref document: US Ref document number: 2010826734 Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |