US9008387B2 - Method and apparatus for processing ultrasound images - Google Patents

Method and apparatus for processing ultrasound images

Info

Publication number
US9008387B2
US9008387B2 (application US13/544,577)
Authority
US
United States
Prior art keywords
frames
key
frame
roi
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US13/544,577
Other languages
English (en)
Other versions
US20130182926A1 (en)
Inventor
Jae-keun Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Medison Co Ltd
Original Assignee
Samsung Medison Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Medison Co Ltd filed Critical Samsung Medison Co Ltd
Assigned to SAMSUNG MEDISON CO., LTD. reassignment SAMSUNG MEDISON CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LEE, JAE-KEUN
Publication of US20130182926A1
Application granted
Publication of US9008387B2


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • G06T5/008
    • G06K9/00671
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • G06K2209/01
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S128/00Surgery
    • Y10S128/92Computer assisted medical diagnostics
    • Y10S128/922Computer assisted medical diagnostics including image analysis

Definitions

  • the present invention relates to a method and apparatus for processing ultrasound images, and more particularly, to a method and apparatus for reconstructing color image data by compensating for the color image data.
  • An ultrasound diagnosis apparatus delivers an ultrasound signal (in general, equal to or greater than 20 kHz) to a predetermined internal part of a target object by using a probe, and obtains an image of the internal part of the target object by using information of a reflected echo signal.
  • the ultrasound diagnosis apparatus is used for medical purposes, including detection of foreign materials in the target object and measurement and observation of damage.
  • because the ultrasound diagnosis apparatus is stable and non-invasive and displays images in real time, it is widely used as an image diagnosis apparatus.
  • the image of the target object, which is obtained via the probe, undergoes a process such as rendering and is then displayed.
  • the present invention provides an ultrasound image processing apparatus and a method thereof for improving an image quality of color image data by performing image compensation on a plurality of frames before rendering is performed on the color image data.
  • the present invention also provides a computer-readable recording medium having recorded thereon a program for executing the method.
  • a method of processing ultrasound images including operations of receiving color image data comprising a plurality of frames; determining one or more key-frames from among the plurality of frames, based on a brightness value of each of the plurality of frames; setting a region of interest (ROI) in each of the one or more key-frames based on brightness values of a plurality of regions comprised in each of the one or more key-frames; performing image compensation on the plurality of frames based on the ROIs of the one or more key-frames; and reconstructing the color image data by using the image-compensated frames.
  • the operation of determining the one or more key-frames may include an operation of dividing the plurality of frames into N frame sections and determining the key-frame in each of the N frame sections, wherein the key-frame has the largest sum of brightness values of a plurality of pixels included in each of one or more frames included in each of the N frame sections.
  • the operation of setting the ROI may be performed based on a brightness value in a first axis direction and a brightness value in a second axis direction of each of the one or more key-frames.
  • the operation of setting the ROI may include an operation of setting a region as the ROI, wherein, among the plurality of regions comprised in each of the one or more key-frames, the region has a size equal to or greater than a first threshold value.
  • the operation of setting the ROI may further include an operation of determining a central point in each of the one or more key-frames, wherein the central point has the largest brightness value in first and second axes directions, and the operation of performing the image compensation may include an operation of determining differences between locations in the first and second axes directions of central points of two consecutive key-frames, and determining central points of frames between the two consecutive key-frames based on the differences.
  • the operation of performing the image compensation may include an operation of compensating for brightness values of ROIs of frames between two consecutive key-frames from among the plurality of frames, by using a persistence method, and the persistence method may be performed by summing a received data value for a specific frame and a data value of at least one frame that is received before the specific frame, according to a persistence rate, and then by determining a sum value as a data value of the specific frame.
  • the persistence rate may be randomly set by a user or may be automatically decided according to a period of the determined key-frames.
  • the operation of performing the image compensation may include an operation of compensating for brightness values of ROIs of frames between two consecutive key-frames from among the plurality of frames, by using a frame interpolation method, and the frame interpolation method may be performed by determining data values of one or more frames between two consecutive specific frames according to data values of the two consecutive specific frames.
  • an ultrasound image processing apparatus including a key-frame determining unit for receiving color image data comprising a plurality of frames, and determining one or more key-frames from among the plurality of frames, based on a brightness value of each of the plurality of frames; a region of interest (ROI) setting unit for setting an ROI in each of the one or more key-frames based on brightness values of a plurality of regions comprised in each of the one or more key-frames; and an image compensating unit for performing image compensation on the plurality of frames based on the ROIs of the one or more key-frames, and reconstructing the color image data by using the image-compensated frames.
  • a computer-readable recording medium having recorded thereon a program for executing the method.
  • FIG. 1 is a block diagram illustrating an ultrasound color image processing procedure according to an embodiment of the present invention;
  • FIG. 2 is a diagram illustrating a structure of an ultrasound image processing apparatus, according to an embodiment of the present invention;
  • FIG. 3 is a flowchart of a method of processing ultrasound images, according to an embodiment of the present invention;
  • FIG. 4 is a graph illustrating a brightness value of each frame of color image data;
  • FIG. 5 illustrates an example in which a region of interest (ROI) is set in a key-frame;
  • FIG. 6 is a graph illustrating a brightness value of each frame after brightness values of the graph of FIG. 4 are compensated for;
  • FIG. 7 illustrates an example of color image data; and
  • FIG. 8 illustrates data obtained by compensating for brightness values and locations of the data of FIG. 7 .
  • FIG. 1 is a diagram illustrating a procedure of processing an ultrasound signal received via a probe 10 .
  • a beamformer 11 focuses an emitted beam by adjusting a delay time of the ultrasound signal emitted from the probe 10 , and adjusts focusing of a received echo signal.
  • the received ultrasound signal is amplified and then is processed as a Doppler signal via an In-phase/Quadrature (I/Q) demodulation unit 12 .
  • the Doppler signal is obtained in the form of color image data 13 and passes through a pre-processing unit 14 . Then, the color image data 13 undergoes a rendering process via a rendering unit 15 and is displayed on a display unit 16 .
  • the pre-processing unit 14 is so named because its processing is performed on the color image data 13 before the rendering operation is performed thereon. Hereinafter, an apparatus and method for efficiently improving connectivity between frames of a color image during this pre-processing will be described.
  • FIG. 2 is a diagram illustrating a structure of an ultrasound image processing apparatus 100 according to an embodiment of the present invention.
  • the ultrasound image processing apparatus 100 may include a key-frame determining unit 110 , a region of interest (ROI) setting unit 120 , and an image compensating unit 130 .
  • the ultrasound image processing apparatus 100 of FIG. 2 includes only the constituent elements related to one or more embodiments of the present invention. Thus, it will be obvious to one of ordinary skill in the art that general-purpose elements other than those shown in FIG. 2 may be further included.
  • the ultrasound image processing apparatus 100 improves connectivity between frames of an image by processing the color image data 13 received via the probe 10 .
  • the ultrasound image processing apparatus 100 may be included in an ultrasound diagnosis apparatus (not shown) that diagnoses a target object by using an ultrasound signal and then displays a diagnosis result.
  • the key-frame determining unit 110 receives the color image data 13 including a plurality of frames, and determines at least one key-frame from among the plurality of frames. In the determination of the at least one key-frame, the key-frame determining unit 110 may determine the at least one key-frame based on a brightness value of each frame.
  • the term ‘brightness value’ means a power of a signal of a plurality of pixels included in each frame.
  • a brightness value in a black-and-white image means a gray level component between 0 and 255, indicating the brightness of each pixel.
  • pixels included in a frame of a color image may be numerically expressed by digitizing velocity of a measured target object and power of a measured signal in the form of RGB.
  • a power component of the measured signal corresponds to the brightness value in the black-and-white image. Accordingly, hereinafter, it is assumed that the term ‘brightness value’ of a color image means a power component of a measured signal indicated by pixels.
  • the key-frame determining unit 110 may divide the plurality of frames into N frame sections (where N is a natural number) and may determine a key-frame in every frame section. Also, the key-frame determining unit 110 may determine as a key-frame a frame that has the largest sum of brightness values of a plurality of pixels included in each of one or more frames included in each of the N frame sections.
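  • The section-wise key-frame selection described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the function name find_key_frames and the synthetic data are assumptions.

```python
import numpy as np

def find_key_frames(frames, n_sections):
    """Pick one key-frame per section: the frame whose summed pixel
    brightness (signal-power value) is largest within that section."""
    key_indices = []
    # Split the frame indices into n_sections roughly equal sections.
    for section in np.array_split(np.arange(len(frames)), n_sections):
        sums = [frames[i].sum() for i in section]
        key_indices.append(int(section[int(np.argmax(sums))]))
    return key_indices

# Synthetic data: 40 frames of 8x8 "power" values, with a periodic peak
# mimicking the heartbeat-driven brightness variation described above.
rng = np.random.default_rng(0)
frames = [rng.random((8, 8)) + (3.0 if i % 10 == 5 else 0.0)
          for i in range(40)]
print(find_key_frames(frames, 4))  # -> [5, 15, 25, 35]
```

With four sections of ten frames each, the boosted frame in every section is selected, so the key-frames recur with a fixed period, as in FIG. 4.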
  • the ROI setting unit 120 sets an ROI in every key-frame determined by the key-frame determining unit 110 , based on a brightness value in first and second axes directions.
  • the first axis and the second axis may be an X-axis and a Y-axis of a key-frame, respectively, and any two axes that meet at right angles may be the first axis and the second axis.
  • the ROI setting unit 120 may exclude a region from the ROI, wherein the region has a size equal to or less than a threshold value.
  • the threshold value may be randomly set by a user or may be determined according to a type of color image data.
  • the ROI setting unit 120 may set a central point 530 at which a brightness value of a key-frame is maximal. A process of determining the ROI will be described in detail with reference to FIGS. 3 and 5 .
  • the image compensating unit 130 performs image compensation on the plurality of frames based on ROIs of the key-frames, and reconstructs the color image data 13 by using the image-compensated frames.
  • when the image compensating unit 130 performs the image compensation on the plurality of frames, it may determine the differences between the locations, in the first and second axes directions, of the central points 530 of two consecutive key-frames, and may determine the central points of the frames between the two consecutive key-frames based on the differences.
  • the image compensating unit 130 may compensate for a brightness value of ROIs by using a persistence method, wherein the ROIs are of frames between two consecutive key-frames from among a plurality of frames.
  • alternatively, the image compensating unit 130 may use a frame interpolation method instead of the persistence method. A process of compensating for an image will be described in detail with reference to FIGS. 3 and 6 .
  • Reconstructed color image data 140 that is reconstructed via the image compensating unit 130 may pass through the rendering unit 15 and then may be displayed on the display unit 16 .
  • the ultrasound image processing apparatus 100 may further include a storage unit (not shown) for storing the color image data 13 .
  • the storage unit may store the color image data 13 to which key-frame determination has not yet been performed by the key-frame determining unit 110 .
  • the storage unit may store the reconstructed color image data 140 that is obtained after the image compensating unit 130 compensates for the color image data 13 .
  • the ultrasound image processing apparatus 100 may improve connectivity between frames by compensating for the color image data 13 .
  • accordingly, a realistic color image may be provided, which may help a user efficiently diagnose the target object.
  • FIG. 3 is a flowchart of a method of processing ultrasound images, according to an embodiment of the present invention.
  • the key-frame determining unit 110 receives the color image data 13 including a plurality of frames.
  • Each of the plurality of frames may be a two-dimensional (2D) image frame or a three-dimensional (3D) image frame.
  • the plurality of frames may correspond to B-mode image frames that are obtained by scanning a target object via the probe 10 .
  • the plurality of frames may correspond to A/B/C plane image frames, respectively, that are obtained by dividing volume data according to three sections that cross each other. Because the B-mode image frames or the A/B/C plane image frames are a well-known technology to one of ordinary skill in the art, detailed descriptions thereof are omitted here.
  • the key-frame determining unit 110 may receive the color image data 13 from the storage unit.
  • the key-frame determining unit 110 determines at least one key-frame from among the plurality of frames, based on a brightness value of each frame.
  • Each of the plurality of frames included in the color image data 13 includes a plurality of pixels. Accordingly, the key-frame determining unit 110 may obtain a brightness value of a frame by summing all brightness values of a plurality of pixels included in the frame.
  • a type of the probe 10 and a characteristic of the color image data 13 may vary according to a tissue or a part to be scanned.
  • a diagnosis of vascular diseases by scanning blood vessels is referred to as a vascular application.
  • the color image data 13 includes frames having large brightness values and frames having small brightness values, due to the heartbeat period. Because of this brightness difference between frames, connectivity in the color image data 13 needs to be improved.
  • the key-frame determining unit 110 may determine one or more frames having larger brightness values, compared to brightness values of other frames of a plurality of frames.
  • the one or more frames having the larger brightness values correspond to key-frames.
  • a horizontal axis of a frame graph 400 indicates numbers of a plurality of frames included in the color image data 13
  • a vertical axis of the frame graph 400 indicates sum values, each obtained by summing brightness values of a plurality of pixels included in each of the plurality of frames.
  • the sum value of the brightness values of each of a plurality of frames 400 a through 400 e is relatively greater than that of the other frames; thus, the frames 400 a through 400 e may become key-frames.
  • the frames 400 a through 400 e correspond to the 6th, 27th, 47th, 68th, and 87th frames, respectively, and the key-frames occur according to a predetermined period.
  • the key-frame determining unit 110 may obtain one key-frame for about every 20 frames.
  • the key-frame determining unit 110 may divide the plurality of frames included in the color image data 13 into N frame sections (where N is a natural number) and may determine a key-frame in every frame section. Accordingly, the key-frame determining unit 110 may obtain one or more key-frames having a predetermined period.
  • when the sum of the brightness values of a frame is less than a first threshold value 410 , the key-frame determining unit 110 may determine that the frame is not a key-frame.
  • the first threshold value 410 may be directly set by a user or may be automatically decided based on a part to be diagnosed.
  • a process, performed by the key-frame determining unit 110 , of determining a key-frame from among a plurality of frames may be performed by using many methods other than the method of using the sum value of the brightness values of each frame.
  • the key-frame determining unit 110 may determine three consecutive frames having large brightness values as candidates for a key-frame and may determine as the key-frame a frame that has the largest brightness value from among the three consecutive frames.
  • a key-frame from among a plurality of frames may be determined by using many other methods other than the aforementioned methods.
  • the ROI setting unit 120 sets an ROI 510 for every key-frame, based on brightness values of a plurality of regions included in every key-frame.
  • the key-frame may include the plurality of regions, and the plurality of regions may have different brightness values. From among the plurality of regions, a region that is directly related to a diagnosis target corresponds to the ROI 510 .
  • the ROI 510 may have a larger brightness value, compared to other regions of the plurality of regions.
  • when the ROI setting unit 120 sets an ROI, it may make a profile of the brightness values in the first and second axes directions.
  • the first axis and the second axis may be an X-axis and a Y-axis of the key-frame, respectively, and any two axes that meet at right angles may be the first axis and the second axis. An example thereof will now be described with reference to FIG. 5 .
  • the ROI 510 is displayed on a key-frame 500 .
  • the ROI setting unit 120 may determine a brightness profile 502 in a first axis direction of the key-frame 500 and a brightness profile 504 in a second axis direction of the key-frame 500 . Accordingly, when a brightness value of a specific part from among a plurality of pixels included in the key-frame 500 has a large value, the ROI setting unit 120 determines the specific part as the ROI 510 .
  • the image compensating unit 130 may include not only data about a part such as a blood vessel or a body tissue to be diagnosed but also includes undesired data such as noise. Thus, it is necessary for the ROI setting unit 120 to prevent an excluded region 520 from being included in the ROI 510 , according to the brightness profiles 502 and 504 .
  • the ROI setting unit 120 may set a second threshold value with respect to a size of the ROI 510 and may determine that the excluded region 520 having a size equal to or less than the second threshold value is not included in the ROI 510 although the excluded region 520 has a large brightness value according to the brightness profiles 502 and 504 .
  • the ROI setting unit 120 may obtain one or more regions in every key-frame 500 based on a brightness value according to first and second axes directions, and may select a region among the one or more regions that has a size equal to or greater than the second threshold value, as the ROI 510 .
  • the ROI setting unit 120 may obtain the regions 510 and 520 that have the large brightness profiles 502 and 504 and may select only the region 510 having a size equal to or greater than the second threshold value as an ROI.
  • the second threshold value may be randomly set by a user or may be automatically set as a value that is predefined according to a measuring application.
  • the second threshold value with respect to the size of the ROI 510 is different from the first threshold value 410 with respect to the brightness value of the frames.
  • the ROI setting unit 120 may select the central point 530 at which the brightness value according to the first and second axes directions is at its peak value, which is separate from setting the ROI 510 . Because a region having the largest brightness value in the key-frame 500 is the ROI 510 , the central point 530 may be located in the ROI 510 . The central point 530 may be used in image compensation that is performed by the image compensating unit 130 , as described below.
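  • The profile-based ROI selection and central-point detection described above can be sketched as follows. This is a simplified illustration under assumed conventions (a half-of-peak cutoff for "large" profile values, and the hypothetical name set_roi_and_center), not the exact patented procedure.

```python
import numpy as np

def set_roi_and_center(key_frame, min_size):
    """Threshold the brightness profiles along the two axes, drop runs
    narrower than min_size (the second threshold on region size), and
    return the ROI bounds per axis plus the peak-brightness point."""
    x_profile = key_frame.sum(axis=0)  # brightness profile, first axis
    y_profile = key_frame.sum(axis=1)  # brightness profile, second axis

    def bright_run(profile):
        # Mark positions brighter than half the peak, then keep the
        # widest contiguous run that is at least min_size wide.
        mask = profile > 0.5 * profile.max()
        runs, start = [], None
        for i, m in enumerate(list(mask) + [False]):
            if m and start is None:
                start = i
            elif not m and start is not None:
                runs.append((start, i))
                start = None
        runs = [r for r in runs if r[1] - r[0] >= min_size]
        return max(runs, key=lambda r: r[1] - r[0]) if runs else None

    roi = (bright_run(x_profile), bright_run(y_profile))
    center = (int(np.argmax(x_profile)), int(np.argmax(y_profile)))
    return roi, center

# A 20x20 frame with one bright block (the target) and one isolated
# bright noise pixel; the 1-pixel-wide noise run is excluded from the ROI,
# mirroring how the excluded region 520 is kept out of the ROI 510.
frame = np.zeros((20, 20))
frame[8:14, 5:15] = 1.0
frame[1, 18] = 5.0
roi, center = set_roi_and_center(frame, min_size=3)
print(roi, center)  # ROI spans columns 5-14 and rows 8-13
```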
  • the image compensating unit 130 performs image compensation on the plurality of frames based on the ROI 510 of the key-frame.
  • the image compensating unit 130 may perform two types of compensation operations.
  • the image compensating unit 130 may compensate for a location of a central point of each frame. Based on the central point 530 of the key-frame, which is determined by the ROI setting unit 120 , the image compensating unit 130 may compensate for locations of central points of frames other than the key-frame. In more detail, the image compensating unit 130 may determine dx and dy that are differences between locations in the first and second axes directions of the central points 530 of two consecutive key-frames, and may determine locations of central points of frames between the two consecutive key-frames based on dx and dy.
  • for example, suppose the 1st and 11th frames are key-frames, and dx and dy, the differences between the locations of the central points of the 1st and 11th frames, are each 20 pixels along the first and second axes directions.
  • the 2nd frame may then have a central point that is moved from that of the 1st frame by 2 pixels in the first and second axes directions.
  • the 3rd frame may have a central point that is moved from that of the 1st frame by 4 pixels in the first and second axes directions.
  • the 10th frame may have a central point that is moved from that of the 1st frame by 18 pixels in the first and second axes directions.
  • the image compensating unit 130 may perform image compensation by adjusting locations of central points of frames other than key-frames, based on differences between locations of the central points 530 of the key-frames, so that the key-frames may be naturally connected.
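  • The location compensation above amounts to linearly distributing dx and dy over the frames between two key-frames. A minimal sketch follows; the function name interpolate_centers and integer-pixel stepping are assumptions for illustration.

```python
def interpolate_centers(c_start, c_end, num_between):
    """Spread the central-point displacement (dx, dy) between two
    consecutive key-frames evenly over the frames between them."""
    dx = c_end[0] - c_start[0]
    dy = c_end[1] - c_start[1]
    n = num_between + 1  # number of steps from one key-frame to the next
    return [(c_start[0] + dx * k // n, c_start[1] + dy * k // n)
            for k in range(1, n)]

# Key-frames whose central points are 20 pixels apart on both axes, with
# 9 frames between them: each frame's center shifts by 2 pixels per step.
print(interpolate_centers((100, 100), (120, 120), 9))
```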
  • the image compensating unit 130 may compensate for a brightness value of each frame.
  • the image compensating unit 130 may compensate for brightness values of ROIs 510 in frames between two consecutive key-frames from among frames included in the color image data 13 .
  • a persistence method may be used.
  • the persistence method sums a received data value for a specific frame and a data value of at least one frame that is received before the specific frame, according to a persistence rate, and then decides a sum value as a data value of the specific frame.
  • brightness values of a plurality of pixels included in an ROI 510 of a next frame of a key-frame may be determined based on brightness values of a plurality of pixels included in an ROI 510 of the key-frame.
  • Equation 1 is a basic equation used for the persistence method.
  • y(n) = (1 − p)·x(n) + p·y(n − 1)   [Equation 1] where y(n − 1) indicates the brightness value of a pixel at a specific location of a key-frame, x(n) indicates the brightness value of the pixel at the same location in the frame after the key-frame, p indicates the persistence rate, which will be described in detail below, and y(n) indicates the compensated brightness value of the pixel at that location, obtained by using the persistence method.
  • Equation 1 is a basic equation for the persistence method; however, it is obvious to one of ordinary skill in the art that a more complicated equation may be used to combine x(n) and y(n − 1).
  • the persistence rate p may be determined based on the period of the key-frames determined by the key-frame determining unit 110 . As the period of the key-frames increases, the image compensating unit 130 may set the persistence rate to a larger value. For example, comparing a case in which a key-frame is determined every 10 frames with a case in which a key-frame is determined every 20 frames: if the brightness values of the ROIs of the non-key-frames are compensated for by using the same persistence rate p, then in the 20-frame case, the effect of the key-frame's brightness value is considerably decreased in the frames after the 10th frame.
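  • Equation 1 can be applied per pixel, seeded with the key-frame's brightness, to produce the slow decay described above. A minimal numeric sketch follows; the helper name persist and the sample values are assumptions.

```python
def persist(brightness_after_key, key_value, p):
    """Apply Equation 1, y(n) = (1 - p) * x(n) + p * y(n - 1), to the
    raw brightness values x(n) of the frames after a key-frame, seeded
    with the key-frame's brightness y(0) = key_value."""
    y = key_value
    compensated = []
    for x in brightness_after_key:
        y = (1.0 - p) * x + p * y
        compensated.append(round(y, 3))
    return compensated

# A key-frame of brightness 100 followed by dim raw frames: with a high
# persistence rate the compensated brightness decays slowly instead of
# dropping immediately, as in the bold line of FIG. 6.
print(persist([20, 20, 20, 20], key_value=100, p=0.8))
```

A larger p carries the key-frame's brightness further, which is why the sketch's rate would be raised when key-frames are farther apart.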
  • a frame interpolation method other than the persistence method may be used.
  • the persistence method and the frame interpolation method are different from each other in that the persistence method involves performing compensation on frames between key-frames by using brightness values of the key-frames whereas the frame interpolation method involves newly generating frames between the key-frames by using the brightness values of the key-frames. Because the frame interpolation method is a well-known technology to one of ordinary skill in the art, the detailed descriptions thereof are omitted here.
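  • The contrast between the two methods can be made concrete with a minimal frame-interpolation sketch: new in-between frames are generated as weighted blends of the two key-frames, rather than carrying the previous key-frame's brightness forward. The function name and sample data are illustrative assumptions.

```python
def interpolate_frames(key_a, key_b, num_between):
    """Generate num_between new frames between two key-frames by
    linear interpolation of their brightness values."""
    n = num_between + 1
    return [[(1 - k / n) * a + (k / n) * b for a, b in zip(key_a, key_b)]
            for k in range(1, n)]

# Two 1-D "key-frames"; three generated frames blend smoothly between them.
print(interpolate_frames([0.0, 100.0], [40.0, 60.0], 3))
# -> [[10.0, 90.0], [20.0, 80.0], [30.0, 70.0]]
```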
  • FIG. 6 illustrates a frame graph 600 showing a result of compensation for the brightness values of the frame graph 400 in FIG. 4 .
  • a bold line 610 indicates the result of the compensation for the brightness values of the frame graph 400 in FIG. 4 , performed by the image compensating unit 130 .
  • a result obtained by the image compensating unit 130 compensating for the brightness values between the key-frame 400 a and the key-frame 400 b according to a predetermined persistence rate is shown in FIG. 6 .
  • the bold line 610 sharply increases at each key-frame and then slowly decreases from that key-frame to the next key-frame.
  • this curve shape results from the fast attack slow decay (FASD) technique, which is a type of the persistence method.
  • a reference for compensation for a brightness value does not exist in a portion before the key-frame 400 a .
  • the image compensating unit 130 may perform compensation in such a manner that brightness values slowly decrease over the frames before the key-frame 400 a , which is the first key-frame, or may perform compensation according to the brightness values of several frames that have larger brightness values than adjacent frames but are not key-frames.
  • the order of compensating for locations and brightness values is not fixed; either type of compensation may be performed first, or the two types of compensation may be performed simultaneously.
  • because the image compensating unit 130 compensates for the brightness values of the ROIs 510 and the locations of the frames based on the key-frames, the frames other than the key-frames may be naturally connected, so that connectivity in the color image data 13 may be improved.
  • the image compensating unit 130 reconstructs the color image data 13 by using the plurality of frames on which the image compensation has been performed.
  • the reconstructed color image data 140 undergoes a rendering process and then is displayed to a user.
  • an operation of storing the reconstructed color image data 140 in the storage unit may be further performed.
  • a predetermined marker may be displayed on the key-frames so that they may be distinguished from the other frames. Accordingly, a user may distinguish between the key-frames and the compensated frames in the reconstructed color image data 140 .
  • an embodiment of the present invention will now be described with reference to FIGS. 7 and 8 .
  • FIGS. 7 and 8 illustrate an example in which the color image data 13 for a blood vessel is processed.
  • FIG. 7 illustrates a pre-compensation frame 700 and
  • FIG. 8 illustrates a post-compensation frame 800 .
  • FIGS. 7 and 8 illustrate diagrams in a direction different from a cross-section of FIG. 5 .
  • an actual reference for determining a key-frame, setting an ROI, and compensating for an image may be based on a frame that is viewed in a direction of FIG. 5 .
  • the key-frame determining unit 110 may determine frames 720 having large brightness values as key-frames.
  • the ROI setting unit 120 may set an ROI 710 for each of the key-frames.
  • the image compensating unit 130 may compensate for brightness values and locations of frames between the key-frames, according to the ROI 710 .
  • FIG. 8 illustrates the post-compensation frame 800, that is, the result obtained by determining the frames 720 having the large brightness values as the key-frames and then compensating for the brightness values and locations of the frames other than the key-frames.
  • the frames other than the key-frames have a compensated brightness value 810 .
  • the frames other than the key-frames are naturally connected to the key-frames, so that the result of FIG. 8 may help a user to efficiently diagnose the blood vessel.
  • the image quality of color image data may be improved by ensuring continuity between frames of the color image data. Therefore, a target object may be efficiently diagnosed by using the ultrasound diagnosis apparatus.
  • the embodiments of the present invention can be written as computer programs and can be implemented in general-use digital computers that execute the programs using a computer-readable recording medium.
  • a data structure used in the embodiments of the present invention can be written in a computer-readable recording medium through various means.
  • examples of the computer-readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs or DVDs).
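The key-frame-based compensation described above can be sketched in code. The following Python example is a hypothetical illustration only, not the patented implementation: the function names, the neighbour-brightness criterion in `detect_key_frames`, and the linear interpolation of mean brightness between key-frames are all assumptions made for demonstration.

```python
import numpy as np

def detect_key_frames(frames, factor=1.2):
    """Hypothetical key-frame criterion: a frame is a key-frame when its
    mean brightness exceeds `factor` times the mean of its two neighbours.
    The first and last frames are kept as anchors so that every other
    frame lies between two key-frames."""
    means = np.array([f.mean() for f in frames])
    keys = {0, len(frames) - 1}
    for i in range(1, len(frames) - 1):
        if means[i] > factor * (means[i - 1] + means[i + 1]) / 2.0:
            keys.add(i)
    return sorted(keys)

def compensate_brightness(frames, keys):
    """Rescale each non-key frame so that its mean brightness follows a
    linear interpolation between the surrounding key-frames, keeping the
    frames 'naturally connected' in the sense described above."""
    out = [f.astype(float).copy() for f in frames]
    means = [f.mean() for f in out]
    for a, b in zip(keys, keys[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            target = (1 - t) * means[a] + t * means[b]
            if means[i] > 0:
                out[i] *= target / means[i]
    return out

# demonstration on synthetic "frames": a dim middle frame between two
# bright key-frames is raised toward the interpolated brightness
frames = [np.full((4, 4), v) for v in (10.0, 4.0, 10.0)]
keys = detect_key_frames(frames)
fixed = compensate_brightness(frames, keys)
print(keys, fixed[1].mean())  # [0, 2] 10.0
```

In the actual apparatus the compensation would also adjust the locations of frames between key-frames; that could be sketched the same way, by interpolating ROI coordinates rather than mean brightness.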

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Radiology & Medical Imaging (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
US13/544,577 2012-01-13 2012-07-09 Method and apparatus for processing ultrasound images Expired - Fee Related US9008387B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120004512A KR101468418B1 (ko) 2012-01-13 2012-01-13 Method and apparatus for processing ultrasound images
KR10-2012-0004512 2012-01-13

Publications (2)

Publication Number Publication Date
US20130182926A1 US20130182926A1 (en) 2013-07-18
US9008387B2 true US9008387B2 (en) 2015-04-14

Family

ID=48780003

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/544,577 Expired - Fee Related US9008387B2 (en) 2012-01-13 2012-07-09 Method and apparatus for processing ultrasound images

Country Status (2)

Country Link
US (1) US9008387B2 (ko)
KR (1) KR101468418B1 (ko)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
KR102002408B1 (ko) * 2012-09-12 2019-07-24 Samsung Electronics Co., Ltd. Apparatus and method for generating ultrasound image
US9820717B2 (en) * 2013-02-22 2017-11-21 Toshiba Medical Systems Corporation Apparatus and method for fetal image rendering
WO2016031273A1 (ja) * 2014-08-25 2016-03-03 Olympus Corporation Ultrasonic observation device, ultrasonic observation system, and method of operating ultrasonic observation device
KR102372214B1 (ko) * 2015-01-19 2022-03-14 Samsung Electronics Co., Ltd. Image processing apparatus, medical imaging apparatus, and image processing method
EP3167810B1 (en) * 2015-11-10 2019-02-27 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and method of operating the same
KR102182489B1 (ko) * 2016-07-26 2020-11-24 Siemens Medical Solutions USA, Inc. Method for generating ultrasound image, ultrasound system, and recording medium
KR20190083234A (ko) * 2018-01-03 2019-07-11 Samsung Medison Co., Ltd. Method of controlling ultrasound diagnosis apparatus, and ultrasound diagnosis apparatus
KR102608821B1 (ko) * 2018-02-08 2023-12-04 Samsung Medison Co., Ltd. Wireless ultrasound probe and ultrasound imaging apparatus connected to the wireless ultrasound probe
WO2020181263A1 (en) * 2019-03-06 2020-09-10 San Diego State University (SDSU) Foundation, dba San Diego State University Research Foundation Methods and systems for continuous measurement and/or screening of anomalies for fetal alcohol spectrum disorder analysis
CN111214255B (zh) * 2020-01-12 2023-07-25 Liu Tao Computer-aided method for medical ultrasound images
CN111358494A (zh) * 2020-03-13 2020-07-03 达闼科技(北京)有限公司 Image generation method and apparatus, storage medium, and electronic device
KR102459183B1 (ko) * 2021-04-16 2022-10-26 Bistos Co., Ltd. Method and apparatus for measuring heart rate using color images
CN115623215B (zh) * 2022-12-20 2023-04-18 Honor Device Co., Ltd. Method for playing video, electronic device, and computer-readable storage medium
CN115661376B (zh) * 2022-12-28 2023-04-07 深圳市安泽拉科技有限公司 Target reconstruction method and system based on unmanned aerial vehicle images


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007330764A (ja) 2006-01-10 2007-12-27 Toshiba Corp Ultrasonic diagnostic apparatus and ultrasonic image generation method
US20080262354A1 (en) 2006-01-10 2008-10-23 Tetsuya Yoshida Ultrasonic diagnostic apparatus and method of generating ultrasonic image
US20090303252A1 (en) 2008-06-04 2009-12-10 Dong Gyu Hyun Registration Of CT Image Onto Ultrasound Images
KR101017610B1 (ko) 2008-06-04 2011-02-28 Korea Advanced Institute of Science and Technology System and method for registering ultrasound images and CT images
US20100204579A1 (en) 2009-02-10 2010-08-12 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and ultrasonic diagnostic method
JP2010183935A (ja) 2009-02-10 2010-08-26 Toshiba Corp Ultrasonic diagnostic apparatus and control program for ultrasonic diagnostic apparatus

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Korean Notice of Allowance issued in Korean Patent Application No. KR10-2012-0004512 dated Aug. 29, 2014.
Korean Office Action issued in Korean Application No. 10-2012-0004512 dated Mar. 7, 2014, w/English translation.
Yoshida Tetsuya et al. (Japanese publication 2007-330764 will be further referred to as YT). Provided by applicant, cited in IDS. Machine translation provided by examiner. *

Also Published As

Publication number Publication date
US20130182926A1 (en) 2013-07-18
KR101468418B1 (ko) 2014-12-03
KR20130083725A (ko) 2013-07-23

Similar Documents

Publication Publication Date Title
US9008387B2 (en) Method and apparatus for processing ultrasound images
JP6640922B2 (ja) Ultrasound diagnostic apparatus and image processing apparatus
JP5670324B2 (ja) Medical image diagnostic apparatus
US11138723B2 (en) Analyzing apparatus and analyzing method
US20130116557A1 (en) Ultrasonic diagnostic device and ultrasonic image display method
US20110066031A1 (en) Ultrasound system and method of performing measurement on three-dimensional ultrasound image
WO2012051216A1 (en) Direct echo particle image velocimetry flow vector mapping on ultrasound dicom images
JP2007296336A (ja) User interface and method for displaying information in an ultrasound system
JP7150800B2 (ja) Motion-adaptive visualization in medical 4D imaging
US8900147B2 (en) Performing image process and size measurement upon a three-dimensional ultrasound image in an ultrasound system
JP2017526467A (ja) Quality metric for multi-beat echocardiography acquisition for immediate user feedback
JP2014023928A (ja) Ultrasound imaging system and method
WO2011041244A1 (en) Contrast-enhanced ultrasound assessment of liver blood flow for monitoring liver therapy
JP2010119842A (ja) Method of setting IMT measurement region and ultrasound system therefor
US9357981B2 (en) Ultrasound diagnostic device for extracting organ contour in target ultrasound image based on manually corrected contour image in manual correction target ultrasound image, and method for same
US10779798B2 (en) Ultrasound three-dimensional (3-D) segmentation
US20130150718A1 (en) Ultrasound imaging system and method for imaging an endometrium
US20190298304A1 (en) Medical diagnosis apparatus, medical image processing apparatus, and image processing method
CN110809801A (zh) System and method for simultaneous visualization and quantification of wall shear stress in blood vessels
CN114375178A (zh) Method for high spatial and temporal resolution ultrasound imaging of microvessels
EP3905960A1 (en) Systems and methods for contrast enhanced imaging
EP3600058A1 (en) System and method for concurrent visualization and quantification of blood flow using ultrasound vector flow imaging
US9204861B2 (en) Ultrasound diagnostic apparatus and method of determining a time intensity curve
CN103190932A (zh) Method for estimating stress and strain of coronary artery vessel wall
JP5416392B2 (ja) Ultrasonic diagnostic apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG MEDISON CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEE, JAE-KEUN;REEL/FRAME:028515/0863

Effective date: 20120627

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230414