US20140050381A1 - Method and apparatus for managing and displaying ultrasound image - Google Patents
- Publication number
- US20140050381A1 (application US13/971,482)
- Authority
- United States
- Prior art keywords
- observation
- image
- volume data
- ultrasound
- images
- Prior art date
- Legal status
- Granted
Classifications
- A61B8/5223: Processing of medical diagnostic data for extracting a diagnostic or physiological parameter
- G06T1/00: General purpose image data processing
- A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/463: Displaying multiple images or images and diagnostic data on one display
- A61B8/465: Displaying user selection data, e.g. icons or menus
- A61B8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B8/523: Generating planar views from image data in a user selectable plane not corresponding to the acquisition plane
- A61B8/5292: Image processing using additional data, e.g. patient information, image labeling, acquisition parameters
- G01S15/8993: Three dimensional imaging systems
- G01S7/52073: Production of cursor lines, markers or indicia by electronic means
- G01S7/52074: Composite displays, e.g. split-screen displays; combination of multiple images or of images and alphanumeric tabular information
- G06T19/00: Manipulating 3D models or images for computer graphics
- A61B8/0866: Detecting organic movements or changes involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
- A61B8/0883: Detecting organic movements or changes for diagnosis of the heart
- A61B8/13: Tomography
- G06T2219/008: Cut plane or projection plane definition
Definitions
- the present invention relates to a method and apparatus for efficiently managing an ultrasound image of a fetal heart, and a method and apparatus for displaying an ultrasound image for a user to diagnose an object.
- An ultrasound diagnosis apparatus obtains an image of a desired internal part of an object by generating an ultrasound signal (generally, an ultrasound signal of 20 kHz or higher) by using a probe, and using information about an echo signal reflected from the desired internal part.
- the ultrasound diagnosis apparatus is used for medical purposes, e.g., to detect foreign substances in an object, and measure and observe the degree of injury of the object.
- the ultrasound diagnosis apparatus has been widely used together with other image diagnosis apparatuses, since it is stable, displays images in real time, and, unlike an X-ray examination, involves virtually no exposure to radiation.
- Fetal cardiac malformation accounts for a large proportion of fetal diseases.
- the location of a fetal heart varies frequently according to the posture of the fetus within the uterus, unlike an adult's heart. Accordingly, it is very difficult for insufficiently trained doctors to obtain an ultrasound image of a fetal heart.
- the present invention provides a method and apparatus for obtaining and displaying an ultrasound image of a fetal heart from ultrasound volume data.
- the present invention also provides a computer-readable recording medium having recorded thereon a computer program for performing the method.
- the obtaining of the plurality of images may include obtaining the plurality of images by splitting the ultrasound volume data such that planes that split the ultrasound volume data form a predetermined angle with the observation plane.
- the obtaining of the plurality of images may include obtaining the plurality of images by splitting the ultrasound volume data such that planes that split the ultrasound volume data intersect with respect to the reference point.
- the obtaining of the plurality of images may include obtaining the plurality of images by adjusting at least one of distances between and a total number of planes that split the ultrasound volume data.
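The parallel split method described above can be sketched as follows. This is an illustrative sketch only, not part of the disclosure: the volume is assumed to be a 3-D NumPy array, the observation plane is taken as an axis-aligned slice, and all function and variable names are hypothetical.

```python
import numpy as np

def parallel_slices(volume, axis=0, count=5, spacing=2, center=None):
    """Extract `count` slices spaced `spacing` voxels apart around `center`
    along `axis` -- a simple stand-in for parallel split planes."""
    if center is None:
        center = volume.shape[axis] // 2
    offsets = (np.arange(count) - count // 2) * spacing
    indices = np.clip(center + offsets, 0, volume.shape[axis] - 1)
    return [np.take(volume, i, axis=axis) for i in indices]

# toy volume: 32 x 32 x 32
vol = np.arange(32**3, dtype=float).reshape(32, 32, 32)
slices = parallel_slices(vol, axis=0, count=5, spacing=3)
print(len(slices), slices[0].shape)  # 5 (32, 32)
```

Adjusting `count` and `spacing` corresponds to adjusting the total number of, and the distances between, the split planes.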
- the observation plane may be one of an A-plane, a B-plane, and a C-plane of the object, included in the ultrasound volume data.
- the storing of the image selected from among the plurality of images may include storing location information of planes that split the ultrasound volume data, together with the selected image, so as to obtain the selected image.
- the method may further include displaying the current observation operation and an image stored to match the current observation operation together.
- the displaying of the current observation operation and the image stored to match the current observation operation together may include displaying an exemplary image of the current observation operation.
- the method may further include arranging the ultrasound volume data such that a reference region including the reference point is disposed in a determined direction.
- the method may further include selecting the current observation operation from among the plurality of observation operations.
- the selecting of the current observation operation may include selecting the current observation operation from among the plurality of observation operations, according to a predetermined order or an external input signal.
- the method may further include selecting one of the plurality of observation operations as a new current observation operation.
- the method may be repeatedly performed.
- a method of managing an ultrasound image including storing a plurality of pieces of setting information to respectively match a plurality of observation operations for diagnosing an object, each of the plurality of pieces of the setting information including at least one from among information about an observation plane that splits ultrasound volume data in a predetermined direction, information about a split method of splitting the ultrasound volume data, and information about a reference point; displaying a plurality of plane images for a current observation operation which is one of the plurality of observation operations, wherein the plurality of plane images are obtained based on a piece of the setting information matching the current observation operation; and storing an image selected from among the plurality of plane images according to an external input signal such that the selected image matches the current observation operation.
- the method may further include repeatedly performing a process of displaying a new observation operation and storing an image to match the new observation operation.
- an ultrasound apparatus including a storage unit for storing ultrasound volume data; an image processing unit for determining a reference point and an observation plane for the ultrasound volume data, and obtaining a plurality of images by splitting the ultrasound volume data, based on the reference point and the observation plane; a display unit for displaying the plurality of images; and a control unit for controlling the storage unit, the image processing unit, and the display unit.
- the storage unit stores an image selected from among the plurality of images according to an external input signal such that the selected image matches a current observation operation from among a plurality of observation operations for observing an object.
- a computer-readable recording medium having recorded thereon a computer program for executing the method of managing an ultrasound image and the method of displaying an ultrasound image.
- FIG. 1 is a block diagram of an ultrasound apparatus according to an embodiment of the present invention
- FIG. 2 is a flowchart illustrating a method of managing an ultrasound image according to an embodiment of the present invention
- FIG. 3 is a flowchart illustrating a method of displaying an ultrasound image according to an embodiment of the present invention
- FIGS. 4A and 4B are images illustrating a process of determining a reference point, performed by an ultrasound apparatus, according to an embodiment of the present invention
- FIG. 5 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by an ultrasound apparatus, according to an embodiment of the present invention
- FIG. 6 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by an ultrasound apparatus, according to another embodiment of the present invention
- FIG. 7 illustrates a process of storing an image selected from among a plurality of images according to an external input signal such that the selected image may match an observation operation, according to an embodiment of the present invention
- FIG. 8 illustrates a process of displaying an exemplary image of an observation operation, according to an embodiment of the present invention
- FIG. 9 is a diagram illustrating a process of displaying an observation operation together with an image corresponding to the observation operation, according to an embodiment of the present invention.
- FIG. 10 is a diagram illustrating a process of displaying an observation operation together with an image corresponding to the observation operation, according to another embodiment of the present invention.
- FIG. 11 is a diagram illustrating a process of displaying an order of a current observation operation in all of the observation operations, according to an embodiment of the present invention.
- FIG. 1 is a block diagram of an ultrasound apparatus 100 according to an embodiment of the present invention.
- the ultrasound apparatus 100 may include a storage unit 110 , an image processing unit 120 , a display unit 130 , and a control unit 140 .
- a method of managing and displaying an ultrasound image by using elements of the ultrasound apparatus 100 will now be described in detail.
- the storage unit 110 stores ultrasound volume data.
- the ultrasound volume data stored in the storage unit 110 is obtained by scanning an object with an ultrasound probe.
- the ultrasound volume data is a three-dimensional (3D) image having a fan shape rather than a rectangular parallelepiped, according to the characteristics of the ultrasound apparatus 100 .
- the present invention will now be described with respect to ultrasound volume data obtained by scanning a fetal heart, but the ultrasound volume data is not limited to data obtained by scanning a human body.
- the ultrasound volume data stored in the storage unit 110 may be obtained from a storage medium installed outside the ultrasound apparatus 100 .
- the storage unit 110 may obtain and store the ultrasound volume data by using a picture archiving and communication system (PACS).
- the storage unit 110 may further store an ultrasound image.
- the ultrasound image stored in the storage unit 110 may be a two-dimensional (2D) image or a 3D image. If the ultrasound image is a 2D image, then the 2D image may be an image of a plane obtained by splitting the ultrasound volume data.
- the ultrasound image stored in the storage unit 110 may be obtained by scanning an object with the ultrasound apparatus 100 or may be received through a PACS, in a wired/wireless manner.
- the storage unit 110 may store a plurality of images obtained by the image processing unit 120 , and an image selected from among the plurality of images according to an external input signal. Also, the storage unit 110 may store the selected image to match a corresponding observation operation. That is, the storage unit 110 may store images selected to correspond to observation operations such that the selected images may match the observation operations, respectively.
- observation operation means a process of observing an object by using an observation plane.
- a user may diagnose the object by using an observation plane that has been predetermined for each of the observation operations.
- the ultrasound apparatus 100 may obtain an image by splitting ultrasound volume data based on the observation plane for each of the observation operations, and provide the image to a user. This process will be described in detail with reference to FIGS. 5 to 8 below.
- observation plane means a plane obtained by splitting volume data in a predetermined direction so as to observe an object. That is, the observation plane is a plane, the location of which varies in the volume data, according to the type of the object to be observed and an observation operation.
- the storage unit 110 may store information regarding observation operations of observing an object and observation planes such that each of the observation operations may match a corresponding observation plane among the observation planes. For each of the observing operations, the storage unit 110 may store information regarding an observation plane and a method of splitting volume data. Also, the storage unit 110 may store information regarding an observation plane matching each of the observation operations, a reference point, and a split method, as setting information.
- the storage unit 110 may store a C-plane as an observation plane for a 4-chamber view observation operation. Also, the storage unit 110 may store information regarding a method of splitting an image of a heart, which is to be observed, by a horizontal straight line, as a split method of splitting volume data on the C-plane. Furthermore, the storage unit 110 may store a center of a descending aorta, as a reference point for rotating the volume data.
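For illustration only, the per-operation setting information described above might be held in a structure like the following. The field names and the 5-chamber entry's values are assumptions, not taken from the disclosure; only the 4-chamber values mirror the example given above.

```python
# Hypothetical setting information keyed by observation operation,
# mirroring the storage unit's matching of observation plane,
# split method, and reference point.
SETTINGS = {
    "4-chamber view": {
        "observation_plane": "C-plane",          # coronal plane of the volume
        "split_method": "horizontal_parallel",   # split by horizontal straight lines
        "reference_point": "center_of_descending_aorta",
    },
    "5-chamber view": {                          # values assumed for illustration
        "observation_plane": "C-plane",
        "split_method": "horizontal_parallel",
        "reference_point": "center_of_descending_aorta",
    },
}

def settings_for(operation):
    """Return the stored setting information matching an observation operation."""
    return SETTINGS[operation]

print(settings_for("4-chamber view")["observation_plane"])  # C-plane
```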
- the storage unit 110 may store location information of a plurality of planes obtained by splitting volume data, in the volume data.
- a method of storing various information to match an observation operation in the storage unit 110 will be described in detail with reference to FIGS. 5 to 8 below.
- the image processing unit 120 determines reference points, observation planes, and split methods with respect to the ultrasound volume data. Information regarding the reference points, observation planes, and split methods may match the respective observation operations. As described above, the information regarding the determined reference points, observation planes, and split methods may be stored in the storage unit 110 .
- the information regarding the reference points, observation planes, and split methods that match each of the observation operations may be determined based on information input from a user. That is, various information regarding each observation operation may be determined by the image processing unit 120 , based on the ultrasound volume data and a predetermined algorithm, or may be determined according to an external input signal. This will be described in detail with reference to FIGS. 4A and 4B below.
- the image processing unit 120 obtains a plurality of images by splitting the ultrasound volume data according to a split method, based on a reference point and an observation plane. Any of various split methods may be employed by the image processing unit 120 to split the ultrasound volume data so as to obtain the plurality of images, as will be described in detail with reference to FIGS. 5 and 6 below. Also, the image processing unit 120 may obtain the plurality of images by adjusting the distances between, and the total number of, the planes that split the ultrasound volume data.
- the display unit 130 may display a plurality of images of planes obtained by splitting the ultrasound volume data, on a screen of the ultrasound apparatus 100 . Also, the display unit 130 may display an image stored to match an observation operation from among the plurality of images, together with the observation operation. In addition, the display unit 130 may display an exemplary image of the observation operation.
- the display unit 130 may display an observation operation for an object, which is to be observed, using ultrasound volume data, on a first region of the screen of the ultrasound apparatus 100 . Also, the display unit 130 may display at least one from among a reference point, an observation plane matching the observation operation, and a split method of splitting the ultrasound volume data, on a second region of the screen. Furthermore, the display unit 130 may display a plurality of images obtained by splitting the ultrasound volume data, based on the observation plane and the reference point, on a third region of the screen. The current embodiment will be described in greater detail with reference to FIGS. 5 to 7 below.
- the display unit 130 may display an order of a current observation operation in all of the observation operations. For example, when an object is diagnosed using a total of five observation operations, that a current observation operation is a second observation operation among the five observation operations may be displayed. Thus, a user may easily understand a whole process, and may select a previous observation operation again through an additional user input.
- the display unit 130 may include a plurality of modules for performing the above operations.
- the display unit 130 may include an observation operation display module for displaying an order of a current observation operation in all of a plurality of observation operations.
- the display unit 130 may include a split information display module for displaying at least one from among an observation plane corresponding to the current observation operation, a reference point, and a split method of splitting volume data.
- the display unit 130 may include a split screen display module for displaying a plurality of images of planes obtained by splitting the volume data.
- the display unit 130 may include at least one from among a liquid crystal display (LCD), a thin film transistor-LCD, an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.
- the ultrasound apparatus 100 may include at least two display units 130 according to a structure thereof.
- the control unit 140 controls overall operations of the ultrasound apparatus 100 . Also, the control unit 140 may control the storage unit 110 , the image processing unit 120 , and the display unit 130 to manage and output obtained ultrasound images.
- the control unit 140 may control the storage unit 110 to store an image selected for an observation operation such that the selected image may match the observation operation, and may then proceed to a subsequent observation operation.
- the control unit 140 may control performing of a plurality of observation operations.
- the ultrasound apparatus 100 may further include a user input unit (not shown).
- the user input unit receives an external input signal for controlling the ultrasound apparatus 100 from a user.
- the user input unit may receive an external input signal for selecting an image corresponding to a current observation operation from among a plurality of images.
- the user input unit may receive an external input signal for selecting one of a plurality of observation operations.
- the user input unit may receive an external input signal via an input unit, e.g., a keyboard, a mouse, or a stylus pen. Also, the user input unit may receive an external input signal that is input by directly touching or dragging on a liquid crystal screen.
- the display unit 130 may act as the user input unit.
- the display unit 130 may sense a touched location, area, and pressure of a touch input.
- when implemented as a touch screen, the display unit 130 may sense not only a real touch but also a proximity touch.
- A method of managing and displaying an ultrasound image by using the elements of the ultrasound apparatus 100 will now be described with reference to FIGS. 2 and 3 .
- Each of the flowcharts illustrated in FIGS. 2 and 3 includes operations that are sequentially performed by the storage unit 110 , the image processing unit 120 , the display unit 130 , and the control unit 140 of the ultrasound apparatus 100 .
- FIG. 2 is a flowchart illustrating a method of managing an ultrasound image, according to an embodiment of the present invention.
- the ultrasound apparatus 100 determines an observation operation. Specifically, the ultrasound apparatus 100 determines an observation operation from among a plurality of observation operations, based on a pre-input order or a user input.
- for example, a 4-chamber view may be appropriate as an observation operation for observing a fetal heart.
- a 5-chamber view, a 3-vessel & trachea view, a left/right ventricular outflow tract (LVOT/RVOT) view, or an aortic arch view may be used as an observation operation for observing the fetal heart.
- any of various other observation operations may be used.
- the ultrasound apparatus 100 displays an observation plane matching the observation operation. Specifically, the ultrasound apparatus 100 may display an observation plane stored to match the observation operation determined in operation S 210 .
- the observation plane may be an A-plane, a B-plane, or a C-plane.
- the A-plane may be an observation plane of ultrasound volume data viewed from above.
- the B-plane may be an observation plane of the ultrasound volume data viewed from a left or right side.
- the C-plane may be an observation plane of the ultrasound volume data viewed from a front side. That is, the A-plane, the B-plane, and the C-plane mean a transverse plane, a sagittal plane, and a coronal plane of the ultrasound volume data, respectively.
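Treating the ultrasound volume data as a 3-D array, the three orthogonal observation planes can be sketched as index slices. The axis assignment below follows the transverse/sagittal/coronal correspondence stated above, but the concrete array layout (depth, height, width) is an assumption for illustration.

```python
import numpy as np

def observation_plane(volume, plane, index):
    """Return the A-, B-, or C-plane slice of a volume at `index`.
    Assumed layout: axis 0 = depth (A/transverse), axis 2 = width
    (B/sagittal), axis 1 = height (C/coronal)."""
    axis = {"A": 0, "B": 2, "C": 1}[plane]
    return np.take(volume, index, axis=axis)

vol = np.zeros((10, 20, 30))
print(observation_plane(vol, "A", 5).shape)  # (20, 30)
print(observation_plane(vol, "B", 5).shape)  # (10, 20)
print(observation_plane(vol, "C", 5).shape)  # (10, 30)
```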
- the observation plane may be one of the A-plane, the B-plane, and the C-plane of an object, included in the ultrasound volume data.
- the ultrasound apparatus 100 may display more than one observation plane. That is, the ultrasound apparatus 100 may display all of the observation planes, namely, the A-plane, the B-plane, and the C-plane.
- the ultrasound apparatus 100 splits the ultrasound volume data according to the reference point and a split method matching the observation operation.
- the reference point is a point representing a spatial location in the ultrasound volume data, and may be expressed with 3D coordinates.
- the reference point may also be a location on the observation plane. A process of determining the reference point will be described in detail with reference to FIGS. 4A and 4B below.
- any of various split methods may be used to split the ultrasound volume data, in operation S 230 .
- the ultrasound apparatus 100 may split the ultrasound volume data, such that planes of the ultrasound volume data may intersect with one another or may be disposed apart from one another by a predetermined distance, with respect to the reference point.
- the ultrasound apparatus 100 may determine a plurality of split lines that split the ultrasound volume data in left and right directions of the C-plane.
- the plurality of split lines may be arranged such that the distances between the plurality of split lines may be the same in a vertical direction with respect to the reference point.
- the plurality of split lines are shown as one-dimensional (1D) lines on the observation plane, but represent planes that split the ultrasound volume data.
- the ultrasound apparatus 100 may split the ultrasound volume data with the plurality of split lines determined as described above.
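The equal spacing of split lines about the reference point described above can be sketched in one dimension as follows; the function name and voxel units are illustrative assumptions.

```python
def split_line_positions(ref, count, spacing):
    """Positions of `count` equally spaced split lines centred on a
    reference coordinate `ref` (1-D, in voxels)."""
    half = (count - 1) / 2.0
    return [ref + (i - half) * spacing for i in range(count)]

print(split_line_positions(ref=50, count=5, spacing=4))
# [42.0, 46.0, 50.0, 54.0, 58.0]
```

Each returned coordinate would then index one split plane through the volume, as in the parallel-slice sketch earlier.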
- the ultrasound apparatus 100 may obtain a plurality of images from a result of splitting the ultrasound volume data in operation S 230 .
- the ultrasound apparatus 100 displays the plurality of images obtained by splitting the ultrasound volume data. That is, the ultrasound apparatus 100 may display the plurality of images as candidates of the observation operation determined in operation S 210 .
- the ultrasound apparatus 100 stores an image selected from among the plurality of images such that the selected image may match the observation operation.
- the stored image may be selected according to an external input signal, or an image closest to an exemplary image stored in the ultrasound apparatus 100 may be selected by comparing the plurality of images with the exemplary image.
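The automatic selection of the image closest to a stored exemplary image could, for instance, use a sum-of-squared-differences comparison. The disclosure does not specify a similarity metric, so the one below is purely an assumed stand-in.

```python
import numpy as np

def closest_to_exemplar(images, exemplar):
    """Index of the candidate image with the smallest sum-of-squared-
    differences from an exemplary image (one plausible similarity
    measure; the metric is an assumption, not from the disclosure)."""
    errors = [float(np.sum((img - exemplar) ** 2)) for img in images]
    return int(np.argmin(errors))

exemplar = np.eye(4)
candidates = [np.zeros((4, 4)), np.eye(4) * 0.9, np.ones((4, 4))]
print(closest_to_exemplar(candidates, exemplar))  # 1
```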
- when the observation operation is a 4-chamber view, a user may select the image that most exactly represents the 4-chamber view from among the plurality of images.
- the ultrasound apparatus 100 may store the selected image to match the 4-chamber view.
- the user may diagnose the object, based on the selected image stored to match the 4-chamber view.
- the user may conveniently and efficiently diagnose the object, based on images stored to match the 4-chamber view and other various observation operations.
- the ultrasound apparatus 100 may store an image matching the observation operation which is the 4-chamber view, and then select the 5-chamber view as the next observation operation according to the previously input order.
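Stepping through a previously input order of observation operations, storing the selected image for each, can be sketched as below; the order list and function names are illustrative assumptions.

```python
# Hypothetical workflow: step through observation operations in a
# predetermined order, storing the image selected for each one.
ORDER = ["4-chamber view", "5-chamber view", "3-vessel & trachea view",
         "LVOT/RVOT view", "aortic arch view"]

def run_exam(select_image):
    """`select_image(op)` stands in for the user's (or apparatus's)
    choice; returns the mapping of observation operation -> stored image."""
    stored = {}
    for op in ORDER:
        stored[op] = select_image(op)
    return stored

result = run_exam(lambda op: f"image_for_{op}")
print(result["5-chamber view"])  # image_for_5-chamber view
```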
- the image processing unit 120 may determine the 4-chamber view, the 5-chamber view, the 3-vessel & trachea view, or the like, as an observation operation, based on an external input signal received via the user input unit.
- FIG. 3 is a flowchart illustrating a method of displaying an ultrasound image, according to an embodiment of the present invention.
- the display unit 130 may display an observation operation of ultrasound volume data for an object that is to be observed, on the first region of the screen of the ultrasound apparatus 100 .
- the display unit 130 may display the observation operation that is manually or automatically determined by the image processing unit 120 according to an external input signal.
- the displaying of the observation operation means displaying a location of a viewpoint for observing the object in the ultrasound volume data.
- the display unit 130 may display an image of the object, e.g., a fetal heart, and a position of the 4-chamber view which is the observation operation, as will be described in detail with reference to FIGS. 5 and 6 .
- the display unit 130 may display at least one from among an observation plane matching the observation operation, a reference point, and a split line for splitting the ultrasound volume data, on the second region of the screen of the ultrasound apparatus 100 .
- the split line for splitting the ultrasound volume data may be determined according to a split method stored to match the observation operation.
- the display unit 130 may display a C-plane as the observation plane, and display a plurality of split lines that split the ultrasound volume data on the observation plane. Also, the display unit 130 may display a split method of splitting the ultrasound volume data in a left direction and a right direction of the observation plane, with respect to the reference point.
- the display unit 130 may display a plurality of images of planes obtained by splitting the ultrasound volume data based on the split method and the reference point, on the third region of the screen of the ultrasound apparatus 100 .
- the plurality of images may be images obtained by splitting the ultrasound volume data by the image processing unit 120 .
- each of the images may be obtained by splitting the ultrasound volume data by using one of the plurality of split lines displayed on the second region.
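As a sketch of how the plane images may be obtained from the volume data, one can treat the ultrasound volume data as a 3D array and take one 2D slice per split line; the (z, y, x) layout, axis choice, and all names are assumptions for illustration only.

```python
import numpy as np

def images_from_split_lines(volume, split_lines, axis=0):
    """Return one 2D image per split line, where each split line is
    modeled as an index along `axis` of the (z, y, x) volume array."""
    return [np.take(volume, idx, axis=axis) for idx in split_lines]

vol = np.arange(3 * 4 * 5).reshape(3, 4, 5)      # toy volume data
planes = images_from_split_lines(vol, split_lines=[0, 2], axis=0)
```

Each element of `planes` corresponds to one split line displayed on the second region, and can be shown as a candidate image on the third region.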
- FIGS. 4A and 4B are images illustrating a process of determining a reference point, performed by the ultrasound apparatus 100 , according to an embodiment of the present invention.
- the display unit 130 of the ultrasound apparatus 100 may display an image 410 of an A-plane of ultrasound volume data 440, an image 420 of a B-plane of the ultrasound volume data 440, an image 430 of a C-plane of the ultrasound volume data 440, and the ultrasound volume data 440 on a left upper portion, a right upper portion, a left lower portion, and a right lower portion of a screen 400 of the ultrasound apparatus 100, respectively.
- the display unit 130 may output the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane of the ultrasound volume data 440 stored in the storage unit 110, to the screen 400 of the ultrasound apparatus 100.
- the image processing unit 120 may determine reference points 415, 425, and 435 of the ultrasound volume data 440.
- for example, the image processing unit 120 may determine the reference points 415, 425, and 435 from a 4-chamber view; specifically, it may determine the centers of images of a descending aorta AoD shown in the 4-chamber view as the reference points 415, 425, and 435.
- the centers of images of the descending aorta AoD may be automatically determined as the reference points 415, 425, and 435.
- alternatively, the user input unit of the ultrasound apparatus 100 may receive an external input signal for selecting a center of the descending aorta AoD, and the image processing unit 120 may determine the reference points 415, 425, and 435 based on the external input signal.
- FIG. 4A illustrates a process of determining the reference points 415, 425, and 435 through the 4-chamber view of the ultrasound volume data 440.
- the image processing unit 120 may determine the reference points 415, 425, and 435 through various observation operations other than the 4-chamber view.
- the storage unit 110 stores the determined reference points 415, 425, and 435. According to an embodiment of the present invention, the storage unit 110 may store the reference points 415, 425, and 435 to match respective observation operations, as described above.
- a reference region 433, including the reference point 435, is shown in the image 430 of the C-plane.
- a process of arranging an image 450 of an A-plane, an image 460 of a B-plane, an image 470 of a C-plane, and ultrasound volume data 480 with respect to a reference region 473 will now be described with reference to FIG. 4B.
- in FIG. 4B, a central point 475 and the reference region 473 are shown in the image 470 of the C-plane.
- FIG. 4B illustrates the image 450 of the A-plane, the image 460 of the B-plane, and the image 470 of the C-plane, which the image processing unit 120 obtains by rotating the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane about the reference point 435 (or central point 475) such that the reference region 433 of FIG. 4A is vertically disposed.
- ultrasound volume data is a 3D image. That is, a plane within the 3D image cannot be exactly specified with only one point.
- thus, not only the reference point 435 but also other criteria should be determined to split the ultrasound volume data according to the observation plane.
- the image processing unit 120 determines the other criteria for the 4-chamber view of the ultrasound volume data 440 by rotating the reference region 433, including the reference point 435, to be vertically disposed.
- the present invention is not limited thereto, and any of various other methods may be used to determine the other criteria for the ultrasound volume data 440. Such methods may also be performed with respect to observation operations other than the 4-chamber view.
- the image processing unit 120 may reverse a brightness value of the image 430 of the C-plane of the 4-chamber view, based on a predetermined brightness value. Then, the image processing unit 120 may determine an object to be rotated, based on an 8- or 4-connected component analysis algorithm and a skeletonization algorithm. Furthermore, the ultrasound volume data 440 may be rotated by checking an inclination angle of a determined reference region.
- the brightness value of the image 430 of the C-plane may be reversed and two boundary lines may be detected according to an edge detection algorithm. Then, the boundary lines may be arranged to be vertically disposed.
- any of other various methods may be used to arrange the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane of the ultrasound volume data 440 .
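One way the brightness-reversal and inclination-angle steps above could look in code is sketched below; as an assumption, a PCA of the bright pixels' coordinates stands in for the connected-component and skeletonization analysis, and all names and numbers are illustrative.

```python
import numpy as np

def inclination_angle(binary_img):
    """Estimate the inclination of a bright region via PCA of its pixel
    coordinates: the angle (degrees) between the region's principal
    axis and the vertical (row) axis of the image."""
    rows, cols = np.nonzero(binary_img)
    coords = np.stack([rows - rows.mean(), cols - cols.mean()])
    cov = coords @ coords.T / coords.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal = eigvecs[:, np.argmax(eigvals)]
    if principal[0] < 0:                 # resolve eigenvector sign ambiguity
        principal = -principal
    return np.degrees(np.arctan2(principal[1], principal[0]))

# synthetic C-plane image: dark vessel on a bright background
img = np.full((200, 200), 255, dtype=np.uint8)
for r in range(40, 160):
    c = int(100 + 0.5 * (r - 100))       # region tilted ~26.6 deg from vertical
    img[r, c - 1:c + 2] = 20
inverted = 255 - img                     # reverse the brightness values
angle = inclination_angle(inverted > 128)
# rotating the image (or the volume about the reference axis) by -angle
# would make the reference region vertically disposed
```

A connected-component labeling step would normally precede this to isolate the descending-aorta region before measuring its inclination; it is omitted here because the synthetic image contains a single region.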
- the image processing unit 120 may determine criteria for splitting the ultrasound volume data 480 to obtain a plurality of images by obtaining the reference region 473 . That is, if an image of the descending aorta AoD illustrated as a reference region is vertically disposed on a C-plane, then the image processing unit 120 may obtain images for various observation operations by splitting resultant ultrasound volume data.
- FIG. 5 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by the ultrasound apparatus 100 , according to an embodiment of the present invention.
- a case where the image processing unit 120 determines a 4-chamber view as an observation operation will now be described.
- the display unit 130 may display the 4-chamber view as an observation operation on a first region 510 of a screen 500 of the ultrasound apparatus 100 . That is, the display unit 130 may display a location of a plane corresponding to a 4-chamber view for observing a fetal heart, in the volume data.
- the terms "first region," "second region," and "third region" simply denote a plurality of regions displayed on a screen of the ultrasound apparatus 100, regardless of the order thereof. In other words, each of these terms may be selected for convenience of explanation, regardless of locations thereof on the screen of the ultrasound apparatus 100.
- observation operations 5101, 5102, 5103, and 5104 are displayed on the first region 510, and a current observation operation 5104, which is a 4-chamber view, is thickly displayed from among these observation operations.
- the current observation operation may mean the observation operation, from among these observation operations, for which a process of selecting and storing an image is performed based on previously stored information about an observation plane, a reference point, and a split method.
- the display unit 130 may display an observation plane corresponding to the current observation operation 5104 and a plurality of split lines 5201, 5202, . . . , 5216 for splitting ultrasound volume data, on a second region 520 of the screen 500.
- the display unit 130 may further display a reference point matching the current observation operation 5104.
- the 4-chamber view corresponds to an image of an A-plane obtained by horizontally splitting the ultrasound volume data. That is, a plane obtained by horizontally splitting the ultrasound volume data based on the image of the C-plane is the 4-chamber view.
- the display unit 130 may display the C-plane as the observation plane and the plurality of split lines 5201, 5202, . . . , 5216 that horizontally split the image of the C-plane.
- the image processing unit 120 may determine a reference point based on the B-plane.
- the display unit 130 may display a plurality of split lines that horizontally split the image of the B-plane.
- the image processing unit 120 may obtain a plurality of images by splitting the ultrasound volume data. That is, the image processing unit 120 may obtain images of planes obtained by splitting the ultrasound volume data by using the plurality of split lines 5201, 5202, . . . , 5216.
- the images of the planes may be 2D images of planes obtained by splitting the ultrasound volume data with respect to a reference point determined for the current observation operation 5104.
- the image processing unit 120 may obtain a plurality of images by adjusting the distances between, or the total number of, the plurality of split lines 5201, 5202, . . . , 5216 that split the ultrasound volume data. That is, the image processing unit 120 may obtain a plurality of images by more densely or sparsely splitting the ultrasound volume data by arbitrarily adjusting the distances between the plurality of split lines 5201, 5202, . . . , 5216. Alternatively, the image processing unit 120 may adjust the number of images to be obtained by adjusting the total number of split lines 5201, 5202, . . . , 5216.
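Adjusting the density or the total number of split lines can be sketched as generating evenly spaced slice indices around the reference point; the helper name and all parameters below are illustrative assumptions.

```python
import numpy as np

def split_positions(center, extent, count, spacing):
    """Hypothetical helper: slice indices for `count` parallel split
    lines spaced `spacing` voxels apart, centered on the reference
    point and clipped to the volume extent."""
    half = (count - 1) * spacing / 2.0
    positions = np.linspace(center - half, center + half, count)
    return np.clip(np.round(positions).astype(int), 0, extent - 1)

# 16 split lines, 4 voxels apart, around reference slice 100 in a
# 208-slice volume (all numbers illustrative)
lines = split_positions(center=100, extent=208, count=16, spacing=4)
# denser sampling: halve the spacing, keep the count
dense = split_positions(center=100, extent=208, count=16, spacing=2)
```

Halving `spacing` splits the volume more densely around the reference point, while raising `count` increases the number of candidate images, matching the two adjustments described above.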
- the display unit 130 may display the images of the planes obtained by splitting the ultrasound volume data by using the plurality of split lines 5201, 5202, . . . , 5216, on a third region 530 of the screen 500.
- the sixteen images displayed on the screen 500 of the ultrasound apparatus 100 are images of planes obtained by splitting the ultrasound volume data by using the plurality of split lines 5201, 5202, . . . , 5216 displayed on the second region 520.
- the user input unit may receive an input for selecting one of the plurality of images from a user. That is, the user input unit may receive, from the user, an input for selecting the image that most exactly represents the 4-chamber view from among the images of the planes obtained by splitting the ultrasound volume data. Further, the storage unit 110 may store the image selected according to the external input signal to match the observation operation.
- the storage unit 110 may also store either information about the split line that splits the ultrasound volume data or location information of the plane corresponding to the selected image, so as to obtain the selected image again. For example, consider a case where the image selected from among the plurality of images displayed in FIG. 5 corresponds to the plane obtained by splitting the ultrasound volume data by using the last split line 5216 from among the plurality of split lines 5201, 5202, . . . , 5216 displayed on the second region 520. In this case, the storage unit 110 may store information about the 4-chamber view, which is the observation operation, the selected image, and the last split line 5216. Alternatively, the storage unit 110 may store the location information of the plane based on the last split line 5216, instead of the last split line 5216.
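Storing the selected image together with its observation operation, split line, and plane location can be sketched with a small record type; the field names below are illustrative assumptions, not the storage format of this disclosure.

```python
from dataclasses import dataclass

@dataclass
class StoredObservation:
    """Hypothetical record pairing an observation operation with the
    selected image and the information needed to re-derive it."""
    operation: str          # e.g. "4-chamber view"
    image_index: int        # index of the selected candidate image
    split_line: int         # which split line produced the plane
    plane_location: float   # location of the plane in the volume

store = {}

def store_selection(op, image_index, split_line, plane_location):
    # keyed by observation operation so the image "matches" it
    store[op] = StoredObservation(op, image_index, split_line, plane_location)

# e.g. the image from the last split line 5216 is selected for the 4-chamber view
store_selection("4-chamber view", 15, 5216, 130.0)
```

Keeping either the split-line identifier or the plane location alongside the image is what lets the apparatus regenerate the same plane later without re-splitting the entire volume.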
- the storage unit 110 may store not only the image selected, according to an external input signal, from among the plurality of images obtained by the image processing unit 120, but also the other images.
- in this case, the display unit 130 may display the stored plurality of images again, and the image processing unit 120 thus need not split the ultrasound volume data again.
- accordingly, the time needed for the image processing unit 120 to split the ultrasound volume data may be reduced.
- FIG. 5 illustrates a result of splitting ultrasound volume data based on the C-plane as an observation plane and the plurality of split lines 5201, 5202, . . . , 5216, which are parallel lines, performed by the image processing unit 120.
- the plurality of split lines 5201, 5202, . . . , 5216 are displayed as parallel lines on the C-plane, which is the observation plane, but actually correspond to planes that split the ultrasound volume data.
- each of the planes that split the ultrasound volume data may form a predetermined angle with respect to the C-plane, which is the observation plane.
- the planes that split the ultrasound volume data may split it to be parallel with the A-plane, or may split it not to be parallel with the A-plane, as shown in the first region 510.
- for example, the image processing unit 120 may split the ultrasound volume data such that the planes that split the ultrasound volume data contact the A-plane.
- the ultrasound volume data may be split according to any of various methods other than the method described with reference to FIG. 5 .
- a method of splitting the ultrasound volume data, according to another embodiment of the present invention, will be described with reference to FIG. 6 below.
- FIG. 6 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by an ultrasound apparatus, according to another embodiment of the present invention.
- the display unit 130 may display an observation operation on a first region 610 of a screen 600 of the ultrasound apparatus 100 .
- the display unit 130 may display a plurality of split lines 6201, 6202, . . . , 6206 that split ultrasound volume data based on the observation plane, on a second region 620 of the screen 600, and may display a plurality of images obtained by splitting the ultrasound volume data by using the plurality of split lines 6201, 6202, . . . , 6206, on a third region 630 of the screen 600.
- FIG. 6 illustrates a case where an LVOT view is determined as the observation operation for the ultrasound volume data.
- the LVOT view is an observation operation for observing a left ventricular outflow tract and corresponds to an image of a B-plane.
- the display unit 130 may display the determined observation operation on the first region 610 of the screen 600 .
- the display unit 130 may display an observation operation, which is determined by the image processing unit 120 , thickly or using a different color, so that a user may easily distinguish the determined observation operation from other observation operations.
- an LVOT view 6103 is displayed thickly.
- the image processing unit 120 may determine a reference point of the ultrasound volume data from a 5-chamber view image, to correspond to the LVOT view which is the observation operation.
- the image processing unit 120 may determine an A-plane as an observation plane. That is, since the LVOT view corresponds to a B-plane, an image corresponding to the LVOT view may be selected from among a plurality of images obtained by splitting the ultrasound volume data by using split lines displayed on the A-plane.
- the 5-chamber view corresponds to an A-plane and is an observation operation for observing not only a left atrium, a left ventricle, a right atrium, and a right ventricle of a fetal heart but also an aorta.
- the image processing unit 120 may determine a center of an aorta observed from the 5-chamber view, as a reference point.
- the image processing unit 120 may obtain a plurality of images split from the ultrasound volume data, based on the observation plane and the reference point.
- the image processing unit 120 may obtain the plurality of images split from the ultrasound volume data by using the plurality of split lines 6201, 6202, . . . , 6206 displayed on the second region 620 of the screen 600.
- the plurality of images may be obtained by splitting the ultrasound volume data such that the planes that split the ultrasound volume data intersect with respect to the reference point, according to the process of FIG. 6.
- the image processing unit 120 may split the ultrasound volume data such that the plurality of split lines 6201, 6202, . . . , 6206 intersect with reference to the reference point, on the A-plane, which is the observation plane.
- the plurality of split lines 6201, 6202, . . . , 6206 correspond to the planes that split the ultrasound volume data, respectively. Accordingly, the planes that split the ultrasound volume data intersect with reference to the reference point.
- each of the plurality of images is an image of a B-plane or a C-plane.
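Assuming the volume is a (z, y, x) array and the split planes all contain the vertical axis through the reference point, the intersecting split can be sketched with plain nearest-neighbor sampling; the geometry and all names are illustrative only.

```python
import numpy as np

def fan_slices(volume, ref_yx, angles_deg):
    """Sample vertical planes of `volume` (z, y, x) that all contain
    the z-axis through the reference point `ref_yx`, one plane per
    angle, using nearest-neighbor lookup (assumed geometry)."""
    nz, ny, nx = volume.shape
    ry, rx = ref_yx
    half = min(ny, nx) // 2
    t = np.arange(-half, half)                   # in-plane coordinate
    slices = []
    for a in np.radians(angles_deg):
        ys = np.clip(np.round(ry + t * np.cos(a)).astype(int), 0, ny - 1)
        xs = np.clip(np.round(rx + t * np.sin(a)).astype(int), 0, nx - 1)
        slices.append(volume[:, ys, xs])         # shape (nz, len(t))
    return slices

vol = np.arange(4 * 8 * 8).reshape(4, 8, 8)      # toy volume data
imgs = fan_slices(vol, ref_yx=(4, 4), angles_deg=[0, 30, 60, 90])
```

Every resulting image contains the reference axis, so the extracted planes intersect at the reference point exactly as the fan of split lines does on the A-plane.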
- the display unit 130 may display the images of the planes that split the ultrasound volume data, on the third region 630 of the screen 600 .
- a process of storing an image selected from among the plurality of images according to an external input signal so as to match the observation operation, performed by the storage unit 110 , is as described above with reference to FIG. 5 .
- FIG. 7 illustrates a process of storing an image selected from among a plurality of images according to an external input signal such that the selected image may match an observation operation, according to an embodiment of the present invention.
- the display unit 130 may display a plurality of images split from ultrasound volume data, and the storage unit 110 may store an image selected from the plurality of images according to an external input signal such that the selected image may match an observation operation.
- Contents displayed on a first region 710 , a second region 720 , and a third region 730 of a screen of the ultrasound apparatus 100 by the display unit 130 are substantially the same as those described above with reference to FIGS. 5 and 6 .
- the ultrasound apparatus 100 may repeatedly perform the process described above with reference to FIG. 5 or 6 with respect to a plurality of observation operations. That is, the ultrasound apparatus 100 stores the selected image to match a first observation operation from among the plurality of observation operations.
- the ultrasound apparatus 100 selects a second observation operation as a next observation operation, and performs the process, which was performed with respect to the first observation operation, with respect to the second observation operation.
- a plurality of images are obtained using a reference point, an observation plane, and a split method matching the second observation operation.
- after an image selected from among a plurality of images is stored to match the second observation operation, the ultrasound apparatus 100 performs the process with respect to the other observation operations, e.g., a third observation operation, a fourth observation operation, and so on. As described above, an order of selecting the plurality of observation operations may be determined according to a protocol that has been input to the ultrasound apparatus 100.
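The protocol-driven repetition over observation operations can be sketched as a simple loop; the protocol contents, callback names, and stand-in data below are illustrative assumptions.

```python
# Hypothetical protocol order of observation operations.
PROTOCOL = [
    "4-chamber view",
    "5-chamber view",
    "LVOT view",
    "3-vessel & trachea view",
]

def run_protocol(protocol, obtain_candidates, select_image):
    """For each observation operation, obtain candidate images using
    the reference point / observation plane / split method stored for
    that operation, then store the selected image to match it."""
    stored = {}
    for operation in protocol:
        candidates = obtain_candidates(operation)
        stored[operation] = select_image(operation, candidates)
    return stored

# toy stand-ins for the splitting and user-selection steps
result = run_protocol(
    PROTOCOL,
    obtain_candidates=lambda op: [f"{op} slice {i}" for i in range(16)],
    select_image=lambda op, cands: cands[7],
)
```

In the apparatus, `obtain_candidates` would correspond to the image processing unit splitting the volume data and `select_image` to the user's choice (or the exemplar comparison) for each view in turn.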
- a 5-chamber view 7103 may be determined as a subsequent observation operation.
- the storage unit 110 may store an image 7302 selected from among the plurality of images based on an input received from a user, such that the image 7302 may match the 5-chamber view 7103.
- the ultrasound apparatus 100 may display a plurality of images of a main pulmonary artery view 7102 , and may store an image 7303 selected from among the plurality of images according to an external input signal such that the image 7303 may match the main pulmonary artery view 7102 .
- the ultrasound apparatus 100 may receive an external input signal for selecting the first observation operation when the user desires to replace the image matching the first observation operation with a new image.
- the image processing unit 120 may use a C-plane as an observation plane to obtain a plurality of images matching the 5-chamber view which is the first observation operation.
- the image processing unit 120 may determine a center of a descending aorta shown in an image of the C-plane, as a reference point, and may split ultrasound volume data to obtain a plurality of A-plane images.
- the control unit 140 may control the image processing unit 120 to determine the LVOT view as the second observation operation, according to a predetermined order or an external input signal.
- a C-plane image may be used as an observation plane.
- the image processing unit 120 may determine a new reference point of the C-plane image. That is, the image processing unit 120 may determine a center of an aorta shown in the C-plane image as a new reference point of the second observation operation.
- Examples of an observation operation may include not only the 4-chamber view and the 5-chamber view described above but also other various observation operations.
- a 3-vessel & trachea view, an LVOT/RVOT view, and an aortic arch view may be used as observation operations as described above.
- a ductal arch view, a superior vena cava (SVC) view, an inferior vena cava (IVC) view, an upper abdomen with stomach view, and the like may be used as observation operations.
- the ultrasound apparatus 100 may store images selected by a user such that the selected images may automatically match various observation operations, respectively.
- the user may himself or herself efficiently diagnose an object without having to process ultrasound volume data with respect to each of various observation operations.
- the display unit 130 may display an image without changing observation planes.
- all of the 4-chamber view, the 5-chamber view, and the 3-vessel & trachea view are observation operations corresponding to an A-plane and may thus match a C-plane as an observation plane.
- the ultrasound apparatus 100 may perform image display by simply switching which of the observation operations 7101, 7102, 7103, and 7104 is thickly displayed on the first region 710 while maintaining the observation plane displayed on the second region 720.
- the ultrasound apparatus 100 may avoid unnecessary changing of observation planes.
- the ultrasound apparatus 100 may determine an order of a plurality of observation operations according to the types of observation planes matching the plurality of observation operations. In other words, when the ultrasound apparatus 100 displays observation operations corresponding to the same observation plane, e.g., an A-plane, a B-plane, or a C-plane, in a consecutive order, a change in content to be displayed may be minimized.
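Ordering observation operations so that those sharing an observation plane are displayed consecutively can be sketched with a stable sort; the view-to-plane mapping below is assembled from this description and is illustrative.

```python
# Illustrative mapping of observation operations to observation planes,
# per the description: the A-plane views above match a C-plane
# observation plane, while the LVOT view matches an A-plane.
PLANE_OF = {
    "4-chamber view": "C-plane",
    "5-chamber view": "C-plane",
    "3-vessel & trachea view": "C-plane",
    "LVOT view": "A-plane",
}

def order_by_plane(operations):
    # stable sort: keeps the original order within each plane group
    return sorted(operations, key=lambda op: PLANE_OF[op])

ops = ["4-chamber view", "LVOT view", "5-chamber view",
       "3-vessel & trachea view"]
ordered = order_by_plane(ops)
# observation-plane changes needed while stepping through the views
changes = sum(PLANE_OF[a] != PLANE_OF[b]
              for a, b in zip(ordered, ordered[1:]))
```

With the grouped order, the displayed observation plane changes only once, which is the minimization of displayed-content changes described above.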
- the display unit 130 may display a name of an observation operation on the screen of the ultrasound apparatus 100 .
- the display unit 130 may not only display images of respective observation operations but also display names of the respective observation operations in the form of text, on the first region 710 of the screen.
- a user may view the images of the respective observation operations displayed on the first region 710 and may further exactly identify a current observation operation.
- the display unit 130 may mark the images 7301, 7302, and 7303, which are respectively selected with respect to observation operations according to an external input signal, to be differentiated from other images.
- the display unit 130 may express the images 7301, 7302, and 7303 selected from among images displayed on the third region 730 of the screen with a different chroma or color, mark a number indicating that the selected images 7301, 7302, and 7303 are selected, or mark names of the observation operations matching the respective selected images 7301, 7302, and 7303.
- a user may select an image of the current observation operation, based on marked images selected with respect to the observation operations.
- FIG. 8 illustrates a process of displaying an exemplary image of an observation operation, according to an embodiment of the present invention.
- the display unit 130 may display an exemplary image of a determined observation operation 8104 on a second region 820 of a screen 800 , instead of a plurality of split lines that split ultrasound volume data.
- the display unit 130 may display exemplary images of respective observation operations, stored in the storage unit 110 , so that an image may be selected from among a plurality of images, based on an ideal ultrasound image of a selected observation operation.
- an exemplary image may be a real image obtained from an object, or may be a simple picture as illustrated in FIG. 8.
- the display unit 130 may display an exemplary image of a 4-chamber view on the second region 820 , and a user may select an image that most exactly represents the 4-chamber view from among a plurality of images displayed on a third region 830 of the screen 800 , based on the exemplary image.
- FIG. 8 illustrates that the display unit 130 displays the exemplary image on the second region 820 , but split lines that split ultrasound volume data may be displayed on the second region 820 as illustrated in FIGS. 5 to 7 and the exemplary image may be displayed on a fourth region (not shown) of the screen 800 .
- the display unit 130 may further display text 840, which a user may refer to, on the screen 800 of the ultrasound apparatus 100.
- FIG. 11 illustrates that the display unit 130 displays a screen 1100 including the first to third regions 710 to 730 shown in FIG. 7 , together with a flowchart 1110 of an observation operation, according to an embodiment of the present invention. That is, the display unit 130 may display an order of a current observation operation in all of a plurality of observation operations.
- the display unit 130 may display this order.
- the display unit 130 may display a region corresponding to the 5-chamber view, with a rectangle or with a different color so that the image region may be visually differentiated from other regions.
- the ultrasound apparatus 100 may select another observation operation according to an external input signal. If a current observation operation is the 5-chamber view and a user desires to change an image stored to match the 4-chamber view, the ultrasound apparatus 100 may receive an external input signal, e.g., an input of touching a region matching the 4-chamber view. Thus, the ultrasound apparatus 100 may switch the current observation operation from the 5-chamber view to the 4-chamber view, and store a new image of the 4-chamber view by displaying images of planes that split ultrasound volume data again.
- a user may easily check an order of the current observation operation in all of the observation operations, and may arbitrarily change the order of the observation operations.
- FIGS. 9 and 10 are diagrams illustrating processes of displaying an observation operation together with an image corresponding to the observation operation, according to embodiments of the present invention.
- the display unit 130 may display an image stored to match an observation operation, together with the observation operation. If images that are respectively selected for a plurality of observation operations according to an external input signal are stored, the display unit 130 may display the selected images together with the plurality of observation operations corresponding thereto.
- the display unit 130 may display not only four observation operations 1, 2, 3, and 4 on a first region 910 but also images 911, 912, 913, and 914 stored to respectively correspond to the observation operations 1, 2, 3, and 4.
- the display unit 130 may expand and display only an image corresponding to an observation operation selected from among the four observation operations 1, 2, 3, and 4.
- that is, the display unit 130 may expand and display only the image stored to correspond to the selected observation operation, without displaying all of the images 911, 912, 913, and 914.
- thus, a user may observe the image matching the selected observation operation in a larger size than when all of the images 911, 912, 913, and 914 are displayed.
- the control unit 140 may control the storage unit 110 to store a new image corresponding to the observation operation, based on a received user input. That is, when a user determines that the image stored to correspond to the observation operation is not appropriate and thus desires to store a new image corresponding to the observation operation, the control unit 140 may control the display unit 130 to display a plurality of images, which are obtained for the observation operation by using the image processing unit 120 , again.
- the user input unit may receive an external input signal that instructs to select a new image from the user.
- the control unit 140 may control a plurality of images for the 4-chamber view, which are obtained by splitting ultrasound volume data by the image processing unit 120 , to be displayed again.
- the storage unit 110 may store a new image for the 4-chamber view, based on the external input signal.
- the control unit 140 may control not only data stored for the observation operation but also data stored for a reference point and an observation plane, to be changed. Specifically, the control unit 140 may control the image processing unit 120 to determine a new reference point and a new observation plane. According to another embodiment of the present invention, the control unit 140 may control images stored to match all observation operations, other than an image stored to match one observation operation, to be determined again.
- the display unit 130 may display observation operations 1, 2, and 3 on a first region 1010, and images 1011, 1012, and 1013 stored to correspond to the observation operations 1, 2, and 3.
- the images 911, 912, 913, and 914 stored to correspond to the observation operations 1, 2, 3, and 4 are images of A-planes that horizontally split ultrasound volume data,
- whereas the images 1011, 1012, and 1013 stored to correspond to the observation operations 1, 2, and 3 are images of B-planes or C-planes that vertically split ultrasound volume data.
- the display unit 130 displays observation operations and images corresponding thereto, based on whether each of the observation planes corresponding to these images is an A-plane, a B-plane, or a C-plane. That is, the display unit 130 may display the images stored to correspond to the observation operations such that the images may be classified according to observation planes.
- the methods described above may be embodied as a computer program.
- the computer program may be stored in a computer-readable recording medium, and executed using a general digital computer. Data structures employed in the methods may be recorded on a computer-readable recording medium via various means. Devices that may be used to store programs including computer codes for performing various methods according to the present invention should not be understood as including temporary objects, such as carrier waves or signals. Examples of the computer-readable medium may include a magnetic recording medium (a ROM, a floppy disc, a hard disc, etc.), and an optical recording medium (a CD-ROM, a DVD, etc.).
- a conventional ultrasound apparatus cannot efficiently diagnose an object according to the skill level of a user. In particular, for an object the location and direction of which are irregular, e.g., a fetal heart, a result of diagnosing the object may often vary according to users.
- a user may be automatically provided with an image for an observation operation from ultrasound volume data. Accordingly, the object may be efficiently and conveniently diagnosed through an ultrasound examination.
- a user may easily obtain and interpret an ultrasound image of a fetal heart without cumbersome manipulations. Also, even inexperienced users are capable of efficiently obtaining an ultrasound image according to a predetermined protocol. That is, when an ultrasound image of an object, e.g., a fetal heart, is obtained and the object is diagnosed based on the ultrasound image, dependence upon the user's skill may be reduced and the rate of success improved.
Abstract
Description
- This application claims the benefit of Korean Patent Application No. 10-2012-0090898, filed on Aug. 20, 2012, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
- 1. Field of the Invention
- The present invention relates to a method and apparatus for efficiently managing an ultrasound image of a fetal heart, and a method and apparatus for displaying an ultrasound image for a user to diagnose an object.
- 2. Description of the Related Art
- An ultrasound diagnosis apparatus obtains an image of a desired internal part of an object by generating an ultrasound signal (generally, an ultrasound signal of 20 kHz or higher) by using a probe, and using information about an echo signal reflected from the desired internal part. In particular, the ultrasound diagnosis apparatus is used for medical purposes, e.g., to detect foreign substances in an object, and to measure and observe the degree of an injury of the object. The ultrasound diagnosis apparatus has been widely used together with other image diagnosis apparatuses, since it is more stable, is capable of displaying images in real time, and involves almost no exposure to radiation, compared to an X-ray examination.
- Fetal cardiac malformations account for a large proportion of fetal diseases. However, the location of a fetal heart frequently varies according to the posture of the fetus in the womb, unlike an adult's heart. Accordingly, it is very difficult for doctors who are not sufficiently trained to obtain an ultrasound image of a fetal heart.
- Many guidelines and protocols have been suggested for users of ultrasound diagnosis apparatuses to efficiently obtain ultrasound images of a fetal heart. However, such guidelines and protocols are inconvenient to use because they include operations, such as rotating, moving, expanding, and scaling down ultrasound volume data along the x-axis, y-axis, and z-axis. Also, much time and effort has to be spent performing measurements for various observation operations of observing a fetal heart.
- Thus, the present invention provides a method and apparatus for obtaining and displaying an ultrasound image of a fetal heart from ultrasound volume data. The present invention also provides a computer-readable recording medium having recorded thereon a computer program for performing the method.
- According to an aspect of the present invention, there is provided a method of managing an ultrasound image, the method including determining a reference point and an observation plane of ultrasound volume data; obtaining a plurality of images by splitting the ultrasound volume data, based on the reference point and the observation plane; and storing an image selected from among the plurality of images according to an external input signal such that the selected image matches a current observation operation from among a plurality of observation operations for observing an object.
- The obtaining of the plurality of images may include obtaining the plurality of images by splitting the ultrasound volume data such that planes that split the ultrasound volume data form a predetermined angle with the observation plane.
- The obtaining of the plurality of images may include obtaining the plurality of images by splitting the ultrasound volume data such that planes that split the ultrasound volume data intersect with respect to the reference point.
- The obtaining of the plurality of images may include obtaining the plurality of images by adjusting at least one of distances between and a total number of planes that split the ultrasound volume data.
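The adjustment of the distances between, and the total number of, splitting planes can be illustrated with a small sketch. This Python snippet is an assumption-laden illustration (the function name and the idea of indexing planes by slice position are not claimed in the application): it computes evenly spaced plane positions centred on the reference point's slice index.

```python
def split_positions(ref_index, count, spacing):
    """Slice indices of `count` parallel splitting planes, `spacing` slices
    apart and centred on the reference point's slice index.
    Integer arithmetic; for an even count the centring is approximate."""
    start = ref_index - spacing * (count - 1) // 2
    return [start + i * spacing for i in range(count)]

print(split_positions(ref_index=50, count=5, spacing=4))
# [42, 46, 50, 54, 58]
```

Increasing `count` or shrinking `spacing` would correspond to obtaining more, or more closely spaced, images from the same volume data.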
- The observation plane may be one of an A-plane, a B-plane, and a C-plane of the object, included in the ultrasound volume data.
- The storing of the image selected from among the plurality of images may include storing location information of planes that split the ultrasound volume data, together with the selected image, so as to obtain the selected image.
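Storing the splitting-plane location alongside the selected image can be sketched as a small key-value store. The class and field names below are illustrative assumptions, not the apparatus's actual interface: one selected image is kept per observation operation, together with the location information needed to re-derive the same slice from the volume data.

```python
class ImageStore:
    """Sketch of matching one selected image to each observation operation,
    with the splitting-plane location kept alongside the image."""

    def __init__(self):
        self._by_operation = {}

    def store(self, operation, image_id, plane_location):
        # plane_location lets the same slice be re-obtained from the volume.
        self._by_operation[operation] = {"image": image_id,
                                         "location": plane_location}

    def lookup(self, operation):
        return self._by_operation.get(operation)

store = ImageStore()
store.store("4-chamber view", image_id="img_0042", plane_location=("C", 40))
print(store.lookup("4-chamber view"))
# {'image': 'img_0042', 'location': ('C', 40)}
```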
- The method may further include displaying the current observation operation and an image stored to match the current observation operation together.
- The displaying of the current observation operation and the image stored to match the current observation operation together may include displaying an exemplary image of the current observation operation.
- The method may further include arranging the ultrasound volume data such that a reference region including the reference point is disposed in a determined direction.
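Arranging the volume data so that a reference region faces a determined direction amounts to a rotation about the reference point. The sketch below shows the 2-D case only and is an illustration under that assumption (a full implementation would apply a 3×3 rotation matrix to the volume coordinates, which the application does not spell out):

```python
import math

def rotate_about(point, ref, angle_deg):
    """Rotate a 2-D coordinate about a reference point. Applied per slice
    (or extended to 3-D with a rotation matrix), this reorients the data so
    a reference region faces the determined direction."""
    ang = math.radians(angle_deg)
    dx, dy = point[0] - ref[0], point[1] - ref[1]
    return (ref[0] + dx * math.cos(ang) - dy * math.sin(ang),
            ref[1] + dx * math.sin(ang) + dy * math.cos(ang))

x, y = rotate_about((1.0, 0.0), (0.0, 0.0), 90)
print(round(x, 6), round(y, 6))
# 0.0 1.0
```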
- The method may further include selecting the current observation operation from among the plurality of observation operations. The selecting of the current observation operation may include selecting the current observation operation from among the plurality of observation operations, according to a predetermined order or an external input signal.
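Selection of the current observation operation "according to a predetermined order or an external input signal" can be sketched as follows. The protocol list and function name are illustrative assumptions; the view names are taken from the examples given elsewhere in this description.

```python
PROTOCOL = ["4-chamber view", "5-chamber view", "3-vessel & trachea view",
            "LVOT/RVOT view", "aortic arch view"]

def next_operation(current, external_choice=None):
    """Pick the next observation operation: an external input signal wins;
    otherwise follow the predetermined protocol order."""
    if external_choice is not None:
        return external_choice
    i = PROTOCOL.index(current)
    return PROTOCOL[i + 1] if i + 1 < len(PROTOCOL) else None

print(next_operation("4-chamber view"))
# 5-chamber view
```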
- The method may further include selecting one of the plurality of observation operations as a new current observation operation. The method may be repeatedly performed.
- According to another aspect of the present invention, there is provided a method of managing an ultrasound image, the method including storing a plurality of pieces of setting information to respectively match a plurality of observation operations for diagnosing an object, each of the plurality of pieces of the setting information including at least one from among information about an observation plane that splits ultrasound volume data in a predetermined direction, information about a split method of splitting the ultrasound volume data, and information about a reference point; displaying a plurality of plane images for a current observation operation which is one of the plurality of observation operations, wherein the plurality of plane images are obtained based on a piece of the setting information matching the current observation operation; and storing an image selected from among the plurality of plane images according to an external input signal such that the selected image matches the current observation operation.
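The per-operation setting information described in this aspect can be sketched as a simple record type. All concrete values below are illustrative assumptions (the application does not disclose numeric coordinates); only the fields mirror the three kinds of information named above: observation plane, reference point, and split method.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ObservationSetting:
    plane: str              # 'A', 'B', or 'C'
    reference_point: tuple  # (x, y, z) in volume coordinates (illustrative)
    split_method: str       # e.g. 'horizontal' or 'vertical'

# Illustrative values only; real settings depend on the protocol in use.
SETTINGS = {
    "4-chamber view": ObservationSetting("C", (64, 64, 40), "horizontal"),
    "aortic arch view": ObservationSetting("B", (64, 60, 40), "vertical"),
}

print(SETTINGS["4-chamber view"].plane)
# C
```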
- The method may further include repeatedly performing a process of displaying a new observation operation and storing an image to match the new observation operation.
- According to another aspect of the present invention, there is provided a method of displaying an ultrasound image, the method including displaying an order of a current observation operation for diagnosing an object among a plurality of observation operations, on a first region of a screen; displaying at least one of an observation plane corresponding to the current observation operation, a reference point, and a split method of splitting ultrasound volume data, on a second region of the screen; and displaying a plurality of images of planes that split the ultrasound volume data based on the observation plane and the reference point, on a third region of the screen.
- The current observation operation may be selected from among the plurality of observation operations, according to a predetermined order or an external input signal.
- The method may further include marking an image, which is selected from among the plurality of images according to an external input signal, on the third region such that the selected image is differentiated from the other images.
- According to another aspect of the present invention, there is provided an ultrasound apparatus including a storage unit for storing ultrasound volume data; an image processing unit for determining a reference point and an observation plane for the ultrasound volume data, and obtaining a plurality of images by splitting the ultrasound volume data, based on the reference point and the observation plane; a display unit for displaying the plurality of images; and a control unit for controlling the storage unit, the image processing unit, and the display unit. The storage unit stores an image selected from among the plurality of images according to an external input signal such that the selected image matches a current observation operation from among a plurality of observation operations for observing an object.
- According to another aspect of the present invention, there is provided a computer-readable recording medium having recorded thereon a computer program for executing the method of managing an ultrasound image and the method of displaying an ultrasound image.
- The above and other features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the following attached drawings. Here, reference numerals denote structural elements.
-
FIG. 1 is a block diagram of an ultrasound apparatus according to an embodiment of the present invention; -
FIG. 2 is a flowchart illustrating a method of managing an ultrasound image according to an embodiment of the present invention; -
FIG. 3 is a flowchart illustrating a method of displaying an ultrasound image according to an embodiment of the present invention; -
FIGS. 4A and 4B are images illustrating a process of determining a reference point, performed by an ultrasound apparatus, according to an embodiment of the present invention; -
FIG. 5 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by an ultrasound apparatus, according to an embodiment of the present invention; -
FIG. 6 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by an ultrasound apparatus, according to another embodiment of the present invention; -
FIG. 7 illustrates a process of storing an image selected from among a plurality of images according to an external input signal such that the selected image may match an observation operation, according to an embodiment of the present invention; -
FIG. 8 illustrates a process of displaying an exemplary image of an observation operation, according to an embodiment of the present invention; -
FIG. 9 is a diagram illustrating a process of displaying an observation operation together with an image corresponding to the observation operation, according to an embodiment of the present invention; -
FIG. 10 is a diagram illustrating a process of displaying an observation operation together with an image corresponding to the observation operation, according to another embodiment of the present invention; and -
FIG. 11 is a diagram illustrating a process of displaying an order of a current observation operation in all of the observation operations, according to an embodiment of the present invention. - Most of the terms used herein are general terms that have been widely used in the technical art to which the present invention pertains. However, some of the terms used herein may be created to reflect the intentions of technicians in this art, precedents, or new technologies. Also, some of the terms used herein may be arbitrarily chosen by the present applicant. In this case, these terms are defined in detail below. Accordingly, the specific terms used herein should be understood based on the unique meanings thereof and the whole context of the present invention.
- In the present specification, it should be understood that the terms, such as ‘include’ or ‘have,’ etc., are intended to indicate the existence of the features, numbers, steps, actions, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, steps, actions, components, parts, or combinations thereof may exist or may be added. Also, the term ‘. . . unit’ used herein should be understood as a unit that is capable of performing at least one function or operation and that may be embodied in hardware, in software, or in a combination of hardware and software.
- As used herein, the term “at least one of,” when preceding a list of elements, modifies the entire list of elements and does not modify the individual elements of the list.
- Hereinafter, exemplary embodiments of the present invention will be described in greater detail.
-
FIG. 1 is a block diagram of an ultrasound apparatus 100 according to an embodiment of the present invention. According to an embodiment of the present invention, the ultrasound apparatus 100 may include a storage unit 110, an image processing unit 120, a display unit 130, and a control unit 140. A method of managing and displaying an ultrasound image by using elements of the ultrasound apparatus 100 will now be described in detail.
- The storage unit 110 stores ultrasound volume data. The ultrasound volume data stored in the storage unit 110 is obtained by scanning an object with an ultrasound probe. The ultrasound volume data is a three-dimensional (3D) image having a fan shape rather than a rectangular parallelepiped, according to the characteristics of the ultrasound apparatus 100. The present invention will now be described with respect to ultrasound volume data obtained by scanning a fetal heart, but the ultrasound volume data is not limited to data obtained by scanning a human body.
- Alternatively, the ultrasound volume data stored in the storage unit 110 may be obtained from a storage medium installed outside the ultrasound apparatus 100. Otherwise, the storage unit 110 may obtain and store the ultrasound volume data by using a picture archiving and communication system (PACS).
- The storage unit 110 may further store an ultrasound image. The ultrasound image stored in the storage unit 110 may be a two-dimensional (2D) image or a 3D image. If the ultrasound image is a 2D image, then the 2D image may be an image of a plane obtained by splitting the ultrasound volume data. The ultrasound image stored in the storage unit 110 may be obtained by scanning an object with the ultrasound apparatus 100 or may be received through a PACS, in a wired/wireless manner.
- The storage unit 110 may store a plurality of images obtained by the image processing unit 120, and an image selected from among the plurality of images according to an external input signal. Also, the storage unit 110 may store the selected image to match a corresponding observation operation. That is, the storage unit 110 may store images selected to correspond to observation operations such that the selected images match the observation operations, respectively.
- Hereinafter, the term ‘observation operation’ means a process of observing an object by using an observation plane. In other words, a user may diagnose the object by using an observation plane that has been predetermined for each of the observation operations. More specifically, the ultrasound apparatus 100 may obtain an image by splitting ultrasound volume data based on the observation plane for each of the observation operations, and provide the image to a user. This process will be described in detail with reference to FIGS. 5 to 8 below.
- The term ‘observation plane’ means a plane obtained by splitting volume data in a predetermined direction so as to observe an object. That is, the observation plane is a plane, the location of which varies in the volume data, according to the type of the object to be observed and an observation operation.
- Thus, the
storage unit 110 may store information regarding observation operations of observing an object and observation planes such that each of the observation operations may match a corresponding observation plane among the observation planes. For each of the observation operations, the storage unit 110 may store information regarding an observation plane and a method of splitting volume data. Also, the storage unit 110 may store information regarding an observation plane matching each of the observation operations, a reference point, and a split method, as setting information.
- A 4-chamber view for observing a fetal heart will now be described as an example of an observation operation. The storage unit 110 may store a C-plane as an observation plane for a 4-chamber view observation operation. Also, the storage unit 110 may store information regarding a method of splitting an image of a heart, which is to be observed, by a horizontal straight line, as a split method of splitting volume data on the C-plane. Furthermore, the storage unit 110 may store a center of a descending aorta, as a reference point for rotating the volume data.
- According to another embodiment of the present invention, the storage unit 110 may store location information of a plurality of planes obtained by splitting volume data, in the volume data. A method of storing various information to match an observation operation in the storage unit 110, according to an embodiment of the present invention, will be described in detail with reference to FIGS. 5 to 8 below.
- The
image processing unit 120 determines reference points, observation planes, and split methods with respect to the ultrasound volume data. Information regarding the reference points, observation planes, and split methods may match the respective observation operations. As described above, the information regarding the determined reference points, observation planes, and split methods may be stored in the storage unit 110.
- According to one embodiment of the present invention, the information regarding the reference points, observation planes, and split methods that match each of the observation operations may be determined based on information input from a user. That is, various information regarding each observation operation may be determined by the image processing unit 120, based on the ultrasound volume data and a predetermined algorithm, or may be determined according to an external input signal. This will be described in detail with reference to FIGS. 4A and 4B below.
- Also, the image processing unit 120 obtains a plurality of images by splitting the ultrasound volume data according to a split method, based on a reference point and an observation plane. Any of various split methods may be employed by the image processing unit 120 to split the ultrasound volume data so as to obtain the plurality of images, as will be described in detail with reference to FIGS. 5 and 6 below. Also, the image processing unit 120 may obtain the plurality of images by adjusting the distances between, and the total number of, the planes to be split from the ultrasound volume data.
- The
display unit 130 may display a plurality of images of planes obtained by splitting the ultrasound volume data, on a screen of the ultrasound apparatus 100. Also, the display unit 130 may display an image stored to match an observation operation from among the plurality of images, together with the observation operation. In addition, the display unit 130 may display an exemplary image of the observation operation.
- According to another embodiment of the present invention, the display unit 130 may display an observation operation for an object, which is to be observed, using ultrasound volume data, on a first region of the screen of the ultrasound apparatus 100. Also, the display unit 130 may display at least one from among a reference point, an observation plane matching the observation operation, and a split method of splitting the ultrasound volume data, on a second region of the screen. Furthermore, the display unit 130 may display a plurality of images obtained by splitting the ultrasound volume data, based on the observation plane and the reference point, on a third region of the screen. The current embodiment will be described in greater detail with reference to FIGS. 5 to 7 below.
- Also, the
display unit 130 may display an order of a current observation operation in all of the observation operations. For example, when an object is diagnosed using a total of five observation operations, the display unit 130 may indicate that the current observation operation is the second of the five observation operations. Thus, a user may easily understand the whole process, and may select a previous observation operation again through an additional user input.
- According to one embodiment of the present invention, the display unit 130 may include a plurality of modules for performing the above operations. For example, the display unit 130 may include an observation operation display module for displaying an order of a current observation operation in all of a plurality of observation operations. Also, the display unit 130 may include a split information display module for displaying at least one from among an observation plane corresponding to the current observation operation, a reference point, and a split method of splitting volume data. Furthermore, the display unit 130 may include a split screen display module for displaying a plurality of images of planes obtained by splitting the volume data.
- The display unit 130 may include at least one from among a liquid crystal display (LCD), a thin film transistor-LCD, an organic light-emitting diode (OLED) display, a flexible display, and a 3D display. Alternatively, the ultrasound apparatus 100 may include at least two display units 130 according to a structure thereof.
- The
control unit 140 controls overall operations of the ultrasound apparatus 100. Also, the control unit 140 may control the storage unit 110, the image processing unit 120, and the display unit 130 to manage and output obtained ultrasound images.
- For example, the control unit 140 may control the storage unit 110 to store an image selected for an observation operation such that the selected image may match the observation operation, and may then proceed to a subsequent observation operation. In other words, the control unit 140 may control performing of a plurality of observation operations.
- The ultrasound apparatus 100 may further include a user input unit (not shown). The user input unit receives an external input signal for controlling the ultrasound apparatus 100 from a user. For example, the user input unit may receive an external input signal for selecting an image corresponding to a current observation operation from among a plurality of images. Also, the user input unit may receive an external input signal for selecting one of a plurality of observation operations.
- If the
display unit 130 and a touch pad (not shown) form a layered structure to manufacture a touch screen, thedisplay unit 130 may act as the user input unit. In this case, thedisplay unit 130 may sense a touched location, area, and pressure of a touch input. The touch screen may sense not only a real touch but also a proximity touch. - A method of managing and displaying an ultrasound image by using the elements of the
ultrasound apparatus 100 will now be described with reference toFIGS. 2 and 3 . Each of the flowcharts illustrated inFIGS. 2 and 3 includes operations that are sequentially performed by thestorage unit 110, theimage processing unit 120, thedisplay unit 130, and thecontrol unit 140 of theultrasound apparatus 100. -
FIG. 2 is a flowchart illustrating a method of managing an ultrasound image, according to an embodiment of the present invention. In operation S210, theultrasound apparatus 100 determines an observation operation. Specifically, theultrasound apparatus 100 determines an observation operation from among a plurality of observation operations, based on a pre-input order or a user input. - More specifically, for example, when a user desires to observe all of a left atrium, a left ventricle, a right atrium, and a right ventricle of a fetal heart, a 4-chamber view may be appropriate as an observation operation. Also, a 5-chamber view, a 3-vessel & trachea view, a left/right ventricular outflow tract (LVOT/RVOT) view, or an aortic arch view may be used as an observation operation for observing the fetal heart. In addition, it would be apparent to those of ordinary skill in the art that any of various other observation operations may be used.
- In operation S220, the
ultrasound apparatus 100 displays an observation plane matching the observation operation. Specifically, theultrasound apparatus 100 may display an observation plane stored to match the observation operation determined in operation S210. - According to an embodiment of the present invention, the observation plane may be an A-plane, a B-plane, or a C-plane. The A-plane may be an observation plane of ultrasound volume data viewed from above. The B-plane may be an observation plane of the ultrasound volume data viewed from a left or right side. The C-plane may be an observation plane of the ultrasound volume data viewed from a front side. That is, the A-plane, the B-plane, and the C-plane mean a transverse plane, a sagittal plane, and a coronal plane of the ultrasound volume data, respectively. Thus, the observation plane may be one of the A-plane, the B-plane, and the C-plane of an object, included in the ultrasound volume data.
- According to another embodiment of the present invention, in operation S220, the
ultrasound apparatus 100 may display more than one observation plane. That is, theultrasound apparatus 100 may display all of the observation planes, namely, the A-plane, the B-plane, and the C-plane. - According to another embodiment of the present invention, in operation S220, the
ultrasound apparatus 100 may rotate the ultrasound volume data to obtain the observation plane. That is, theultrasound apparatus 100 may obtain the observation plane matching the observation operation by rotating the ultrasound volume data with respect to a reference point. Information regarding the reference point and the degree of rotating the ultrasound volume data may have been stored to match the observation operation. - In operation S230, the
ultrasound apparatus 100 splits the ultrasound volume data according to the reference point and a split method matching the observation operation. The reference point is a point representing a spatial location in the ultrasound volume data, and may be expressed with 3D coordinates. The reference point may also be a location on the observation plane. A process of determining the reference point will be described in detail with reference toFIG. 4 below. - Any of various split methods may be used to split the ultrasound volume data, in operation S230. For example, the
ultrasound apparatus 100 may split the ultrasound volume data, such that planes of the ultrasound volume data may intersect with one another or may be disposed apart from one another by a predetermined distance, with respect to the reference point. - For example, a case where the observation plane is the C-plane, i.e., a direction towards a front side of the ultrasound volume data, may be considered. The
ultrasound apparatus 100 may determine a plurality of split lines that split the ultrasound volume data in left and right directions of the C-plane. The plurality of split lines may be arranged such that the distances between the plurality of split lines may be the same in a vertical direction with respect to the reference point. The plurality of split lines are shown as one-directional (1D) lines on the observation plane, but may mean planes that split the ultrasound volume data. Theultrasound apparatus 100 may split the ultrasound volume data with the plurality of split lines determined as described above. - The
ultrasound apparatus 100 may obtain a plurality of images from a result of splitting the ultrasound volume data in operation S230. - In operation S240, the
ultrasound apparatus 100 displays the plurality of images obtained by splitting the ultrasound volume data. That is, theultrasound apparatus 100 may display the plurality of images as candidates of the observation operation determined in operation S210. - In operation S250, the
ultrasound apparatus 100 stores an image selected from among the plurality of images such that the selected image may match the observation operation. The stored image may be selected according to an external input signal, or an image closest to an exemplary image stored in theultrasound apparatus 100 may be selected by comparing the plurality of images with the exemplary image. - For example, if the observation operation is a 4-chamber view, a user may select one image that most exactly represents the 4-chamber view from among the plurality of images. The
ultrasound apparatus 100 may store the selected image to match the 4-chamber view. Thus, when the user selects the 4-chamber view, the user may diagnose the object, based on the selected image stored to match the 4-chamber view. The user may conveniently and efficiently diagnose the object, based on images stored to match the 4-chamber view and other various observation operations. - In operation S260, the
ultrasound apparatus 100 selects a next observation operation. As described above, the next observation operation may be selected according to an order that has been previously input to theultrasound apparatus 100, or may be selected according to an external input signal. - For example, the
ultrasound apparatus 100 may store an image matching the observation operation which is the 4-chamber view, and then select the 5-chamber view as the next observation operation according to the previously input order. Theimage processing unit 120 may determine the 4-chamber view, the 5-chamber view, the 3-vessel & trachea view, or the like, as an observation operation, based on an external input signal received via the user input unit. -
FIG. 3 is a flowchart illustrating a method of displaying an ultrasound image, according to an embodiment of the present invention. In operation S310, thedisplay unit 130 may display an observation operation of ultrasound volume data for an object that is to be observed, on the first region of the screen of theultrasound apparatus 100. In other words, thedisplay unit 130 may display the observation operation that is manually or automatically determined by theimage processing unit 120 according to an external input signal. - The displaying of the observation operation means displaying a location of a viewpoint for observing the object in the ultrasound volume data. For example, the
display unit 130 may display an image of the object, e.g., a fetal heart, and a position of the 4-chamber view which is the observation operation, as will be described in detail with reference to FIGS. 5 and 6. - In operation S320, the
display unit 130 may display at least one from among an observation plane matching the observation operation, a reference point, and a split line for splitting the ultrasound volume data, on the second region of the screen of the ultrasound apparatus 100. As described above, the split line for splitting the ultrasound volume data may be determined according to a split method stored to match the observation operation. - If the observation operation is the 4-chamber view, the
display unit 130 may display a C-plane as the observation plane, and display a plurality of split lines that split the ultrasound volume data on the observation plane. Also, the display unit 130 may display a split method of splitting the ultrasound volume data in a left direction and a right direction of the observation plane, with respect to the reference point. - In operation S330, the
display unit 130 may display a plurality of images of planes obtained by splitting the ultrasound volume data based on the split method and the reference point, on the third region of the screen of the ultrasound apparatus 100. The plurality of images may be images obtained by splitting the ultrasound volume data by the image processing unit 120. Alternatively, each of the images may be obtained by splitting the ultrasound volume data by using one of the plurality of split lines displayed on the second region. - A user may efficiently diagnose the object, based on content displayed on the first to third regions of the screen of the
ultrasound apparatus 100 by the display unit 130. That is, an ultrasound image for diagnosing the object may be conveniently obtained based on the observation operation, the observation plane, and the plurality of images of planes obtained by splitting the ultrasound volume data. -
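- The splitting described in operations S310 to S330 can be illustrated, for the simple case of axis-aligned and evenly spaced planes, by extracting slices from a 3-D volume. The even spacing and the nested-list volume representation are illustrative assumptions; in the embodiments above the planes may also be oblique.

```python
def split_positions(depth, num_slices):
    """Evenly spaced slice indices through a volume of the given depth."""
    step = depth / (num_slices + 1)
    return [round(step * (i + 1)) for i in range(num_slices)]

def split_volume(volume, num_slices):
    """Extract axis-aligned slices (2-D images) from a 3-D nested-list volume."""
    return [volume[z] for z in split_positions(len(volume), num_slices)]

# A toy 8-deep volume in which every voxel stores its own depth index.
volume = [[[z] * 4 for _ in range(4)] for z in range(8)]
images = split_volume(volume, 3)
print([img[0][0] for img in images])  # → [2, 4, 6]
```

Adjusting `num_slices` corresponds to adjusting the total number of split lines, and replacing `split_positions` with a denser or sparser spacing corresponds to adjusting the distances between them.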
FIGS. 4A and 4B are images illustrating a process of determining a reference point, performed by the ultrasound apparatus 100, according to an embodiment of the present invention. - Referring to
FIG. 4A, the display unit 130 of the ultrasound apparatus 100 may display an image 410 of an A-plane of ultrasound volume data 440, an image 420 of a B-plane of the ultrasound volume data 440, an image 430 of a C-plane of the ultrasound volume data 440, and the ultrasound volume data 440 on a left upper portion, a right upper portion, a left lower portion, and a right lower portion of a screen 400 of the ultrasound apparatus 100, respectively. The display unit 130 may output the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane of the ultrasound volume data 440 stored in the storage unit 110, to the screen 400 of the ultrasound apparatus 100. - Referring to
FIG. 4A, the image processing unit 120 may determine reference points in the ultrasound volume data 440. According to an embodiment of the present invention, the image processing unit 120 may determine the reference points based on the descending aorta AoD. That is, if the image processing unit 120 identifies the descending aorta AoD from the 4-chamber view, the centers of images of the descending aorta AoD may be automatically determined as the reference points. Alternatively, the ultrasound apparatus 100 may receive an external input signal for selecting a center of the descending aorta AoD, and the image processing unit 120 may determine the reference points accordingly. - Although
FIG. 4A illustrates a process of determining the reference points in the ultrasound volume data 440, the image processing unit 120 may determine the reference points according to any of various other methods. - The
storage unit 110 stores the determined reference points. According to an embodiment of the present invention, the storage unit 110 may store the reference points to match the observation operation. - In
FIG. 4A, a reference region 433, including the reference point 435, is shown in the image 430 of the C-plane. A process of arranging an image 450 of an A-plane, an image 460 of a B-plane, an image 470 of a C-plane, and ultrasound volume data 480 with respect to a reference region 473 will now be described with reference to FIG. 4B. - Referring to
FIG. 4B, a central point 475 and the reference region 473 are shown in the image 470 of the C-plane. FIG. 4B illustrates the image 450 of the A-plane, the image 460 of the B-plane, and the image 470 of the C-plane obtained by rotating the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane with respect to the reference point 435 (or, central point 475) by the image processing unit 120 in such a manner that the reference region 433 of FIG. 4A may be vertically disposed. - Not only a reference point but also other criteria for ultrasound volume data are needed for the
ultrasound apparatus 100 to obtain a plurality of images split from the ultrasound volume data according to an observation plane. This is because the ultrasound volume data is a 3D image. That is, the location of a plane in the 3D image cannot be exactly expressed with only one point. Thus, not only the reference point 435 but also other criteria should be determined to split the ultrasound volume data according to the observation plane. - In the current embodiment, the
image processing unit 120 determines the other criteria for the 4-chamber view of the ultrasound volume data 440 by rotating the reference region 433, including the reference point 435, to be vertically disposed. However, the present invention is not limited thereto, and any of various other methods may be used to determine the other criteria for the ultrasound volume data 440. Any of various other methods may be performed with respect to observation operations other than the 4-chamber view. - A process of arranging the
ultrasound volume data 440 and the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane, performed by the image processing unit 120, will now be described in more detail. According to an embodiment of the present invention, the image processing unit 120 may reverse a brightness value of the image 430 of the C-plane of the 4-chamber view, based on a predetermined brightness value. Then, the image processing unit 120 may determine an object to be rotated, based on an 8- or 4-connected component analysis algorithm and a skeletonization algorithm. Furthermore, the ultrasound volume data 440 may be rotated by checking an inclination angle of a determined reference region. - In a method of rotating the
ultrasound volume data 440, the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane according to another embodiment of the present invention, the brightness value of the image 430 of the C-plane may be reversed and two boundary lines may be detected according to an edge detection algorithm. Then, the boundary lines may be arranged to be vertically disposed. Alternatively, any of various other methods may be used to arrange the image 410 of the A-plane, the image 420 of the B-plane, and the image 430 of the C-plane of the ultrasound volume data 440. - The
image processing unit 120 may determine criteria for splitting the ultrasound volume data 480 to obtain a plurality of images by obtaining the reference region 473. That is, if an image of the descending aorta AoD illustrated as a reference region is vertically disposed on a C-plane, then the image processing unit 120 may obtain images for various observation operations by splitting the resultant ultrasound volume data. -
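- The arrangement step described above — checking the inclination angle of the detected reference region (e.g., the descending aorta) and rotating the data so that the region is vertically disposed — can be sketched as follows. Estimating the orientation from the principal axis of the region's pixel coordinates is an illustrative assumption; the embodiments above may equally rely on connected-component analysis, skeletonization, or edge detection.

```python
import math

def inclination_angle(points):
    """Orientation (radians from the x-axis) of the principal axis of a 2-D point set."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    syy = sum((y - my) ** 2 for _, y in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    return 0.5 * math.atan2(2 * sxy, sxx - syy)

def rotate(points, theta):
    """Rotate points about the origin by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

# A reference region lying along a 45-degree line; rotate it to be vertical.
region = [(i, i) for i in range(10)]
theta = math.pi / 2 - inclination_angle(region)
upright = rotate(region, theta)
print(abs(round(math.degrees(inclination_angle(upright)))))  # → 90
```

The same rotation angle would then be applied to the volume data and the A-, B-, and C-plane images so that subsequent split planes are taken with respect to the upright region.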
FIG. 5 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by the ultrasound apparatus 100, according to an embodiment of the present invention. A case where the image processing unit 120 determines a 4-chamber view as an observation operation will now be described. - The
display unit 130 may display the 4-chamber view as an observation operation on a first region 510 of a screen 500 of the ultrasound apparatus 100. That is, the display unit 130 may display a location of a plane corresponding to a 4-chamber view for observing a fetal heart, in the volume data. - Hereinafter, the terms ‘first region’, ‘second region’, and ‘third region’ denote a plurality of regions displayed on a screen of the
ultrasound apparatus 100, regardless of the order thereof. In other words, each of these terms may be selected for convenience of explanation, regardless of locations thereof on the screen of the ultrasound apparatus 100. - In
FIG. 5, four observation operations are displayed on the first region 510, and a current observation operation 5104, which is a 4-chamber view, is thickly displayed from among these observation operations. Here, the term ‘current observation operation’ may mean the observation operation on which a process of selecting and storing an image is performed based on previously stored information about an observation plane, a reference point, and a split method, from among these observation operations. - The
display unit 130 may display an observation plane corresponding to the current observation operation 5104 and a plurality of split lines on a second region 520 of the screen 500. The display unit 130 may further display a reference point matching the current observation operation 5104. - As described above, when the 4-chamber view is determined as an observation operation, the 4-chamber view corresponds to an image of an A-plane obtained by horizontally splitting the ultrasound volume data. That is, a plane obtained by horizontally splitting the ultrasound volume data based on the image of the C-plane is the 4-chamber view. Thus, the
display unit 130 may display the C-plane as the observation plane and the plurality of split lines that horizontally split the ultrasound volume data. - On the other hand, since the image of the A-plane may also be obtained by horizontally splitting an image of a B-plane, the
image processing unit 120 may determine a reference point based on the B-plane. In this case, the display unit 130 may display a plurality of split lines that horizontally split the image of the B-plane. - Then, the
image processing unit 120 may obtain a plurality of images by splitting the ultrasound volume data. That is, the image processing unit 120 may obtain images of planes obtained by splitting the ultrasound volume data by using the plurality of split lines matching the current observation operation 5104. - According to an embodiment of the present invention, the
image processing unit 120 may obtain a plurality of images by adjusting the distances between, or a total number of, the plurality of split lines. That is, the image processing unit 120 may obtain a plurality of images by more densely or sparsely splitting the ultrasound volume data by arbitrarily adjusting the distances between the plurality of split lines. Furthermore, the image processing unit 120 may adjust the number of images to be obtained by adjusting the total number of split lines. - Then, the
display unit 130 may display the images of the planes obtained by splitting the ultrasound volume data by using the plurality of split lines, on a third region 530 of the screen 500. For example, sixteen images displayed on the screen 500 of the ultrasound apparatus 100 are images of planes obtained by splitting the ultrasound volume data by using the plurality of split lines displayed on the second region 520. - Then, the user input unit may receive an input for selecting one of the plurality of images from a user. That is, the
display unit 130 may receive an input for selecting an image that most exactly represents the 4-chamber view from among the images of the planes obtained by splitting the ultrasound volume data, from the user. Further, the storage unit 110 may store an image selected according to an external input signal to match an observation operation. - According to an embodiment of the present invention, when the selected image is stored to match the observation operation, the
storage unit 110 may also store either information about split lines that split the ultrasound volume data or location information of a plane corresponding to the selected image, so as to obtain the selected image. For example, a case where the image selected from among the plurality of images displayed in FIG. 5 corresponds to the plane obtained by splitting the ultrasound volume data by using the last split line 5216 from among the plurality of split lines displayed on the second region 520 may be considered. In this case, the storage unit 110 may store information about the 4-chamber view which is the observation operation, the selected image, and the last split line 5216. Otherwise, the storage unit 110 may store the location information of the plane based on the last split line 5216, instead of the last split line 5216. - According to another embodiment of the present invention, the
storage unit 110 may store not only an image selected from among a plurality of images, which are obtained by the image processing unit 120, according to an external input signal, but also the other images. Thus, if the current observation operation 5104 is selected again, the display unit 130 may display the stored plurality of images again, and the image processing unit 120 thus need not split the ultrasound volume data again. According to this embodiment, the time needed for the image processing unit 120 to split the ultrasound volume data may be reduced. -
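- The caching behavior described in this embodiment — keeping all split images so that re-selecting an observation operation does not require splitting the volume data again — can be sketched as follows. The class and callback names are illustrative assumptions.

```python
class SplitCache:
    """Cache split images per observation operation so that re-selecting a
    view does not require re-splitting the volume data (illustrative sketch)."""

    def __init__(self, split_fn):
        self.split_fn = split_fn   # callable(volume, view) -> list of images
        self.cache = {}
        self.split_calls = 0       # counts actual (expensive) splits

    def images_for(self, view, volume):
        if view not in self.cache:
            self.split_calls += 1
            self.cache[view] = self.split_fn(volume, view)
        return self.cache[view]

cache = SplitCache(lambda volume, view: [f"{view}-slice-{i}" for i in range(3)])
cache.images_for("4-chamber", None)
cache.images_for("4-chamber", None)  # second request served from the cache
print(cache.split_calls)  # → 1
```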
FIG. 5 illustrates a result of splitting ultrasound volume data based on the C-plane as an observation plane and the plurality of split lines, performed by the image processing unit 120. In other words, the plurality of split lines are displayed in parallel on the C-plane. - However, although the planes that split the ultrasound volume data are displayed in parallel on the C-plane which is an observation plane, the planes may not be parallel with one another, as shown on the
first region 510. That is, each of the planes that split the ultrasound volume data may form a predetermined angle with respect to the C-plane which is an observation plane. - In other words, the planes that split the ultrasound volume data may split the ultrasound volume data to be parallel with the A-plane, but may also split the ultrasound volume data not to be parallel with the A-plane, as shown in the
first region 510. In other words, the image processing unit 120 may split the ultrasound volume data such that the planes that split the ultrasound volume data may contact the A-plane. - The ultrasound volume data may be split according to any of various methods other than the method described with reference to
FIG. 5. A method of splitting the ultrasound volume data, according to another embodiment of the present invention, will be described with reference to FIG. 6 below. -
FIG. 6 illustrates a process of obtaining a plurality of images and storing an image selected from among the plurality of images, performed by an ultrasound apparatus, according to another embodiment of the present invention. - Similar to the process of
FIG. 5, the display unit 130 may display an observation operation on a first region 610 of a screen 600 of the ultrasound apparatus 100. Similarly, the display unit 130 may display a plurality of split lines on a second region 620 of the screen 600, and may display a plurality of images obtained by splitting the ultrasound volume data by using the plurality of split lines, on a third region 630 of the screen 600. -
FIG. 6 illustrates a case where an LVOT view is determined as the observation operation for the ultrasound volume data. The LVOT view is an observation operation for observing a left ventricular outflow tract and corresponds to an image of a B-plane. The display unit 130 may display the determined observation operation on the first region 610 of the screen 600. According to an embodiment of the present invention, the display unit 130 may display an observation operation, which is determined by the image processing unit 120, thickly or using a different color, so that a user may easily distinguish the determined observation operation from other observation operations. In FIG. 6, an LVOT view 6103 is displayed thickly. - In the current embodiment, the
image processing unit 120 may determine a reference point of the ultrasound volume data from a 5-chamber view image, to correspond to the LVOT view which is the observation operation. - Also, in the current embodiment, the
image processing unit 120 may determine an A-plane as an observation plane. That is, since the LVOT view corresponds to a B-plane, an image corresponding to the LVOT view may be selected from among a plurality of images obtained by splitting the ultrasound volume data by using split lines displayed on the A-plane. - Similar to the 4-chamber view, the 5-chamber view corresponds to an A-plane and is an observation operation for observing not only a left atrium, a left ventricle, a right atrium, and a right ventricle of a fetal heart but also an aorta. In the current embodiment, the
image processing unit 120 may determine a center of an aorta observed from the 5-chamber view, as a reference point. - Then, the
image processing unit 120 may obtain a plurality of images split from the ultrasound volume data, based on the observation plane and the reference point. In other words, the image processing unit 120 may obtain a plurality of images split from the ultrasound volume data by using the plurality of split lines displayed on the second region 620 of the screen 600. - Unlike in the process of
FIG. 5, the plurality of images may be obtained by splitting the ultrasound volume data such that planes that split the ultrasound volume data may intersect with respect to the reference point, according to the process of FIG. 6. - That is, the
image processing unit 120 may split the ultrasound volume data such that the plurality of split lines intersect one another at the reference point. That is, the split lines may cross one another at the reference point. - As described above, since the ultrasound volume data is split based on the plurality of
split lines, the display unit 130 may display the images of the planes that split the ultrasound volume data, on the third region 630 of the screen 600. - A process of storing an image selected from among the plurality of images according to an external input signal so as to match the observation operation, performed by the
storage unit 110, is as described above with reference to FIG. 5. -
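- The non-parallel splitting of FIG. 6 — planes that all intersect at the reference point rather than lying parallel to one another — can be sketched in 2-D by generating split-line directions fanned about the vertical through the reference point. The symmetric spread and its parameterization are illustrative assumptions.

```python
import math

def fan_split_lines(reference_point, num_lines, spread_degrees):
    """Split lines through one reference point, fanned symmetrically about
    the vertical (illustrative sketch). Each line is (origin, unit direction)."""
    rx, ry = reference_point
    if num_lines == 1:
        angles = [0.0]
    else:
        step = spread_degrees / (num_lines - 1)
        angles = [-spread_degrees / 2 + i * step for i in range(num_lines)]
    lines = []
    for a in angles:
        theta = math.radians(90 + a)  # 90 degrees = vertical, measured from x-axis
        lines.append(((rx, ry), (math.cos(theta), math.sin(theta))))
    return lines

lines = fan_split_lines((4.0, 4.0), num_lines=5, spread_degrees=60)
print(len(lines), all(origin == (4.0, 4.0) for origin, _ in lines))  # → 5 True
```

Every line shares the reference point as its origin, which mirrors the description above: the planes intersect with respect to the reference point instead of being translated in parallel as in FIG. 5.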
FIG. 7 illustrates a process of storing an image selected from among a plurality of images according to an external input signal such that the selected image may match an observation operation, according to an embodiment of the present invention. - As described above with reference to
FIGS. 5 and 6, the display unit 130 may display a plurality of images split from ultrasound volume data, and the storage unit 110 may store an image selected from the plurality of images according to an external input signal such that the selected image may match an observation operation. Contents displayed on a first region 710, a second region 720, and a third region 730 of a screen of the ultrasound apparatus 100 by the display unit 130 are substantially the same as those described above with reference to FIGS. 5 and 6. - In the current embodiment, the
ultrasound apparatus 100 may repeatedly perform the process described above with reference to FIG. 5 or 6 with respect to a plurality of observation operations. That is, the ultrasound apparatus 100 stores the selected image to match a first observation operation from among the plurality of observation operations. - Then, the
ultrasound apparatus 100 selects a second observation operation as a next observation operation, and performs the process, which was performed with respect to the first observation operation, with respect to the second observation operation. However, in this case, a plurality of images are obtained using a reference point, an observation plane, and a split method matching the second observation operation. - After an image selected from among a plurality of images is stored to match the second observation operation, the
ultrasound apparatus 100 performs the process with respect to the other operations, e.g., a third observation operation, a fourth observation operation, and so on. As described above, an order of selecting the plurality of observation operations may be determined according to a protocol that has been input to the ultrasound apparatus 100. - For example, after a user selects an
image 7301 from among a plurality of images of a 4-chamber view 7104 and the ultrasound apparatus 100 stores the image 7301 to match the 4-chamber view 7104, a 5-chamber view 7103 may be determined as a subsequent observation operation. Then, the storage unit 110 may store an image 7302 selected from among the plurality of images based on an input received from a user, such that the image 7302 may match the 5-chamber view 7103. Similarly, the ultrasound apparatus 100 may display a plurality of images of a main pulmonary artery view 7102, and may store an image 7303 selected from among the plurality of images according to an external input signal such that the image 7303 may match the main pulmonary artery view 7102. - Alternatively, a user himself or herself may select an observation operation. After an image matching the third observation operation is stored, the
ultrasound apparatus 100 may receive an external input signal for selecting the first observation operation when the user desires to replace the image matching the first observation operation with a new image. - For example, a case where the first observation operation is a 5-chamber view and the second observation operation is an LVOT view will now be described. As described above with reference to
FIG. 5, the image processing unit 120 may use a C-plane as an observation plane to obtain a plurality of images matching the 5-chamber view which is the first observation operation. In other words, the image processing unit 120 may determine a center of a descending aorta shown in an image of the C-plane, as a reference point, and may split ultrasound volume data to obtain a plurality of A-plane images. - After the
storage unit 110 stores one of the plurality of A-plane images to match the 5-chamber view, the control unit 140 may control the image processing unit 120 to determine the LVOT view as the second observation operation, according to a predetermined order or an external input signal. When the image processing unit 120 determines the LVOT view as the second observation operation, a C-plane image may be used as an observation plane. Further, the image processing unit 120 may determine a new reference point of the C-plane image. That is, the image processing unit 120 may determine a center of an aorta shown in the C-plane image as a new reference point of the second observation operation. - Examples of an observation operation may include not only the 4-chamber view and the 5-chamber view described above but also various other observation operations. A 3-vessel & trachea view, an LVOT/RVOT view, and an aortic arch view may be used as observation operations as described above. Also, a ductal arch view, a superior vena cava (SVC) view, an inferior vena cava (IVC) view, an upper abdomen with stomach view, and the like may be used as observation operations.
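- The repetition over a protocol-defined order of observation operations, storing one selected image per operation, can be sketched as follows. The protocol list and the callbacks standing in for splitting and for the user's (or automatic) selection are illustrative assumptions.

```python
# Hypothetical protocol order; a real protocol would be input to the apparatus.
PROTOCOL = ["4-chamber view", "5-chamber view", "3-vessel & trachea view"]

def run_protocol(split_fn, select_fn):
    """Store one selected image per observation operation, in protocol order."""
    stored = {}
    for view in PROTOCOL:
        images = split_fn(view)            # images of planes for this view
        stored[view] = select_fn(view, images)
    return stored

stored = run_protocol(
    split_fn=lambda view: [f"{view}#{i}" for i in range(4)],
    select_fn=lambda view, images: images[2],  # stand-in for the user's choice
)
print(stored["5-chamber view"])  # → 5-chamber view#2
```

In the embodiments above, `split_fn` would use the reference point, observation plane, and split method stored for each observation operation, and `select_fn` would be the external input signal or the exemplary-image comparison.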
- By repeatedly performing the process described above, the
ultrasound apparatus 100 may store images selected by a user such that the selected images may automatically match various observation operations, respectively. Thus, the user may efficiently diagnose an object without having to manually process the ultrasound volume data with respect to each of the various observation operations. - According to an embodiment of the present invention, when the
ultrasound apparatus 100 performs image matching while changing observation operations, if the changed observation operations correspond to the same type of observation plane, then the display unit 130 may display an image without changing observation planes. For example, all of the 4-chamber view, the 5-chamber view, and the 3-vessel & trachea view are observation operations corresponding to an A-plane and may thus match a C-plane as an observation plane. Thus, as a current observation operation is sequentially switched to other observation operations, the ultrasound apparatus 100 may perform image display by simply switching which of the observation operations 7101, 7102, 7103, and 7104 is thickly displayed on the first region 710, while maintaining the observation plane displayed on the second region 720. According to the current embodiment, the ultrasound apparatus 100 may avoid unnecessary changing of observation planes. - In the previous embodiment, the
ultrasound apparatus 100 may determine an order of a plurality of observation operations according to the types of observation planes matching the plurality of observation operations. In other words, when the ultrasound apparatus 100 displays observation operations corresponding to the same observation plane, e.g., an A-plane, a B-plane, or a C-plane, in a consecutive order, a change in content to be displayed may be minimized. - According to another embodiment of the present invention, although not shown in
FIG. 7, the display unit 130 may display a name of an observation operation on the screen of the ultrasound apparatus 100. For example, the display unit 130 may not only display images of respective observation operations but also display names of the respective observation operations in the form of text, on the first region 710 of the screen. Thus, a user may view the images of the respective observation operations displayed on the first region 710 and may more exactly identify a current observation operation. - According to another embodiment of the present invention, the
display unit 130 may mark the images 7301, 7302, and 7303 so that they may be differentiated from the other images. That is, the display unit 130 may express the images 7301, 7302, and 7303 displayed on the third region 730 of the screen with a different chroma or color, or may mark a number on each of the images 7301, 7302, and 7303 indicating that the selected images are selected. -
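- The ordering strategy described above — displaying observation operations that share an observation plane consecutively so that the displayed observation plane changes as little as possible — can be sketched with a stable sort. The plane assignments follow the text (the 4-chamber, 5-chamber, and 3-vessel & trachea views match a C-plane as an observation plane; the LVOT view uses an A-plane); the function names are assumptions.

```python
# Observation plane per view, per the embodiments above.
VIEW_PLANE = {
    "4-chamber view": "C-plane",
    "LVOT view": "A-plane",
    "5-chamber view": "C-plane",
    "3-vessel & trachea view": "C-plane",
}

def plane_switches(order):
    """Number of observation-plane changes when views are shown in this order."""
    planes = [VIEW_PLANE[v] for v in order]
    return sum(1 for a, b in zip(planes, planes[1:]) if a != b)

def group_by_plane(order):
    """Stable reordering that keeps views sharing a plane consecutive."""
    return sorted(order, key=lambda v: VIEW_PLANE[v])

views = list(VIEW_PLANE)  # interleaved order: C-plane, A-plane, C-plane, C-plane
print(plane_switches(views), plane_switches(group_by_plane(views)))  # → 2 1
```

Because Python's `sorted` is stable, views sharing a plane keep their original relative order, so the protocol's intent within each plane group is preserved.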
FIG. 8 illustrates a process of displaying an exemplary image of an observation operation, according to an embodiment of the present invention. - In
FIG. 8, the display unit 130 may display an exemplary image of a determined observation operation 8104 on a second region 820 of a screen 800, instead of a plurality of split lines that split ultrasound volume data. In other words, the display unit 130 may display exemplary images of respective observation operations, stored in the storage unit 110, so that an image may be selected from among a plurality of images, based on an ideal ultrasound image of a selected observation operation. Here, an exemplary image may be a real image obtained from an object, or may be a simple picture as illustrated in FIG. 8. - Referring to the embodiment of
FIG. 8, the display unit 130 may display an exemplary image of a 4-chamber view on the second region 820, and a user may select an image that most exactly represents the 4-chamber view from among a plurality of images displayed on a third region 830 of the screen 800, based on the exemplary image. -
FIG. 8 illustrates that the display unit 130 displays the exemplary image on the second region 820, but split lines that split ultrasound volume data may be displayed on the second region 820 as illustrated in FIGS. 5 to 7, and the exemplary image may be displayed on a fourth region (not shown) of the screen 800. According to another embodiment of the present invention, the display unit 130 may further display a text 840, which a user may refer to, on the screen 800 of the ultrasound apparatus 100. - Before describing
FIGS. 9 and 10, an additional embodiment of the present invention will be described with reference to FIG. 11. FIG. 11 illustrates that the display unit 130 displays a screen 1100 including the first to third regions 710 to 730 shown in FIG. 7, together with a flowchart 1110 of an observation operation, according to an embodiment of the present invention. That is, the display unit 130 may display an order of a current observation operation in all of a plurality of observation operations. - For example, as illustrated in
FIG. 11, when a 4-chamber view, a 5-chamber view, and a 3-vessel & trachea view are sequentially performed, the display unit 130 may display this order. Although not shown in FIG. 11, if a current observation operation is the 5-chamber view, the display unit 130 may display a region corresponding to the 5-chamber view with a rectangle or with a different color, so that the image region may be visually differentiated from other regions. - In relation to the embodiment of
FIG. 11, the ultrasound apparatus 100 may select another observation operation according to an external input signal. If a current observation operation is the 5-chamber view and a user desires to change an image stored to match the 4-chamber view, the ultrasound apparatus 100 may receive an external input signal, e.g., an input of touching a region matching the 4-chamber view. Thus, the ultrasound apparatus 100 may switch the current observation operation from the 5-chamber view to the 4-chamber view, and store a new image of the 4-chamber view by displaying images of planes that split ultrasound volume data again. - According to the embodiment of
FIG. 11, a user may easily check an order of the current observation operation in all of the observation operations, and may arbitrarily change the order of the observation operations. -
FIGS. 9 and 10 are diagrams illustrating processes of displaying an observation operation together with an image corresponding to the observation operation, according to embodiments of the present invention. - According to an embodiment of the present invention, the
display unit 130 may display an image stored to match an observation operation, together with the observation operation. If images that are respectively selected for a plurality of observation operations according to an external input signal are stored, the display unit 130 may display the selected images together with the plurality of observation operations corresponding thereto. - Referring to the embodiment illustrated in
FIG. 9, the display unit 130 may display not only four observation operations on a first region 910 but also images stored to match the observation operations, respectively. - According to an embodiment of the present invention, unlike in
FIG. 9, the display unit 130 may expand and display only an image corresponding to an observation operation selected from among the four observation operations. - Specifically, when one of the
observation operations is selected, the display unit 130 may expand and display only the image stored to correspond to the selected observation operation, without displaying all of the images stored to match the observation operations. - According to another embodiment of the present invention, while the
display unit 130 displays an observation operation and an image corresponding thereto, the control unit 140 may control the storage unit 110 to store a new image corresponding to the observation operation, based on a received user input. That is, when a user determines that the image stored to correspond to the observation operation is not appropriate and thus desires to store a new image corresponding to the observation operation, the control unit 140 may control the display unit 130 to display a plurality of images, which are obtained for the observation operation by using the image processing unit 120, again. - For example, when a user desires to change an image stored for a 4-chamber view, the user input unit may receive an external input signal that instructs to select a new image from the user. Thus, the
control unit 140 may control a plurality of images for the 4-chamber view, which are obtained by splitting ultrasound volume data by the image processing unit 120, to be displayed again. Then, the storage unit 110 may store a new image for the 4-chamber view, based on the external input signal. - Alternatively, the
control unit 140 may control not only data stored for the observation operation but also data stored for a reference point and an observation plane, to be changed. Specifically, the control unit 140 may control the image processing unit 120 to determine a new reference point and a new observation plane. According to another embodiment of the present invention, the control unit 140 may control images stored to match all observation operations, other than images stored to match one observation operation, to be determined again. - Referring to
FIG. 10, similar to the process of FIG. 9, the display unit 130 may display observation operations in a first region 1010, and images corresponding to the observation operations in another region.
- Referring to FIG. 9, the images are displayed together with the observation operations to which they correspond, whereas referring to FIG. 10, the images corresponding to the observation operations are displayed in a region separate from the observation operations.
- According to an embodiment of the present invention, as illustrated in FIGS. 9 and 10, the display unit 130 displays observation operations and images corresponding thereto, based on whether each of the observation planes corresponding to these images is an A-plane, a B-plane, or a C-plane. That is, the display unit 130 may display the images stored to correspond to the observation operations such that the images are classified according to observation planes.
- The methods described above may be embodied as a computer program. The computer program may be stored in a computer-readable recording medium and executed using a general digital computer. Data structures employed in the methods may be recorded on a computer-readable recording medium via various means. Devices that may be used to store programs including computer code for performing the various methods according to the present invention should not be understood as including temporary objects, such as carrier waves or signals. Examples of the computer-readable medium include magnetic recording media (a ROM, a floppy disc, a hard disc, etc.) and optical recording media (a CD-ROM, a DVD, etc.).
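The classification of stored images by observation plane described above can be sketched roughly as follows. This is an illustrative assumption, not the patent's implementation: the record layout, the view names, and the plane assignments are made up for the example.

```python
from collections import defaultdict

def group_by_plane(records):
    """Group (observation operation, plane, image) records by observation plane."""
    groups = defaultdict(list)
    for operation, plane, image in records:
        groups[plane].append((operation, image))
    return dict(groups)

# Hypothetical stored images; the plane assignments are illustrative only.
records = [
    ("4-chamber view", "A-plane", "img_4ch"),
    ("5-chamber view", "A-plane", "img_5ch"),
    ("3-vessel view", "B-plane", "img_3v"),
]
grouped = group_by_plane(records)
# A display unit could then render one region (or row) per observation plane.
```

Under this sketch, the display step simply iterates over `grouped` and lays out each plane's images together, which mirrors the A-plane/B-plane/C-plane classification the paragraph describes.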
- There are cases where a conventional ultrasound apparatus cannot efficiently diagnose an object, depending on the skill level of the user. In particular, for an object whose location and orientation are irregular, e.g., a fetal heart, the result of diagnosing the object may often vary from user to user. However, according to the above embodiments, a user may automatically be provided with an image for an observation operation from ultrasound volume data. Accordingly, the object may be efficiently and conveniently diagnosed through an ultrasound examination.
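The store-and-replace flow described earlier (one image stored per observation operation, with the candidate slices re-displayed on user input so a new image can be stored) might look roughly like this. The names `split_volume` and `replace_stored_image`, and the dictionary standing in for the storage unit 110, are assumptions for illustration only.

```python
def split_volume(volume_data, n_slices=5):
    """Stand-in for the image processing unit splitting volume data into slices."""
    return [f"{volume_data}_slice_{i}" for i in range(n_slices)]

def replace_stored_image(stored, operation, volume_data, chosen_index):
    """Re-display the candidate slices for one operation and store the user's pick."""
    candidates = split_volume(volume_data)        # candidates shown again to the user
    stored[operation] = candidates[chosen_index]  # external input signal selects one
    return stored

# The user rejects the initially stored 4-chamber view and picks slice 2 instead.
stored = {"4-chamber view": "initial_image"}
replace_stored_image(stored, "4-chamber view", "fetal_heart_volume", chosen_index=2)
```

In the apparatus itself the candidates would be image planes cut from the ultrasound volume data rather than strings, but the control flow (re-display, select, overwrite) is the same.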
- As described above, a user may easily obtain and interpret an ultrasound image of a fetal heart without cumbersome manipulations. Also, even inexperienced users are capable of efficiently obtaining an ultrasound image according to a predetermined protocol. That is, when an ultrasound image of an object, e.g., a fetal heart, is obtained and the object is diagnosed based on the ultrasound image, dependence upon the skill of the user may be reduced and the rate of success may be improved.
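The FIG. 9 display behavior noted above (showing every protocol view, but expanding only the image for a selected observation operation) could be sketched as follows. The function name and view names are illustrative assumptions.

```python
def images_to_display(stored_images, selected=None):
    """Return every stored image, or only the one for the selected operation."""
    if selected is None:
        return dict(stored_images)              # no selection: show all operations
    return {selected: stored_images[selected]}  # expand only the selected one

stored_images = {"4-chamber view": "img_a", "aortic arch view": "img_b"}
all_views = images_to_display(stored_images)
only_4ch = images_to_display(stored_images, "4-chamber view")
```

A selection made via the user input unit would switch the display from the `all_views` layout to the single expanded image, matching the behavior attributed to the display unit 130.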
- While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The exemplary embodiments should be considered in a descriptive sense only and not for purposes of limitation. Therefore, the scope of the invention is defined not by the detailed description of the invention but by the appended claims, and all differences within the scope will be construed as being included in the present invention.
Claims (29)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0090898 | 2012-08-20 | ||
KR1020120090898A KR20140024190A (en) | 2012-08-20 | 2012-08-20 | Method for managing and displaying ultrasound image, and apparatus thereto |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140050381A1 true US20140050381A1 (en) | 2014-02-20 |
US9332965B2 US9332965B2 (en) | 2016-05-10 |
Family
ID=49035271
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/971,482 Active 2034-04-29 US9332965B2 (en) | 2012-08-20 | 2013-08-20 | Method and apparatus for managing and displaying ultrasound image according to an observation operation |
Country Status (5)
Country | Link |
---|---|
US (1) | US9332965B2 (en) |
EP (1) | EP2700364B1 (en) |
JP (1) | JP2014036863A (en) |
KR (1) | KR20140024190A (en) |
CN (1) | CN103622722B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106999146B (en) | 2014-11-18 | 2020-11-10 | C·R·巴德公司 | Ultrasound imaging system with automatic image rendering |
CN107106124B (en) | 2014-11-18 | 2021-01-08 | C·R·巴德公司 | Ultrasound imaging system with automatic image rendering |
US10182537B1 (en) | 2016-01-14 | 2019-01-22 | Just Greens, Llc | Rotomolded vertical farming apparatus and system |
JP7080590B2 (en) * | 2016-07-19 | 2022-06-06 | キヤノンメディカルシステムズ株式会社 | Medical processing equipment, ultrasonic diagnostic equipment, and medical processing programs |
WO2020107144A1 (en) * | 2018-11-26 | 2020-06-04 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic diagnostic apparatus and method thereof for quickly distinguishing section, and storage medium |
CN111248941A (en) * | 2018-11-30 | 2020-06-09 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic image display method, system and equipment |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050251036A1 (en) * | 2003-04-16 | 2005-11-10 | Eastern Virginia Medical School | System, method and medium for acquiring and generating standardized operator independent ultrasound images of fetal, neonatal and adult organs |
US20070167704A1 (en) * | 1998-02-13 | 2007-07-19 | Britton Chance | Transabdominal examination, monitoring and imaging of tissue |
US20070255139A1 (en) * | 2006-04-27 | 2007-11-01 | General Electric Company | User interface for automatic multi-plane imaging ultrasound system |
US20110181590A1 (en) * | 2007-02-22 | 2011-07-28 | Tomtec Imaging Systems Gmbh | Method and apparatus for representing 3d image records in 2d images |
US20110213249A1 (en) * | 2010-03-01 | 2011-09-01 | Yamaguchi University | Ultrasonic diagnostic apparatus |
US20110224546A1 (en) * | 2010-03-10 | 2011-09-15 | Medison Co., Ltd. | Three-dimensional (3d) ultrasound system for scanning object inside human body and method for operating 3d ultrasound system |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3410404B2 (en) | 1999-09-14 | 2003-05-26 | アロカ株式会社 | Ultrasound diagnostic equipment |
JP2001145631A (en) | 1999-11-22 | 2001-05-29 | Aloka Co Ltd | Ultrasonic diagnostic device |
MXPA05011120A (en) * | 2003-04-16 | 2005-12-15 | Eastern Virginia Medical School | System and method for generating operator independent ultrasound images. |
KR100751852B1 (en) * | 2003-12-31 | 2007-08-27 | 주식회사 메디슨 | Apparatus and method for displaying slices of a target object utilizing 3 dimensional ultrasound data thereof |
JP4865575B2 (en) | 2007-01-17 | 2012-02-01 | 日立アロカメディカル株式会社 | Ultrasonic diagnostic equipment |
US9131918B2 (en) * | 2008-12-02 | 2015-09-15 | Samsung Medison Co., Ltd. | 3-dimensional ultrasound image provision using volume slices in an ultrasound system |
KR101117003B1 (en) | 2008-12-02 | 2012-03-19 | 삼성메디슨 주식회사 | Ultrasound system and method of providing 3-dimensional ultrasound images using volume slices |
JP2010148828A (en) | 2008-12-26 | 2010-07-08 | Toshiba Corp | Ultrasonic diagnostic device and control program of ultrasonic diagnostic device |
EP2238913A1 (en) * | 2009-04-01 | 2010-10-13 | Medison Co., Ltd. | 3-dimensional ultrasound image provision using volume slices in an ultrasound system |
KR101120726B1 (en) * | 2009-08-27 | 2012-04-12 | 삼성메디슨 주식회사 | Ultrasound system and method of providing a plurality of slice plane images |
JP5586203B2 (en) * | 2009-10-08 | 2014-09-10 | 株式会社東芝 | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing program |
JP5606832B2 (en) | 2010-03-05 | 2014-10-15 | 富士フイルム株式会社 | Image diagnosis support apparatus, method, and program |
RU2584127C2 (en) * | 2010-03-23 | 2016-05-20 | Конинклейке Филипс Электроникс Н.В. | Volumetric ultrasound image data reformatted as image plane sequence |
JP5631629B2 (en) * | 2010-05-17 | 2014-11-26 | 株式会社東芝 | Ultrasonic image processing apparatus and ultrasonic diagnostic apparatus |
JP5803909B2 (en) | 2010-12-24 | 2015-11-04 | コニカミノルタ株式会社 | Ultrasonic image generation apparatus and image generation method |
KR20140024190A (en) * | 2012-08-20 | 2014-02-28 | 삼성메디슨 주식회사 | Method for managing and displaying ultrasound image, and apparatus thereto |
- 2012
  - 2012-08-20 KR KR1020120090898A patent/KR20140024190A/en active Application Filing
- 2013
  - 2013-08-01 EP EP13178932.3A patent/EP2700364B1/en active Active
  - 2013-08-19 JP JP2013169821A patent/JP2014036863A/en active Pending
  - 2013-08-20 US US13/971,482 patent/US9332965B2/en active Active
  - 2013-08-20 CN CN201310363393.8A patent/CN103622722B/en active Active
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9332965B2 (en) * | 2012-08-20 | 2016-05-10 | Samsung Medison Co., Ltd. | Method and apparatus for managing and displaying ultrasound image according to an observation operation |
US20150302638A1 (en) * | 2012-11-20 | 2015-10-22 | Koninklijke Philips N.V | Automatic positioning of standard planes for real-time fetal heart evaluation |
US9734626B2 (en) * | 2012-11-20 | 2017-08-15 | Koninklijke Philips N.V. | Automatic positioning of standard planes for real-time fetal heart evaluation |
US10410409B2 (en) | 2012-11-20 | 2019-09-10 | Koninklijke Philips N.V. | Automatic positioning of standard planes for real-time fetal heart evaluation |
US9690456B2 (en) * | 2013-09-25 | 2017-06-27 | Samsung Electronics Co., Ltd. | Method for controlling window and electronic device for supporting the same |
US20150089442A1 (en) * | 2013-09-25 | 2015-03-26 | Samsung Electronics Co., Ltd. | Method for controlling window and electronic device for supporting the same |
KR102245202B1 (en) | 2014-03-17 | 2021-04-28 | 삼성메디슨 주식회사 | The method and apparatus for changing at least one of direction and position of plane selection line based on a predetermined pattern |
US20150262353A1 (en) * | 2014-03-17 | 2015-09-17 | Samsung Medison Co., Ltd. | Method and apparatus for changing at least one of direction and position of plane selection line based on pattern |
KR20150108226A (en) * | 2014-03-17 | 2015-09-25 | 삼성메디슨 주식회사 | The method and apparatus for changing at least one of direction and position of plane selection line based on a predetermined pattern |
US9747686B2 (en) * | 2014-03-17 | 2017-08-29 | Samsung Medison Co., Ltd. | Method and apparatus for changing at least one of direction and position of plane selection line based on pattern |
US11826198B2 (en) * | 2015-11-11 | 2023-11-28 | Samsung Medison Co. Ltd. | Ultrasound diagnosis apparatus and method of operating the same |
US20190015076A1 (en) * | 2015-12-21 | 2019-01-17 | Koninklijke Philips N.V. | Ultrasound imaging apparatus and ultrasound imaging method for inspecting a volume of a subject |
US20180021019A1 (en) * | 2016-07-20 | 2018-01-25 | Samsung Medison Co., Ltd. | Ultrasound imaging apparatus and control method for the same |
US11020091B2 (en) * | 2016-07-20 | 2021-06-01 | Samsung Medison Co., Ltd. | Ultrasound imaging apparatus and control method for the same |
US20200205772A1 (en) * | 2017-04-27 | 2020-07-02 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasonic detection method and ultrasonic imaging system for fetal heart |
US11534133B2 (en) * | 2017-04-27 | 2022-12-27 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasonic detection method and ultrasonic imaging system for fetal heart |
CN110087551A (en) * | 2017-04-27 | 2019-08-02 | 深圳迈瑞生物医疗电子股份有限公司 | A kind of fetal rhythm supersonic detection method and ultrasonic image-forming system |
Also Published As
Publication number | Publication date |
---|---|
EP2700364A3 (en) | 2014-07-02 |
KR20140024190A (en) | 2014-02-28 |
CN103622722A (en) | 2014-03-12 |
JP2014036863A (en) | 2014-02-27 |
CN103622722B (en) | 2018-02-06 |
EP2700364A2 (en) | 2014-02-26 |
EP2700364B1 (en) | 2017-03-29 |
US9332965B2 (en) | 2016-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9332965B2 (en) | Method and apparatus for managing and displaying ultrasound image according to an observation operation | |
JP5702922B2 (en) | An ultrasound system for visualizing an ultrasound probe on an object | |
US8942460B2 (en) | Medical image processing apparatus that normalizes a distance between an inner wall and outer wall of the myocardial region | |
US10121272B2 (en) | Ultrasonic diagnostic apparatus and medical image processing apparatus | |
US9220482B2 (en) | Method for providing ultrasound images and ultrasound apparatus | |
JP2007296334A (en) | User interface and method for displaying information in ultrasonic system | |
JP2007296335A (en) | User interface and method for specifying related information displayed in ultrasonic system | |
JP2010233961A (en) | Image processor and image processing method | |
EP2679158B1 (en) | Method and apparatus for displaying ultrasonic image and information related to the ultrasonic image | |
US20150117729A1 (en) | Polyp detection apparatus and method of operating the same | |
JP2014178458A (en) | Mobile display device for medical images | |
EP2601637B1 (en) | System and method for multi-modality segmentation of internal tissue with live feedback | |
US8636662B2 (en) | Method and system for displaying system parameter information | |
CN108269292B (en) | Method and device for generating two-dimensional projection images from three-dimensional image data sets | |
KR101517752B1 (en) | Diagnosis image apparatus and operating method thereof | |
EP2921114B1 (en) | Method and apparatus for changing a direction or position of plane selection line based on pattern | |
CN106028946B (en) | System for monitoring lesion size trend and method of operation thereof | |
JP2011251113A (en) | Three-dimensional ultrasonograph and method for operating the same | |
JP2010259536A (en) | Image processor, and method for controlling the same | |
EP2674108A1 (en) | Ultrasound diagnosis method and apparatus using electrocardiogram | |
JP2019162314A (en) | Information processing apparatus, information processing method, and program | |
JP6740051B2 (en) | Ultrasonic diagnostic device, medical image processing device, and medical image processing program | |
JP5693412B2 (en) | Image processing apparatus and image processing method | |
KR101545520B1 (en) | Method for displaying ultrasound image, and apparatus thereto | |
EP2807977B1 (en) | Ultrasound diagnosis method and aparatus using three-dimensional volume data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG MEDISON CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, SUNG-MO;KIM, SUNG-YOON;AHN, MI-JEOUNG;AND OTHERS;REEL/FRAME:031046/0597 Effective date: 20130819 |
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |