KR102030567B1 - Ultrasound system and method for displaying ultrasound images - Google Patents

Ultrasound system and method for displaying ultrasound images

Info

Publication number
KR102030567B1
KR102030567B1 KR1020150185082A
Authority
KR
South Korea
Prior art keywords
ultrasound
mode image
signal
processor
image
Prior art date
Application number
KR1020150185082A
Other languages
Korean (ko)
Other versions
KR20170075435A (en)
Inventor
김동열
Original Assignee
Siemens Medical Solutions USA, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Medical Solutions USA, Inc.
Priority to KR1020150185082A priority Critical patent/KR102030567B1/en
Publication of KR20170075435A publication Critical patent/KR20170075435A/en
Application granted granted Critical
Publication of KR102030567B1 publication Critical patent/KR102030567B1/en

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 — Diagnostic techniques
    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 — Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 — Displaying means of special interest
    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 — Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 — Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data

Abstract

An ultrasound system and a method for displaying an ultrasound image are disclosed. The ultrasound system includes an ultrasound probe, a processor, a storage unit and a display unit. The ultrasound probe transmits a first ultrasound signal to an object, receives a first ultrasound echo signal reflected from the object, and applies a compressive force to the object. The processor forms a B mode image of the object based on the first ultrasound echo signal, and determines the stiffness of the object based on the change of the object caused by the compressive force. The storage unit stores the B mode image as a previous B mode image. The ultrasound probe transmits a second ultrasound signal to the object and receives a second ultrasound echo signal reflected from the object. The processor forms a new B mode image of the object based on the second ultrasound echo signal. The display unit displays a reference image, which includes the previous B mode image and the stiffness, together with the new B mode image.

Description

ULTRASOUND SYSTEM AND METHOD FOR DISPLAYING ULTRASOUND IMAGES

The present disclosure relates to an ultrasound system, and more particularly, to an ultrasound system and method for displaying an ultrasound image.

Ultrasound systems are widely used in the medical field to obtain information about an object of interest within an object. The ultrasound system can provide high-resolution images of the object of interest in real time using high-frequency sound waves, without the surgical incision that would otherwise be required to observe it directly. These non-invasive, non-destructive properties make ultrasound systems very important in the medical field.

The ultrasound system provides a B mode (brightness mode) image, in which the reflection coefficient of the ultrasound signal reflected from the object of interest (that is, the ultrasound echo signal) is displayed as a two-dimensional image. In a B mode image, the reflection coefficient of the ultrasound signal is represented by the brightness of each pixel on the screen. However, the reflection coefficients of abnormal tissues such as tumors, cancers, and diseased tissues often differ little from those of normal tissue, so abnormal tissue can be difficult to observe in a B mode image.

Ultrasound systems therefore provide elastic imaging methods, which image mechanical properties of abnormal tissue that cannot be observed in B mode imaging. Elastic imaging exploits the fact that the elasticity of abnormal tissue generally differs from that of normal tissue, which greatly helps the diagnosis of lesions. For example, abnormal tissues such as tumors and cancers are harder than normal tissue, so they deform less than normal tissue when a compressive force of the same magnitude is applied from the outside. Elastic imaging uses this phenomenon: under the same external compressive force, hard tissue deforms little while soft tissue changes shape easily.

One such elastic imaging method is shear wave elasticity imaging (SWEI) using acoustic radiation force impulse (ARFI). SWEI measures the stiffness of the object of interest by transmitting a push pulse to the object to form a shear wave at the object of interest, and then transmitting tracking pulses to the object to measure the velocity of the shear wave formed by the push pulse. SWEI is classified as quantitative elastic imaging and is also called virtual touch quantification (VTQ).
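The disclosure does not specify how a measured shear-wave velocity maps to a stiffness value, but a relation commonly used in the SWEI literature, under the assumption of an incompressible, isotropic elastic medium, is E = 3ρc² (Young's modulus from density and shear-wave speed). The following is a minimal illustrative sketch, not part of the claimed method; the tissue density value is an assumption.

```python
# Illustrative sketch: shear-wave speed -> stiffness (Young's modulus).
# E = 3 * rho * c^2 holds for an incompressible, isotropic elastic medium;
# rho ~ 1000 kg/m^3 is a typical soft-tissue assumption.

def youngs_modulus_kpa(shear_speed_m_s: float, density_kg_m3: float = 1000.0) -> float:
    """Return Young's modulus in kPa from the shear-wave speed in m/s."""
    return 3.0 * density_kg_m3 * shear_speed_m_s ** 2 / 1000.0

# Soft tissue (~1 m/s) versus a stiffer lesion (~3 m/s)
print(youngs_modulus_kpa(1.0))  # 3.0 (kPa)
print(youngs_modulus_kpa(3.0))  # 27.0 (kPa)
```

Because stiffness grows with the square of the speed, even modest velocity differences between normal and abnormal tissue separate clearly in the stiffness estimate.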

In general, the shear-wave velocity measured by VTQ can vary with environmental factors such as the positions chosen for transmitting the push pulse and the tracking pulses, the user's skill, and the like. For example, the stiffness measured in a region of excitation (ROE) can change depending on where the ROE is set in the B mode image of the object. Therefore, to measure the stiffness of the object (that is, the object of interest) accurately, it is important to set the ROE at the same position in B mode images of the same object.

Japanese Laid-Open Patent Publication No. 2015-131097
Japanese Laid-Open Patent Publication No. 2011-115456

The present disclosure provides an ultrasound system and method for displaying a reference image, which includes a previous B mode image of an object and the stiffness of the object, together with a new B mode image of the object.

In one embodiment, the ultrasound system includes an ultrasound probe, a processor, a storage unit and a display unit. The ultrasound probe is configured to transmit a first ultrasound signal to an object, receive a first ultrasound echo signal reflected from the object, and apply a compressive force to the object. The processor is configured to form a B mode image based on the first ultrasound echo signal, and to determine the stiffness of the object based on the change of the object caused by the compressive force. The storage unit stores the B mode image as a previous B mode image. The display unit is configured to display the B mode image. The ultrasound probe is further configured to transmit a second ultrasound signal to the object and to receive a second ultrasound echo signal reflected from the object. The processor is further configured to form a new B mode image based on the second ultrasound echo signal. The display unit is further configured to display a reference image, which includes the previous B mode image and the stiffness, together with the new B mode image.

In another exemplary embodiment, a method of displaying an ultrasound image of an object includes: transmitting a first ultrasound signal to the object and receiving a first ultrasound echo signal reflected from the object to form a B mode image of the object; storing the B mode image as a previous B mode image; determining a stiffness of the object based on a change of the object due to a compressive force applied to the object; transmitting a second ultrasound signal to the object and receiving a second ultrasound echo signal reflected from the object to form a new B mode image of the object; and displaying a reference image, which includes the previous B mode image and the stiffness, together with the new B mode image.

According to the present disclosure, a reference image including the previous B mode image of the object and the stiffness of the object can be displayed together with the new B mode image of the object. By displaying the reference image and the new B mode image together, the user can set the ROE at the same position in the new B mode image, based on the region of excitation (ROE) set in the previous B mode image of the reference image. This improves the accuracy of the stiffness measured in the ROE.

FIG. 1 is a block diagram schematically showing the configuration of an ultrasound system according to an embodiment of the present disclosure.
FIG. 2 is an explanatory diagram showing a region of excitation (ROE) according to an embodiment of the present disclosure.
FIG. 3 is a block diagram schematically illustrating a configuration of a processor according to an embodiment of the present disclosure.
FIG. 4 illustrates an example of transmitting a second ultrasound signal to an object according to an embodiment of the present disclosure.
FIG. 5 illustrates an example of transmitting a third ultrasound signal to an object according to an embodiment of the present disclosure.
FIG. 6 illustrates an example of displaying a reference image together with a new B mode image according to an embodiment of the present disclosure.
FIG. 7 illustrates an example of displaying a reference image and thumbnail images together with a new B mode image according to an embodiment of the present disclosure.
FIG. 8 is a flowchart illustrating a procedure of displaying an ultrasound image according to an embodiment of the present disclosure.

Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. The term "unit" used in these embodiments means software or a hardware component such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). However, a "unit" is not limited to hardware or software. A "unit" may be configured to reside on an addressable storage medium and may be configured to execute on one or more processors. Thus, as examples, a "unit" includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, properties, procedures, subroutines, program code segments, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables. The functionality provided within components and "units" may be combined into a smaller number of components and "units" or further separated into additional components and "units".

FIG. 1 is a block diagram schematically showing the configuration of an ultrasound system 100 according to an embodiment of the present disclosure. The ultrasound system 100 includes a control panel 110, an ultrasound probe 120, a processor 130, a storage unit 140, and a display unit 150. In the present embodiment, the processor 130 controls the control panel 110, the ultrasound probe 120, the storage unit 140, and the display unit 150.

The control panel 110 receives input information from a user and transmits the received input information to the processor 130. The control panel 110 may include an input device (not shown) that provides an interface between the user and the ultrasound system 100. The input device may include input units, such as a trackball, a keyboard, and buttons, suitable for operations such as selecting a diagnostic mode, controlling a diagnostic operation, entering commands required for diagnosis, manipulating signals, and controlling output.

In one embodiment, the input information includes first input information that selects B mode as the diagnostic mode. For example, the B mode is a mode for obtaining a B mode image of an object.

In another embodiment, the input information includes second input information for setting a region of interest at a predetermined position in an ultrasound image (eg, a B mode image) of the object. For example, as shown in FIG. 2, the second input information may set a region of excitation (ROE) 220 as the region of interest at a predetermined position in the B mode image 210 of the object. In FIG. 2, reference numeral 230 denotes an object of interest in the object. The object of interest 230 may include a liver, a fat layer, and the like.

In another embodiment, the input information may further include third input information for selecting the quantitative elasticity mode as the diagnostic mode. For example, the quantitative elasticity mode may be a mode for determining the stiffness of the object (ie, the object of interest 230) in the ROE 220.

In another embodiment, the input information may further include fourth input information for selecting clinical information of the object. The clinical information may include the ultrasound velocity in the object of interest 230. For example, the ultrasound velocity in the liver may be 1570 m/s, and the ultrasound velocity in a fat layer may be 1450 m/s.

The ultrasound probe 120 includes an ultrasound transducer (not shown) configured to convert between electrical signals and ultrasound signals. The ultrasound probe 120 transmits an ultrasound signal to the object and receives the ultrasound signal reflected from the object (that is, the ultrasound echo signal) to form an electrical signal (hereinafter, a "reception signal").

The processor 130 may control the ultrasound probe 120 to transmit the ultrasound signal to the object and to receive the ultrasound echo signal reflected from the object in response to the input information received through the control panel 110. In addition, the processor 130 may form a plurality of ultrasound images and elastic information of the object based on the received signal provided from the ultrasound probe 120.

The storage 140 sequentially stores the received signal formed by the ultrasonic probe 120 for each frame. In addition, the storage 140 sequentially stores ultrasound images (eg, B mode images) formed by the processor 130. In addition, the storage 140 may store instructions for operating the ultrasound system 100.

The display unit 150 displays the plurality of ultrasound images (for example, B mode images) formed by the processor 130. In addition, the display unit 150 displays the elastic information formed by the processor 130. The display unit 150 may also display suitable information about the ultrasound image or the ultrasound system 100.

FIG. 3 is a block diagram schematically illustrating a configuration of the processor 130 according to an embodiment of the present disclosure. The processor 130 includes a transmitter 310. The transmitter 310 forms an electrical signal (hereinafter, a "transmission signal") for acquiring ultrasound data corresponding to each of a plurality of frames (for example, B mode images). In addition, the transmitter 310 forms a transmission signal for determining the stiffness of the object (that is, the object of interest 230) in the ROE 220.

In one embodiment, based on the first input information provided from the control panel 110, the transmitter 310 forms a transmission signal (hereinafter, a "first transmission signal") for obtaining the ultrasound data of each of the plurality of frames (that is, B mode images). The first transmission signal is provided to the ultrasound probe 120. The ultrasound probe 120 converts the first transmission signal into an ultrasound signal (hereinafter, a "first ultrasound signal") and transmits the first ultrasound signal to the object. The ultrasound probe 120 receives the ultrasound echo signal reflected from the object and forms a reception signal (hereinafter, a "first reception signal").

In another exemplary embodiment, in response to the third input information provided from the control panel 110, the transmitter 310 forms a transmission signal (hereinafter, a "second transmission signal") for applying a compressive force to the object, and a transmission signal (hereinafter, a "third transmission signal") for detecting the change of the object due to the compressive force. The compressive force may include an acoustic radiation force, and the change of the object may include the velocity of the shear wave formed by the compressive force.

For example, the transmitter 310 forms the second transmission signal for applying a compressive force (that is, an acoustic radiation force) to the object in response to the third input information. The second transmission signal may be focused at a predetermined position spaced apart from the ROE 220 (see the center of FIG. 4). The second transmission signal is provided to the ultrasound probe 120. The ultrasound probe 120 converts the second transmission signal into an ultrasound signal (hereinafter, a "second ultrasound signal") and transmits the second ultrasound signal to the object. The second ultrasound signal may be a push pulse signal, that is, a sinusoidal signal having a predetermined frequency.

After a preset time, the transmitter 310 forms the third transmission signal for detecting the change of the object (that is, the velocity of the shear wave) due to the compressive force. As an example, the third transmission signal may be focused on the ROE 220, as shown in FIG. 5. The third transmission signal is provided to the ultrasound probe 120. The ultrasound probe 120 converts the third transmission signal into an ultrasound signal (hereinafter, a "third ultrasound signal") and transmits the third ultrasound signal to the object. The third ultrasound signal may be a tracking pulse signal. The ultrasound probe 120 receives the ultrasound echo signal (that is, the echo pulse signal) reflected from the object and forms a reception signal (hereinafter, a "second reception signal").

Referring back to FIG. 3, the processor 130 further includes a transmit/receive switch 320 and a receiver 330. The transmit/receive switch 320 serves as a duplexer that switches between the transmitter 310 and the receiver 330. That is, as the ultrasound probe 120 alternates between transmitting and receiving, the transmit/receive switch 320 electrically connects the transmitter 310 or the receiver 330, as appropriate, to the ultrasound probe 120 (that is, to the ultrasound transducer).

In the processor 130, the receiver 330 may be configured to amplify the reception signal received from the ultrasound probe 120 through the transmit/receive switch 320 and to convert the amplified reception signal into a digital signal. The receiver 330 may include a time gain compensation (TGC) unit (not shown) for compensating for the attenuation that occurs as the ultrasound signal passes through the inside of the object, an analog-to-digital conversion unit (not shown) for converting the analog signal into a digital signal, and the like.
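The TGC idea above can be sketched as a depth-dependent gain curve: later samples correspond to deeper reflectors and have suffered more round-trip attenuation, so they are amplified more. This is an illustrative sketch only; the attenuation coefficient (0.5 dB/cm/MHz), center frequency, sound speed, and sampling rate are assumptions, not values from this disclosure.

```python
# Illustrative sketch of time gain compensation (TGC): compute a per-sample
# linear gain that offsets round-trip ultrasound attenuation with depth.
import numpy as np

def tgc_gain(n_samples: int, fs_hz: float, alpha_db_cm_mhz: float = 0.5,
             f_mhz: float = 5.0, c_m_s: float = 1540.0) -> np.ndarray:
    """Per-sample linear gain compensating round-trip attenuation."""
    t = np.arange(n_samples) / fs_hz                      # receive time (s)
    depth_cm = c_m_s * t / 2.0 * 100.0                    # one-way depth (cm)
    atten_db = 2.0 * alpha_db_cm_mhz * f_mhz * depth_cm   # round-trip loss (dB)
    return 10.0 ** (atten_db / 20.0)                      # dB -> linear gain

gain = tgc_gain(1024, fs_hz=40e6)
print(gain[0])  # 1.0 at the surface; gain grows monotonically with depth
```

Applying `gain * rx_signal` equalizes echo brightness across depth before the analog-to-digital data reaches the signal processor.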

In one embodiment, the receiver 330 amplifies the first reception signal provided from the ultrasound probe 120 and converts the amplified first reception signal into a digital signal (hereinafter, a "first digital signal"). In addition, the receiver 330 amplifies the second reception signal provided from the ultrasound probe 120 and converts the amplified second reception signal into a digital signal (hereinafter, a "second digital signal").

The processor 130 further includes a signal processor 340. The signal processor 340 performs beamforming on the digital signal provided from the receiver 330 to form a reception focusing signal. In addition, the signal processor 340 forms ultrasound data based on the reception focusing signal.

In one embodiment, the signal processor 340 performs reception focusing on the first digital signal provided from the receiver 330 to form a reception focusing signal (hereinafter, a "first reception focusing signal"), and forms ultrasound data (hereinafter, "first ultrasound data") based on the first reception focusing signal. For example, the signal processor 340 may perform the focusing on the first digital signal based on the fourth input information provided from the control panel 110 and preset geometric information of the ultrasound probe 120. The geometric information may include at least one of the radius of curvature of the plurality of transducer elements (not shown) in the ultrasound transducer of the ultrasound probe 120 and the pitch between two adjacent transducer elements. The first ultrasound data may include radio frequency (RF) data, but is not limited thereto.
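The reception focusing described above can be sketched as a delay computation: from the probe geometry (element positions derived from the pitch) and a sound speed chosen from the clinical information (for example, the 1570 m/s liver value mentioned earlier), each element is assigned the extra travel time from the focal point. The array size, focus position, and speed here are illustrative assumptions; the disclosure does not specify the beamformer at this level of detail.

```python
# Illustrative sketch of delay-and-sum receive-focusing delays for a
# linear array, using a medium-specific sound speed.
import numpy as np

def focus_delays_s(elem_x_m: np.ndarray, focus_x_m: float, focus_z_m: float,
                   c_m_s: float) -> np.ndarray:
    """Extra receive path time from the focal point to each element."""
    dist = np.sqrt((elem_x_m - focus_x_m) ** 2 + focus_z_m ** 2)
    return (dist - dist.min()) / c_m_s  # relative delay, seconds

# Hypothetical 64-element array, 0.3 mm pitch, focus 20 mm deep under element 0,
# assumed liver sound speed of 1570 m/s
elems = np.arange(64) * 0.3e-3
d = focus_delays_s(elems, focus_x_m=0.0, focus_z_m=20e-3, c_m_s=1570.0)
print(d[0])  # 0.0 for the element nearest the focus
```

Shifting each element's samples by its delay and summing across elements yields the reception focusing signal; choosing the sound speed from the fourth input information is what ties the focusing to the clinical information.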

In addition, the signal processor 340 performs reception focusing on the second digital signal provided from the receiver 330 to form a reception focusing signal (hereinafter, a "second reception focusing signal"), and forms ultrasound data (hereinafter, "second ultrasound data") based on the second reception focusing signal. For example, the signal processor 340 may perform the focusing on the second digital signal based on the fourth input information provided from the control panel 110 and the preset geometric information of the ultrasound probe 120. Therefore, the signal-to-noise ratio (SNR) of the second digital signal (that is, the tracking pulse) can be increased. The second ultrasound data may include in-phase/quadrature (IQ) data, but is not necessarily limited thereto.

The processor 130 further includes an image forming unit 350. The image forming unit 350 forms a plurality of B mode images of the object based on the first ultrasound data provided from the signal processor 340.

The processor 130 further includes an elastic information forming unit 360. The elastic information forming unit 360 forms elastic information of the object (ie, the object of interest 230) in the ROE 220 based on the second ultrasound data provided from the signal processing unit 340. For example, the elastic information may include, but is not necessarily limited to, a stiffness of the object.

In one embodiment, the elastic information forming unit 360 determines the change of the object in the ROE 220, that is, the velocity of the shear wave, based on the second ultrasound data provided from the signal processor 340. For example, the velocity of the shear wave may be determined using 2D autocorrelation of the second ultrasound data, that is, the IQ data. The elastic information forming unit 360 then determines the stiffness based on the determined velocity. The stiffness may be determined using various known methods and thus is not described in detail in this embodiment.
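The disclosure points to 2D autocorrelation of the IQ data for the velocity estimate. As a simpler hedged sketch of the same idea, a time-to-peak method tracks when the shear-wave displacement peaks at two lateral positions and divides the lateral spacing by the arrival-time difference. The function name, pulse repetition frequency, and synthetic displacement traces below are all illustrative assumptions, not the claimed implementation.

```python
# Illustrative time-to-peak estimate of shear-wave velocity from
# displacement traces at two tracking positions a known distance apart.
import numpy as np

def shear_speed_m_s(disp_near: np.ndarray, disp_far: np.ndarray,
                    dx_m: float, prf_hz: float) -> float:
    """Speed from the peak-arrival time difference between two locations."""
    dt_s = (np.argmax(disp_far) - np.argmax(disp_near)) / prf_hz
    return dx_m / dt_s

# Synthetic Gaussian displacement pulses peaking at samples 10 and 20,
# tracked 2 mm apart with an assumed 10 kHz tracking PRF
t = np.arange(40)
near = np.exp(-0.5 * ((t - 10) / 2.0) ** 2)
far = np.exp(-0.5 * ((t - 20) / 2.0) ** 2)
print(shear_speed_m_s(near, far, dx_m=2e-3, prf_hz=10e3))  # 2.0 (m/s)
```

Averaging such estimates over several tracking-line pairs inside the ROE is one way the per-ROE statistics (mean, standard deviation, median) mentioned below could arise.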

In other embodiments, the stiffness may include, but is not necessarily limited to, at least one of a mean, a standard deviation, a quartile, and a median value.

The processor 130 further includes an image processor 370. The image processor 370 controls the display and storage of the B mode images formed by the image forming unit 350 and the elastic information formed by the elastic information forming unit 360, based on the input information provided from the control panel 110.

In an exemplary embodiment, the image processor 370 stores the B mode image formed by the image forming unit 350 in the storage 140 as the previous B mode image, based on the third input information provided from the control panel 110.

In another embodiment, based on the first input information provided from the control panel 110, the image processor 370 forms a reference image including the previous B mode image stored in the storage 140 and the elastic information (that is, the stiffness) formed by the elastic information forming unit 360. The image processor 370 also controls the display of the reference image and the new B mode image (for example, the current B mode image) formed by the image forming unit 350 so that both are displayed on the display unit 150. For example, as illustrated in FIG. 6, the image processor 370 controls the display so that the reference image 610 is displayed in the first screen area 151 of the display unit 150 and the new B mode image 210cur is displayed in the second screen area 152 of the display unit 150. In FIG. 6, reference numeral 210pre denotes the previous B mode image. Optionally, the previous B mode image 210pre may be a B mode image selected by the user from among a plurality of previous B mode images stored in the storage 140.

In another embodiment, the image processor 370 forms a plurality of thumbnail images based on the plurality of previous B mode images stored in the storage 140. The image processor 370 also controls the display of the thumbnail images so that they are displayed in a predetermined region of the display unit 150. For example, as shown in FIG. 7, the image processor 370 may display the plurality of thumbnail images 710_1, 710_2, and 710_3 in the lower region of the display unit 150. The previous B mode image 210pre may correspond to a thumbnail image selected by the user from among the plurality of thumbnail images 710_1, 710_2, and 710_3.

FIG. 8 is a flowchart illustrating a procedure of displaying an ultrasound image according to an exemplary embodiment of the present disclosure. When the first input information is received from the control panel 110 (S802), the processor 130 obtains first ultrasound data in response to the first input information (S804).

Specifically, the processor 130 forms a first transmission signal in response to the first input information, and provides the formed first transmission signal to the ultrasound probe 120. The ultrasound probe 120 converts the first transmission signal into a first ultrasound signal, transmits the first ultrasound signal to the object, and receives an ultrasound echo signal reflected from the object to form a first reception signal. The processor 130 forms a first reception focusing signal based on the first reception signal provided from the ultrasound probe 120, and forms first ultrasound data based on the first reception focusing signal.

In some embodiments, the processor 130 may perform beamforming on the first received signal based on the fourth input information provided from the control panel 110 to form the first received focusing signal.

The processor 130 forms a B mode image of the object based on the first ultrasound data (S806). The B mode image formed by the processor 130 may be displayed on the display unit 150.

When the second input information is received from the control panel 110 (S806), the processor 130 sets the ROE 220 at a predetermined position of the B mode image displayed on the display unit 150, based on the second input information (S808).

When the third input information is received from the control panel 110 (S810), the processor 130 stores the formed B mode image in the storage 140 as the previous B mode image (S814). The processor 130 then determines the stiffness of the object in the ROE 220 based on the third input information (S816).

Specifically, the processor 130 forms the second transmission signal for applying a compressive force to the object based on the third input information, and provides the second transmission signal to the ultrasound probe 120. The ultrasound probe 120 converts the second transmission signal into the second ultrasound signal and transmits the second ultrasound signal to the object. The second ultrasound signal may be focused at a predetermined position spaced apart from the ROE 220. As a result, a shear wave may be formed in the ROE 220 by the second ultrasound signal.

After a preset time, the processor 130 forms the third transmission signal for detecting the change of the object due to the compressive force, and provides the third transmission signal to the ultrasound probe 120. The ultrasound probe 120 converts the third transmission signal into the third ultrasound signal and transmits the third ultrasound signal to the object. The third ultrasound signal may be focused on the ROE 220. The ultrasound probe 120 receives the ultrasound echo signal (that is, the echo pulse signal) reflected from the object to form the second reception signal. The processor 130 forms the second reception focusing signal based on the second reception signal provided from the ultrasound probe 120, forms the second ultrasound data based on the second reception focusing signal, and determines the stiffness of the object (that is, the object of interest 230) in the ROE 220 based on the second ultrasound data.

In some embodiments, the processor 130 may perform beamforming on the second received signal based on the fourth input information provided from the control panel 110 to form the second received focusing signal.

Referring back to FIG. 8, the processor 130 determines whether first input information has been received from the control panel 110 again (S818). When it is determined that the first input information is received again from the control panel 110, the processor 130 obtains new ultrasound data based on the first input information (S820). Since step S820 is similar to step S804, the description of step S820 will be omitted.

The processor 130 forms a new B mode image based on the new ultrasound data (S822) and controls the display of the reference image and the new B mode image (S824). Specifically, the processor 130 forms a reference image including the stiffness of the object in the ROE 220 and a previous B mode image. As one example, the previous B mode image may be a previous B mode image selected by the user from among a plurality of previous B mode images stored in the storage 140. As another example, the previous B mode image may correspond to a thumbnail image selected by the user from among a plurality of thumbnail images displayed on the display unit 150. As illustrated in FIG. 6, the processor 130 controls the display so that the reference image 610 is displayed in the first screen area 151 of the display unit 150 and the new B mode image 210cur is displayed in the second screen area 152 of the display unit 150.

While specific embodiments have been described, these embodiments are presented by way of example and should not be construed as limiting the scope of the disclosure. The novel methods and apparatus of the present disclosure may be embodied in a variety of other forms and furthermore, various omissions, substitutions and changes in the embodiments disclosed herein are possible without departing from the spirit of the present disclosure. The claims appended hereto and their equivalents should be construed to include all such forms and modifications as fall within the scope and spirit of the disclosure.

100: ultrasound system 110: control panel
120: ultrasound probe 130: processor
140: storage unit 150: display unit
310: transmitting unit 320: transmit/receive switch
330: receiving unit 340: signal processor
350: image forming unit
360: elasticity information forming unit
370: image processor 210: B mode image
210pre: previous B mode image 210cur: current B mode image
610: reference image 710_1, 710_2, 710_3: thumbnail images

Claims (20)

1. A method of displaying an ultrasound image of an object, the method comprising:
transmitting, by an ultrasound probe of an ultrasound system, a first ultrasound signal to the object and receiving a first ultrasound echo signal reflected from the object;
forming, by a processor of the ultrasound system, a B mode image of the object based on the first ultrasound echo signal;
storing, by the processor, the B mode image as a previous B mode image in a storage unit of the ultrasound system;
determining, by the processor, a stiffness of the object based on a change of the object caused by a compressive force applied to the object;
transmitting, by the ultrasound probe, a second ultrasound signal to the object and receiving a second ultrasound echo signal reflected from the object;
forming, by the processor, a new B mode image of the object based on the second ultrasound echo signal;
forming, by the processor, a reference image including the previous B mode image, the stiffness, and a region of excitation (ROE) set in the previous B mode image; and
displaying, by a display unit of the ultrasound system, the reference image together with the new B mode image.

2. The method of claim 1, wherein the stiffness is determined by virtual touch quantification (VTQ).

3. The method of claim 2, wherein the determining of the stiffness of the object comprises:
setting the ROE at a predetermined position of the B mode image;
transmitting a first pulse to the object to apply the compressive force;
transmitting a second pulse to the object and receiving an echo pulse reflected from the object; and
detecting a change of the object in the ROE based on the echo pulse.

4. The method of claim 3, wherein the change of the object comprises a velocity of a shear wave formed by the compressive force.

5. The method of claim 3, wherein the detecting of the change of the object further comprises performing beamforming on the echo pulse based on clinical information of the object.

6. The method of claim 5, wherein the clinical information comprises an ultrasound velocity in the object.

7. The method of claim 1, wherein the forming of the new B mode image further comprises performing beamforming on the second ultrasound echo signal based on clinical information of the object.

8. The method of claim 7, wherein the clinical information comprises an ultrasound velocity in the object.

9. The method according to any one of claims 1 to 8, further comprising displaying the reference image as a thumbnail image.

10. The method of claim 1, wherein the stiffness comprises at least one of a mean value, a standard deviation value, a quartile value, and a median value.

11. An ultrasound system comprising:
an ultrasound probe configured to transmit a first ultrasound signal to an object, receive a first ultrasound echo signal reflected from the object, and apply a compressive force to the object;
a processor configured to form a B mode image of the object based on the first ultrasound echo signal and to determine a stiffness of the object based on a change of the object caused by the compressive force;
a storage unit configured to store the B mode image as a previous B mode image; and
a display unit configured to display the B mode image,
wherein the ultrasound probe is further configured to transmit a second ultrasound signal to the object and receive a second ultrasound echo signal reflected from the object,
wherein the processor is further configured to form a new B mode image based on the second ultrasound echo signal and to form a reference image including the previous B mode image, the stiffness, and a ROE set in the previous B mode image, and
wherein the display unit is further configured to display the reference image together with the new B mode image.

12. The ultrasound system of claim 11, wherein the stiffness is determined by virtual touch quantification (VTQ).

13. The ultrasound system of claim 12, wherein the ultrasound probe is configured to:
transmit a first pulse for applying the compressive force to the object based on the ROE set at a predetermined position of the B mode image; and
transmit a second pulse to the object based on the ROE and receive an echo pulse reflected from the object.

14. The ultrasound system of claim 13, wherein the processor is configured to detect a change of the object in the ROE based on the echo pulse, and wherein the change of the object comprises a velocity of a shear wave formed by the compressive force.

15. The ultrasound system of claim 13, wherein the processor is further configured to perform beamforming on the echo pulse based on clinical information of the object.

16. The ultrasound system of claim 15, wherein the clinical information comprises an ultrasound velocity in the object.

17. The ultrasound system of claim 11, wherein the processor is further configured to perform beamforming on the second ultrasound echo signal based on clinical information of the object.

18. The ultrasound system of claim 17, wherein the clinical information comprises an ultrasound velocity in the object.

19. The ultrasound system of any one of claims 11 to 18, wherein the display unit is further configured to display the reference image as a thumbnail image.

20. The ultrasound system of any one of claims 11 to 18, wherein the stiffness comprises at least one of a mean value, a standard deviation value, a quartile value, and a median value.
KR1020150185082A 2015-12-23 2015-12-23 Ultrasound system and method for displaying ultrasound images KR102030567B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150185082A KR102030567B1 (en) 2015-12-23 2015-12-23 Ultrasound system and method for displaying ultrasound images

Publications (2)

Publication Number Publication Date
KR20170075435A KR20170075435A (en) 2017-07-03
KR102030567B1 true KR102030567B1 (en) 2019-10-10

Family

ID=59357789

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150185082A KR102030567B1 (en) 2015-12-23 2015-12-23 Ultrasound system and method for displaying ultrasound images

Country Status (1)

Country Link
KR (1) KR102030567B1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006141451A (en) 2004-11-16 2006-06-08 Toshiba Corp Ultrasonic diagnostic apparatus
JP4470187B2 (en) 2004-12-03 2010-06-02 株式会社日立メディコ Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method
JP2011115456A (en) * 2009-12-04 2011-06-16 Toshiba Corp Ultrasonic diagnostic apparatus and control program for image data display
JP5038304B2 (en) 2006-06-06 2012-10-03 株式会社日立メディコ Ultrasonic diagnostic equipment
JP2015131097A (en) * 2013-12-13 2015-07-23 株式会社東芝 Ultrasonic diagnostic apparatus, image processing apparatus and image processing method
JP2015522367A (en) 2012-07-18 2015-08-06 コーニンクレッカ フィリップス エヌ ヴェ Method and system for processing ultrasound imaging data

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101310219B1 (en) * 2006-09-28 2013-10-14 삼성메디슨 주식회사 Ultrasound system and method for providing a plurality of ultrasound images

Similar Documents

Publication Publication Date Title
KR101964213B1 (en) Tissue characterization in medical diagnostic ultrasound
CN108685596B (en) Tissue property estimation using ultrasound medical imaging
JP6987496B2 (en) Analyst
US11071525B2 (en) Ultrasonic diagnostic apparatus and method
JP2013138868A (en) Ultrasonic system and method for providing vector doppler image based on determination data
KR101313220B1 (en) Ultrasound system and method for providing color doppler mode image based on qualification curve
KR20060100283A (en) Ultrasonic image construction method and diagnostic ultrasound apparatus
KR20130075458A (en) Ultrasound system and method for providing motion profile information of target object
WO2020113397A1 (en) Ultrasonic imaging method and ultrasonic imaging system
CN106691502B (en) Ultrasound system and method for generating elastic images
KR20130075477A (en) Ultrasound system and method for providing vector motion mode image
KR20120067535A (en) Ultrasound system and method for providing high pulse rate frequency doppler image based on mid-point algorithm
KR20130075486A (en) Ultrasound system and method for dectecting vecotr information based on transmitting delay
KR102030567B1 (en) Ultrasound system and method for displaying ultrasound images
KR101817389B1 (en) Ultrasound system, method and computer program for providing doppler image
JP5656389B2 (en) Ultrasound system for adaptive persistence processing of elastic images
KR20100016731A (en) Ultrasound system and method for processing ultrasound data considering scan conversion
KR101117900B1 (en) Ultrasound system and method for setting eigenvectors
JP6334883B2 (en) Ultrasonic diagnostic apparatus and display control program
KR20120045696A (en) Ultrasound system and method for providing color motion mode image with pulse wave doppler image
CN112137650A (en) Ultrasound medical imaging with acoustic velocity optimized based on fat fraction
KR101511502B1 (en) Ultrasound system and method for dectecting vecotr information based on transmitting delay
JP5663640B2 (en) Ultrasonic diagnostic equipment
JP7242623B2 (en) Ultrasound image display system and its control program
JP6258070B2 (en) Ultrasonic diagnostic equipment

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant