KR102035991B1 - Method and ultrasound system for generating image of target object - Google Patents


Info

Publication number
KR102035991B1
KR102035991B1
Authority
KR
South Korea
Prior art keywords
image
needle
pixel
ultrasound
scanline
Prior art date
Application number
KR1020150007908A
Other languages
Korean (ko)
Other versions
KR20160088616A (en)
Inventor
Kim Sang-hyuk
Lee A-young
Original Assignee
Siemens Medical Solutions USA, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Medical Solutions USA, Inc.
Priority to KR1020150007908A
Publication of KR20160088616A
Application granted
Publication of KR102035991B1

Classifications

    • A — HUMAN NECESSITIES; A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/0833 — Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 — Detecting or locating foreign bodies or organic structures for locating instruments
    • A61B 34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 8/52 — Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5207 — Processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 90/37 — Surgical systems with images on a monitor during operation
    • A61B 2090/378 — Surgical systems with images on a monitor during operation using ultrasound

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Robotics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present invention provides a method and an ultrasound system for providing an image of an object. The method according to the invention comprises: determining a plurality of scanline angles; acquiring a plurality of sets of ultrasound data for an object including a needle at the plurality of scanline angles; forming a plurality of ultrasound images based on the plurality of sets of ultrasound data; forming a mask image representing the needle based on the plurality of ultrasound images; and forming an image of the object based on the mask image and the plurality of ultrasound images.

Description

Method and ultrasound system for generating an image of a target object {METHOD AND ULTRASOUND SYSTEM FOR GENERATING IMAGE OF TARGET OBJECT}

The present invention relates to an ultrasound system, and more particularly, to a method and an ultrasound system for forming an image of an object.

As medical technology has developed, techniques have come into use in which a small hole is made in a living body instead of a direct incision, and a medical needle such as an ablator or a biopsy needle is inserted into a lesion site for treatment or examination while an internal image of the living body is observed. Because the procedure is performed while the inside of the living body is observed with a medical imaging device, this is called an "image-guided procedure" or an "interventional procedure". In other words, an interventional procedure is one in which a lesion requiring examination or treatment is reached directly through the skin for diagnosis or treatment while images obtained from computed tomography (CT) or magnetic resonance imaging (MRI), as used in radiology, are viewed.

Compared with surgical treatment requiring an incision, such interventional procedures generally do not require general anesthesia, place less physical burden on the living body (e.g., the patient), cause less pain, allow shorter hospital stays, and enable a quicker return to daily life, and are therefore highly advantageous in terms of medical cost and effectiveness.

When CT or MRI is used for interventional procedures, it is difficult to obtain images in real time. In addition, when performing an interventional procedure using CT, both the operator and the patient are at risk of prolonged exposure to radiation. In contrast, when using an ultrasound system, ultrasound images can be obtained in real time with almost no harm to the human body. However, in an ultrasound image it is difficult to clearly distinguish not only lesions but also the needle (that is, the medical needle), which has made it difficult to use ultrasound systems for interventional procedures.

Japanese Laid-Open Patent Publication No. 2012-213606; Korean Patent Publication No. 10-2014-0066584

The present invention provides a method and an ultrasound system that form a mask image of a needle (that is, a medical needle) inserted into a living body using a plurality of ultrasound images, and that provide a needle visualization image, formed based on the mask image and the plurality of ultrasound images, as the image of the object.

According to one or more exemplary embodiments, a method of forming an image of an object in an ultrasound system includes: determining a plurality of scanline angles; Acquiring a plurality of sets of ultrasound data for an object including a needle at the plurality of scanline angles; Forming a plurality of ultrasound images based on the plurality of sets of ultrasound data; Forming a mask image representing the needle based on the plurality of ultrasound images; And forming an image of the object based on the mask image and the plurality of ultrasound images.

In addition, according to another embodiment of the present invention, an ultrasound system for forming an image of an object may include: a scanline angle determiner operative to determine a plurality of scanline angles; An ultrasound data acquisition unit operable to acquire a plurality of sets of ultrasound data for an object including a needle at the plurality of scanline angles; An image forming unit operable to form a plurality of ultrasound images based on the plurality of sets of ultrasound data; And an image processor configured to form a mask image representing the needle based on the plurality of ultrasound images, and to form an image of the object based on the mask image and the plurality of ultrasound images.

According to the present invention, a needle inserted into a living body can be clearly visualized, and image noise due to post-processing of the ultrasound image is not generated.

FIG. 1 is a block diagram schematically showing the configuration of an ultrasound system according to an embodiment of the present invention.
FIG. 2 is a block diagram schematically showing the configuration of a processor according to an embodiment of the present invention.
FIG. 3 is an exemplary view showing a scanline angle for steering a scanline according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a procedure of forming a needle visualization image as an image of an object according to an exemplary embodiment of the present invention.
FIG. 5 is a flowchart illustrating a procedure of determining a plurality of scanline angles according to an embodiment of the present invention.
FIG. 6 is an exemplary view showing a plurality of ultrasound images according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a procedure of forming a mask image of a needle according to an embodiment of the present invention.
FIG. 8 is an exemplary view showing a pixel mask image according to an embodiment of the present invention.
FIG. 9 is an exemplary view showing a binary mask image according to an embodiment of the present invention.
FIG. 10 is a flowchart illustrating a procedure of checking a needle image according to an embodiment of the present invention.
FIG. 11 is an exemplary view showing a candidate needle image according to an embodiment of the present invention.
FIG. 12 is a flowchart illustrating a procedure of confirming a needle image according to another embodiment of the present invention.
FIG. 13 is an exemplary view showing a mask image of a needle according to an embodiment of the present invention.
FIG. 14 is an exemplary view showing an inverted mask image according to an embodiment of the present invention.
FIG. 15 is an exemplary view showing an example of forming an image of an object according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.

FIG. 1 is a block diagram schematically showing the configuration of an ultrasound system according to an embodiment of the present invention. Referring to FIG. 1, the ultrasound system 100 includes an ultrasound probe 110.

The ultrasound probe 110 includes a transducer (not shown) operative to interconvert electrical signals and ultrasound signals. The ultrasound probe 110 transmits an ultrasound signal into a living body along each of a plurality of scanlines, receives the ultrasound signal reflected from the living body (that is, an ultrasound echo signal), and converts it into an electrical signal (hereinafter referred to as a "received signal"). In one embodiment, the living body includes the object, and the object includes a needle (i.e., a medical needle) together with organs, blood flow, blood vessels, and the like.

The ultrasound system 100 further includes a control panel 120. The control panel 120 provides an interface between the user and the system, with various input devices for operations such as selecting a diagnostic mode, controlling the diagnostic operation, entering commands required for diagnosis, manipulating signals, and controlling output; it includes input units such as a trackball, a keyboard, and buttons.

In one embodiment, the control panel 120 receives input information indicating the insertion angle and the insertion direction of the needle from the user. The insertion angle of the needle indicates the angle at which the needle is inserted into the living body, and the insertion direction of the needle indicates the direction in which the needle is inserted into the living body. For example, the insertion directions of the needle include a direction in which the needle is inserted at a predetermined angle from the left side of the ultrasound probe 110 (hereinafter, "first needle insertion direction") and a direction in which the needle is inserted at a predetermined angle from the right side of the ultrasound probe 110 (hereinafter, "second needle insertion direction").

The ultrasound system 100 further includes a processor 130. The processor 130 controls the transmission of the ultrasound signal. In addition, the processor 130 forms an ultrasound image of the living body based on the received signal provided from the ultrasound probe 110, and performs image processing on the ultrasound image to form an image of the object.

FIG. 2 is a block diagram schematically showing the configuration of the processor 130 according to an embodiment of the present invention. Referring to FIG. 2, the processor 130 includes a scanline angle determiner 210.

The scanline angle determiner 210 determines an angle of steering the plurality of scanlines (hereinafter, referred to as a “scanline angle”). That is, the scanline angle determiner 210 determines the insertion direction of the needle and determines the plurality of scanline angles based on the determined insertion direction of the needle.

In one embodiment, the scanline angle determiner 210 sets at least three scanline angles, for example a first scanline angle, a second scanline angle, and a third scanline angle. The first scanline angle is a scanline angle that does not steer the scanlines Si, that is, a steering angle of 0 degrees, as shown in FIG. 3. The second scanline angle steers the scanlines Si by a predetermined angle (e.g., +40°) to the left with respect to the first scanline angle, as shown in FIG. 3. The third scanline angle steers the scanlines Si by a predetermined angle (e.g., -40°) to the right with respect to the first scanline angle. That is, the second scanline angle and the third scanline angle have the same angle value but opposite orientations. Here, plus (+) indicates the direction of steering the plurality of scanlines to the left, and minus (-) indicates the direction of steering the plurality of scanlines to the right.

The scanline angle determiner 210 obtains sample ultrasound data of the object at each of the first to third scanline angles, and forms a sample ultrasound image based on the obtained sample ultrasound data. The scanline angle determiner 210 determines the insertion direction of the needle based on the sample ultrasound image, and determines the plurality of scanline angles based on the needle insertion direction.

In another embodiment, the scanline angle determiner 210 determines the insertion direction of the needle based on input information provided from the control panel 120, and determines the plurality of scanline angles based on the insertion direction of the needle.

In yet another embodiment, the scanline angle determiner 210 determines a plurality of scanline angles based on a plurality of preset angle values. That is, the scanline angle determiner 210 determines, based on the plurality of preset angle values, a plurality of scanline angles for steering the plurality of scanlines to the left and a plurality of scanline angles for steering the plurality of scanlines to the right.

The processor 130 further includes an ultrasound data acquisition unit 220. The ultrasound data acquisition unit 220 controls the transmission of the ultrasound signal and acquires a plurality of sets of ultrasound data about the living body (i.e., the object) at the plurality of scanline angles. In the present embodiment, the ultrasound data acquisition unit 220 includes a transmitter 221, a receiver 222, and an ultrasound data forming unit 223, as shown in FIG. 2.

The transmitter 221 controls the transmission of the ultrasound signal and forms an electrical signal (hereinafter referred to as a "transmission signal") for obtaining an ultrasound image. In one embodiment, the transmitter 221 forms a transmission signal for obtaining an ultrasound image corresponding to each of the plurality of scanline angles determined by the scanline angle determiner 210, and provides the formed transmission signal to the ultrasound probe 110. The ultrasound probe 110 then converts the transmission signal provided from the transmitter 221 into an ultrasound signal, transmits the converted ultrasound signal to the living body, receives the ultrasound echo signal reflected from the living body, and forms a received signal.

The receiver 222 amplifies the received signal provided from the ultrasound probe 110 and converts it into a digital signal. The receiver 222 includes a time gain compensation (TGC) unit (not shown) that compensates for the attenuation incurred while the ultrasound signal passes through the living body, and an analog-to-digital conversion unit (not shown) that converts the analog received signal into a digital signal.
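The TGC stage can be illustrated with a short sketch. This is not the patent's implementation; the function name, parameters, and the first-order attenuation model (attenuation in dB proportional to frequency times depth) are illustrative assumptions:

```python
import numpy as np

def apply_tgc(rf_signal, fs_hz, c_mps=1540.0, alpha_db_per_cm_mhz=0.5, f0_mhz=5.0):
    """Apply a depth-dependent gain ramp compensating tissue attenuation.

    Assumes attenuation of alpha * f0 * depth (in dB), a common first-order
    model; real systems also expose user-adjustable TGC controls.
    """
    n = len(rf_signal)
    t_s = np.arange(n) / fs_hz                  # arrival time of each sample
    depth_cm = (c_mps * t_s / 2.0) * 100.0      # round trip -> one-way depth
    gain_db = alpha_db_per_cm_mhz * f0_mhz * depth_cm
    return rf_signal * 10.0 ** (gain_db / 20.0)
```

Deeper (later) samples receive exponentially larger gain, so echoes from deep tissue are restored to a brightness comparable to shallow ones.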

The ultrasound data forming unit 223 forms ultrasound data based on the digital signal provided by the receiver 222. In one embodiment, the ultrasound data forming unit 223 performs receive focusing on the digital signal, based on time delay values that compensate, according to the position of each transducer element of the ultrasound probe 110, for the arrival time of the ultrasound echo signal reflected from the object, to form a receive-focused signal. The ultrasound data forming unit 223 then forms ultrasound data based on the receive-focused signal.
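The receive focusing described above is, in essence, delay-and-sum beamforming. The following sketch shows the idea for a single focal point; the geometry (linear array with a simple axial transmit path) and all names are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def receive_focus(channel_rf, element_x_m, focus_x_m, focus_z_m, fs_hz, c_mps=1540.0):
    """Delay-and-sum receive focusing for one focal point.

    channel_rf: (num_elements, num_samples) per-channel digitized echoes.
    Each channel is delayed by its round-trip time to the focal point and
    the time-aligned samples are summed.
    """
    num_elements, num_samples = channel_rf.shape
    tx_dist = focus_z_m                                           # probe face to focus
    rx_dist = np.sqrt((element_x_m - focus_x_m) ** 2 + focus_z_m ** 2)
    delays_s = (tx_dist + rx_dist) / c_mps                        # round-trip time per element
    idx = np.rint(delays_s * fs_hz).astype(int)
    idx = np.clip(idx, 0, num_samples - 1)
    return float(channel_rf[np.arange(num_elements), idx].sum())
```

Repeating this for every sample depth along a scanline yields the receive-focused signal from which the ultrasound data are formed.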

The processor 130 further includes an image forming unit 230. The image forming unit 230 forms an ultrasound image of the object using the ultrasound data provided from the ultrasound data acquisition unit 220. The ultrasound image includes a B-mode (brightness mode) image, but is not necessarily limited thereto. In an embodiment, the image forming unit 230 forms a plurality of ultrasound images based on the plurality of sets of ultrasound data provided from the ultrasound data acquisition unit 220.

The processor 130 further includes an image processor 240. The image processor 240 forms a mask image representing the needle based on the plurality of ultrasound images formed by the image forming unit 230, and forms a needle visualization image, based on the mask image and the plurality of ultrasound images, as the image of the object.
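The compositing of a mask image with an ultrasound image might look like the following. This is only a plausible sketch of such a mask-weighted blend, assuming a binary mask and an inverted mask (cf. FIG. 14); the patent's actual weighting is not specified here, and all names are assumptions:

```python
import numpy as np

def compose_needle_view(base_image, needle_image, mask):
    """Blend a needle-enhanced image into a base B-mode image via a mask.

    mask is 1.0 on needle pixels and 0.0 elsewhere; the inverted mask keeps
    the background from the base image while needle pixels come from the
    needle-enhanced image.
    """
    mask = mask.astype(float)
    inverted = 1.0 - mask          # cf. the "inverted mask image"
    return base_image * inverted + needle_image * mask
```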

The processor 130 further includes a controller 250. The controller 250 controls the transmission and reception of the ultrasonic signal. In addition, the controller 250 controls the formation and image processing of the ultrasound image and the output of the image of the object. Moreover, the controller 250 controls the operation of each component of the ultrasound system 100.

Referring back to FIG. 1, the ultrasound system 100 further includes an output unit 140. The output unit 140 outputs an image of the object formed by the processor 130. In addition, the output unit 140 may output a plurality of ultrasound images formed by the processor 130. In one embodiment, the output unit 140 includes a display unit (not shown).

Hereinafter, a procedure of forming an image of an object will be described in detail with reference to FIGS. 4 to 15.

FIG. 4 is a flowchart illustrating a procedure of forming an image of an object according to an exemplary embodiment of the present invention. Referring to FIG. 4, the scanline angle determiner 210 determines a plurality of scanline angles for steering a plurality of scanlines (S402).

In one embodiment, the scanline angle determiner 210 sets a first scanline angle, a second scanline angle, and a third scanline angle, and determines the plurality of scanline angles based on the set first to third scanline angles.

FIG. 5 is a flowchart illustrating a procedure of determining a plurality of scanline angles according to an embodiment of the present invention. Referring to FIG. 5, the scanline angle determiner 210 sets a first scanline angle, a second scanline angle, and a third scanline angle (S502).

As an example, the first scanline angle has an angle value of 0°; that is, the first scanline angle is a scanline angle that does not steer the plurality of scanlines. The second scanline angle and the third scanline angle have the same angle value with respect to the first scanline angle but different orientations. For example, the second scanline angle has an angle value of +40° and the third scanline angle has an angle value of -40°. However, the angle values of the second and third scanline angles are not necessarily limited thereto and may be changed as necessary.

The scanline angle determiner 210 acquires sample ultrasound data of the object at each of the first, second, and third scanline angles (S504). That is, the scanline angle determiner 210 acquires sample ultrasound data for the object at the first scanline angle (hereinafter, "first sample ultrasound data"), sample ultrasound data for the object at the second scanline angle (hereinafter, "second sample ultrasound data"), and sample ultrasound data for the object at the third scanline angle (hereinafter, "third sample ultrasound data").

As an example, the scanline angle determiner 210 forms a transmission signal for obtaining a sample ultrasound image corresponding to the first scanline angle (hereinafter, "first sample transmission signal") and provides it to the ultrasound probe 110. Accordingly, the ultrasound probe 110 converts the first sample transmission signal into an ultrasound signal, transmits the converted ultrasound signal to the living body, and receives the ultrasound echo signal reflected from the living body to form a received signal (hereinafter, "first sample received signal"). The scanline angle determiner 210 forms first sample ultrasound data based on the first sample received signal provided from the ultrasound probe 110. In the same manner, the scanline angle determiner 210 forms a transmission signal for obtaining a sample ultrasound image corresponding to the second scanline angle (hereinafter, "second sample transmission signal") and a transmission signal for obtaining a sample ultrasound image corresponding to the third scanline angle (hereinafter, "third sample transmission signal"), and provides each to the ultrasound probe 110. The ultrasound probe 110 converts each transmission signal into an ultrasound signal, transmits it to the living body, and forms a second sample received signal and a third sample received signal from the reflected ultrasound echo signals; the scanline angle determiner 210 forms second sample ultrasound data and third sample ultrasound data based on the second and third sample received signals, respectively.

As another example, the scanline angle determiner 210 forms information about the set first, second, and third scanline angles (hereinafter, "scanline angle information") and transmits the formed scanline angle information to the controller 250. The controller 250 controls acquisition of sample ultrasound data for each of the first to third scanline angles based on the scanline angle information provided from the scanline angle determiner 210.

In more detail, the controller 250 forms, based on the scanline angle information, a control signal for acquiring first sample ultrasound data for the object at the first scanline angle (hereinafter, "first control signal"). Accordingly, the ultrasound data acquisition unit 220 forms a first sample transmission signal for obtaining a sample ultrasound image corresponding to the first scanline angle according to the first control signal and provides it to the ultrasound probe 110. The ultrasound probe 110 converts the first sample transmission signal into an ultrasound signal, transmits the converted ultrasound signal to the living body, and forms a first sample received signal based on the ultrasound echo signal reflected from the living body. The ultrasound data acquisition unit 220 forms first sample ultrasound data based on the first sample received signal provided from the ultrasound probe 110, and provides the formed first sample ultrasound data to the scanline angle determiner 210.

Subsequently, the controller 250 forms, based on the scanline angle information, a control signal for acquiring second sample ultrasound data for the object at the second scanline angle (hereinafter, "second control signal"). Accordingly, the ultrasound data acquisition unit 220 forms a second sample transmission signal for obtaining a sample ultrasound image corresponding to the second scanline angle according to the second control signal and provides it to the ultrasound probe 110. The ultrasound probe 110 converts the second sample transmission signal into an ultrasound signal, transmits the converted ultrasound signal to the living body, and forms a second sample received signal based on the ultrasound echo signal reflected from the living body. The ultrasound data acquisition unit 220 forms second sample ultrasound data based on the second sample received signal provided from the ultrasound probe 110, and provides the formed second sample ultrasound data to the scanline angle determiner 210.

Subsequently, the controller 250 forms, based on the scanline angle information, a control signal for acquiring third sample ultrasound data for the object at the third scanline angle (hereinafter, "third control signal"). Accordingly, the ultrasound data acquisition unit 220 forms a third sample transmission signal for obtaining a sample ultrasound image corresponding to the third scanline angle according to the third control signal and provides it to the ultrasound probe 110. The ultrasound probe 110 converts the third sample transmission signal into an ultrasound signal, transmits the converted ultrasound signal to the living body, and forms a third sample received signal based on the ultrasound echo signal reflected from the living body. The ultrasound data acquisition unit 220 forms third sample ultrasound data based on the third sample received signal provided from the ultrasound probe 110, and provides the formed third sample ultrasound data to the scanline angle determiner 210.

In the above examples, the sample ultrasound data is described as being acquired in the order of the first, second, and third scanline angles. However, the present invention is not limited thereto, and the order may be changed as necessary.

Referring to FIG. 5 again, the scanline angle determiner 210 forms a sample ultrasound image based on the sample ultrasound data corresponding to each of the first, second, and third scanline angles (S506). That is, the scanline angle determiner 210 forms a sample ultrasound image based on the first sample ultrasound data (hereinafter, "first sample ultrasound image"), a sample ultrasound image based on the second sample ultrasound data (hereinafter, "second sample ultrasound image"), and a sample ultrasound image based on the third sample ultrasound data (hereinafter, "third sample ultrasound image").

In the above-described embodiment, the scanline angle determiner 210 forms the sample ultrasound images based on the sample ultrasound data. However, the present invention is not limited thereto; the image forming unit 230, under the control of the controller 250, may form the sample ultrasound images based on the sample ultrasound data.

The scanline angle determiner 210 determines the insertion direction of the needle based on the sample ultrasound images (that is, the first, second, and third sample ultrasound images) (S508).

As an example, the scanline angle determiner 210 calculates pixel difference values at the same pixel positions between the first sample ultrasound image and the second sample ultrasound image, calculates the sum of the calculated pixel difference values (hereinafter, "first pixel difference value sum"), and calculates the absolute value of the first pixel difference value sum. Likewise, the scanline angle determiner 210 calculates pixel difference values at the same pixel positions between the first sample ultrasound image and the third sample ultrasound image, calculates their sum (hereinafter, "second pixel difference value sum"), and calculates its absolute value. The scanline angle determiner 210 then compares the two absolute values. If the absolute value of the first pixel difference value sum is greater than that of the second pixel difference value sum, the insertion direction of the needle is determined to be the first needle insertion direction. Otherwise, that is, if the absolute value of the first pixel difference value sum is smaller than that of the second pixel difference value sum, the scanline angle determiner 210 determines the insertion direction of the needle to be the second needle insertion direction.
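The comparison described above can be sketched directly. The function below is an illustrative rendering of this embodiment, with assumed names and the sample images given as arrays:

```python
import numpy as np

def needle_insertion_direction(img_first, img_second, img_third):
    """Decide the needle insertion side from three sample ultrasound images.

    Sums the per-pixel differences between the unsteered image and each
    steered image, then compares the absolute values of the two sums.
    Returns "first" (left-side insertion) or "second" (right-side insertion).
    """
    s1 = np.sum(img_first.astype(float) - img_second.astype(float))
    s2 = np.sum(img_first.astype(float) - img_third.astype(float))
    return "first" if abs(s1) > abs(s2) else "second"
```

The intuition is that steering toward the needle's insertion side makes the needle much brighter, so the larger absolute difference from the unsteered image identifies that side.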

Referring back to FIG. 5, the scanline angle determiner 210 determines a plurality of scanline angles based on a plurality of preset angle values according to the determined needle insertion direction (S510).

As an example, when it is determined that the insertion direction of the needle is the first needle insertion direction, the scanline angle determiner 210 determines a plurality of scanline angles (e.g., +40°, +35°, +10°, +5°, +0°) based on a plurality of preset angle values (e.g., 40°, 35°, 10°, 5°, 0°).

As another example, when it is determined that the insertion direction of the needle is the second needle insertion direction, the scanline angle determiner 210 determines a plurality of scanline angles (e.g., -40°, -35°, -10°, -5°, -0°) based on a plurality of preset angle values (e.g., 40°, 35°, 10°, 5°, 0°).

Although the above-described examples determine five scanline angles, the present invention is not limited thereto, and the number of scanline angles may be increased or decreased as necessary.
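
In both examples the scanline angles are the preset angle values with a sign chosen by the insertion direction, which can be sketched as below. The function name and the string labels for the two directions are assumptions for illustration only.

```python
def scanline_angles(preset_angles, insertion_direction):
    # First insertion direction -> positive steering; second -> negative.
    sign = 1 if insertion_direction == "first" else -1
    return [sign * a for a in preset_angles]

print(scanline_angles([40, 35, 10, 5, 0], "first"))   # [40, 35, 10, 5, 0]
print(scanline_angles([40, 35, 10, 5, 0], "second"))  # [-40, -35, -10, -5, 0]
```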

In another embodiment, the scanline angle determiner 210 determines the insertion direction of the needle based on input information provided from the control panel 120, and determines a plurality of scanline angles based on a plurality of preset angle values according to the determined insertion direction of the needle.

As an example, the scan line angle determiner 210 analyzes input information provided from the control panel 120 to determine whether the needle insertion direction is the first needle insertion direction or the second needle insertion direction.

If it is determined that the insertion direction of the needle is the first needle insertion direction, the scanline angle determiner 210 analyzes the input information to determine which of the first angle range, the second angle range, and the third angle range contains the insertion angle of the needle. Here, the first angle range is 30° to 40°, the second angle range is 20° to 30°, and the third angle range is 1° to 20°; however, the angle ranges are not necessarily limited thereto.

If it is determined that the insertion angle of the needle is within the first angle range, the scanline angle determiner 210 determines a plurality of scanline angles (e.g., +40°, +37°, +34°, +30° and 0°) based on a plurality of preset angle values (e.g., 40°, 37°, 34°, 30° and 0°).

On the other hand, if it is determined that the insertion angle of the needle is within the second angle range, the scanline angle determiner 210 determines a plurality of scanline angles (e.g., +30°, +27°, +24°, +20° and 0°) based on a plurality of preset angle values (e.g., 30°, 27°, 24°, 20° and 0°).

On the other hand, if it is determined that the insertion angle of the needle is within the third angle range, the scanline angle determiner 210 determines a plurality of scanline angles (e.g., +20°, +15°, +10°, +5° and 0°) based on a plurality of preset angle values (e.g., 20°, 15°, 10°, 5° and 0°).
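
The three angle-range cases above amount to a lookup table keyed by the needle's insertion angle. A minimal sketch follows; the table name, function name, and error handling are assumptions, and only the ranges and angle sets stated in the text are used. Note the range boundaries overlap (30° lies in both the first and second ranges), so the sketch simply takes the first match.

```python
# Ranges and preset angle sets from the text; names are hypothetical.
ANGLE_SETS = [
    ((30, 40), [40, 37, 34, 30, 0]),  # first angle range
    ((20, 30), [30, 27, 24, 20, 0]),  # second angle range
    ((1, 20),  [20, 15, 10, 5, 0]),   # third angle range
]

def preset_angles_for(insertion_angle):
    for (low, high), angles in ANGLE_SETS:
        if low <= insertion_angle <= high:
            return angles
    raise ValueError("insertion angle outside supported ranges")

print(preset_angles_for(25))  # [30, 27, 24, 20, 0]
```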

Meanwhile, if it is determined that the needle insertion direction is the second needle insertion direction, the scanline angle determiner 210 may determine the plurality of scanline angles through the above-described procedure.

In yet another embodiment, the scanline angle determiner 210 determines a plurality of scanline angles based on a plurality of preset angle values. For example, based on a plurality of preset angle values (e.g., 40°, 35°, 10°, 5°, 0°), the scanline angle determiner 210 determines a plurality of scanline angles corresponding to the first needle insertion direction (e.g., 0°, +5°, +10°, +35° and +40°) and a plurality of scanline angles corresponding to the second needle insertion direction (e.g., -40°, -35°, -10°, -5°).

Referring back to FIG. 4, the ultrasound data acquisition unit 220 acquires a plurality of sets of ultrasound data of the object at each of the plurality of scanline angles determined by the scanline angle determiner 210 (S404). Hereinafter, for convenience of description, the plurality of scanline angles are assumed to be +40°, +35°, +10°, +5°, and 0°.

In one embodiment, the ultrasound data acquisition unit 220 forms a transmission signal (hereinafter referred to as a "first transmission signal") for obtaining an ultrasound image corresponding to the scanline angle (e.g., +40°), and provides the formed first transmission signal to the ultrasound probe 110. Accordingly, the ultrasound probe 110 converts the first transmission signal provided from the ultrasound data acquisition unit 220 into an ultrasound signal, transmits the converted ultrasound signal to the living body, and receives the ultrasound echo signal reflected from the living body to form a received signal (hereinafter referred to as a "first received signal"). The ultrasound data acquisition unit 220 forms ultrasound data (hereinafter referred to as "first ultrasound data") based on the first received signal provided from the ultrasound probe 110.

In addition, the ultrasound data acquisition unit 220 forms a transmission signal (hereinafter referred to as a "second transmission signal") for obtaining an ultrasound image corresponding to the scanline angle (e.g., +35°), and provides the formed second transmission signal to the ultrasound probe 110. Accordingly, the ultrasound probe 110 converts the second transmission signal provided from the ultrasound data acquisition unit 220 into an ultrasound signal, transmits the converted ultrasound signal to the living body, and receives the ultrasound echo signal reflected from the living body to form a received signal (hereinafter referred to as a "second received signal"). The ultrasound data acquisition unit 220 forms ultrasound data (hereinafter referred to as "second ultrasound data") based on the second received signal provided from the ultrasound probe 110.

In addition, the ultrasound data acquisition unit 220 forms a transmission signal (hereinafter referred to as a "third transmission signal") for obtaining an ultrasound image corresponding to the scanline angle (e.g., +10°), and provides the formed third transmission signal to the ultrasound probe 110. Accordingly, the ultrasound probe 110 converts the third transmission signal provided from the ultrasound data acquisition unit 220 into an ultrasound signal, transmits the converted ultrasound signal to the living body, and receives the ultrasound echo signal reflected from the living body to form a received signal (hereinafter referred to as a "third received signal"). The ultrasound data acquisition unit 220 forms ultrasound data (hereinafter referred to as "third ultrasound data") based on the third received signal provided from the ultrasound probe 110.

In addition, the ultrasound data acquisition unit 220 forms a transmission signal (hereinafter referred to as a "fourth transmission signal") for obtaining an ultrasound image corresponding to the scanline angle (e.g., +5°), and provides the formed fourth transmission signal to the ultrasound probe 110. Accordingly, the ultrasound probe 110 converts the fourth transmission signal provided from the ultrasound data acquisition unit 220 into an ultrasound signal, transmits the converted ultrasound signal to the living body, and receives the ultrasound echo signal reflected from the living body to form a received signal (hereinafter referred to as a "fourth received signal"). The ultrasound data acquisition unit 220 forms ultrasound data (hereinafter referred to as "fourth ultrasound data") based on the fourth received signal provided from the ultrasound probe 110.

In addition, the ultrasound data acquisition unit 220 forms a transmission signal (hereinafter referred to as a "fifth transmission signal") for obtaining an ultrasound image corresponding to the scanline angle (e.g., 0°), and provides the formed fifth transmission signal to the ultrasound probe 110. Accordingly, the ultrasound probe 110 converts the fifth transmission signal provided from the ultrasound data acquisition unit 220 into an ultrasound signal, transmits the converted ultrasound signal to the living body, and receives the ultrasound echo signal reflected from the living body to form a received signal (hereinafter referred to as a "fifth received signal"). The ultrasound data acquisition unit 220 forms ultrasound data (hereinafter referred to as "fifth ultrasound data") based on the fifth received signal provided from the ultrasound probe 110.

Although the above-described embodiment acquires the ultrasound data in the order of the scanline angles +40°, +35°, +10°, +5°, and 0°, the present invention is not necessarily limited thereto, and the order of the scanline angles may be changed.

The image forming unit 230 forms a plurality of ultrasound images based on the plurality of sets of ultrasound data provided from the ultrasound data obtaining unit 220 (S406). Each of the plurality of ultrasound images includes a plurality of pixels, and each pixel may be characterized by a pixel value and a pixel position.

In an exemplary embodiment, as shown in FIG. 6, the image forming unit 230 forms a first ultrasound image UI 1 based on the first ultrasound data corresponding to the scanline angle (+40°), forms a second ultrasound image UI 2 based on the second ultrasound data corresponding to the scanline angle (+35°), forms a third ultrasound image UI 3 based on the third ultrasound data corresponding to the scanline angle (+10°), forms a fourth ultrasound image UI 4 based on the fourth ultrasound data corresponding to the scanline angle (+5°), and forms a fifth ultrasound image UI 5 based on the fifth ultrasound data corresponding to the scanline angle (0°).

The image processor 240 forms a mask image representing a needle based on the plurality of ultrasound images in operation S408. Step S408 will be described in more detail with reference to FIGS. 7 to 13.

FIG. 7 is a flowchart illustrating a procedure of forming a mask image representing a needle according to an exemplary embodiment of the present invention. Referring to FIG. 7, the image processor 240 forms a binary mask image based on the pixels of the plurality of ultrasound images (S702).

In an exemplary embodiment, the image processor 240 forms a pixel mask image by determining, for each pixel position, a pixel value of the pixel mask image based on the pixel values at that pixel position in the plurality of ultrasound images. That is, the image processor 240 forms a pixel mask image PMI as illustrated in FIG. 8 based on the pixel values at the respective pixel positions in the plurality of ultrasound images. Each pixel value of the pixel mask image PMI is selected from among the pixel values at the corresponding pixel position of the plurality of ultrasound images UI 1 to UI 5 . For example, the selected pixel value may be the maximum pixel value among the pixel values at the corresponding pixel position of the plurality of ultrasound images UI 1 to UI 5 . However, the selected pixel value is not necessarily limited thereto.

The image processor 240 forms a binary mask image BMI as shown in FIG. 9 from the pixel mask image PMI based on a preset threshold pixel value. For example, the image processor 240 sets a statistical value (maximum value, average value, etc.) of the pixels of the plurality of ultrasound images UI 1 to UI 5 as the threshold pixel value, and performs threshold processing on the pixel mask image PMI using the set threshold pixel value to form the binary mask image BMI.
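
The two steps above — per-pixel maximum across the steered images, then thresholding — can be sketched as follows. This is a minimal reading of the text under assumptions of my own: images as numpy arrays, the maximum as the selection rule, a caller-supplied threshold, and a hypothetical function name.

```python
import numpy as np

def pixel_and_binary_masks(images, threshold):
    """Per-pixel maximum across steered images, then threshold to binary."""
    pmi = np.max(np.stack(images), axis=0)     # pixel mask image (PMI)
    bmi = (pmi >= threshold).astype(np.uint8)  # binary mask image (BMI), 0/1
    return pmi, bmi

imgs = [np.array([[0, 50], [10, 0]], dtype=np.uint8),
        np.array([[5, 200], [0, 0]], dtype=np.uint8)]
pmi, bmi = pixel_and_binary_masks(imgs, threshold=100)
print(pmi.tolist())  # [[5, 200], [10, 0]]
print(bmi.tolist())  # [[0, 1], [0, 0]]
```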

Referring to FIG. 7 again, the image processor 240 identifies the needle image in the binary mask image BMI (S704). Step S704 will be described in more detail with reference to FIGS. 10 to 12.

FIG. 10 is a flowchart illustrating a procedure of identifying a needle image according to an embodiment of the present invention. Referring to FIG. 10, the image processor 240 detects at least one start point of at least one candidate needle in the binary mask image BMI (S1002).

As an example, the image processor 240 detects a start point (i.e., seed point) of at least one needle based on the pixel values of the pixels in the first column on the left or right side of the binary mask image BMI, according to the insertion direction of the needle. Here, a start point is a pixel in that first column whose pixel value is not 0.

As another example, the image processor 240 detects a start point (i.e., seed point) of at least one needle based on the pixel values of the pixels in the first column on the left side of the binary mask image BMI, and also detects a start point of at least one needle based on the pixel values of the pixels in the first column on the right side of the binary mask image BMI.

The image processor 240 forms at least one candidate needle image for at least one candidate needle based on at least one starting point in operation S1004. For example, the image processor 240 performs seeded region growing based on at least one start point in the binary mask image BMI to form at least one candidate needle image.

The image processor 240 selects a needle image from the at least one candidate needle image (S1006 to S1012). More specifically, the image processor 240 detects a start pixel and an end pixel of the at least one candidate needle image (S1006). For example, as illustrated in FIG. 11, the image processor 240 detects a start pixel and an end pixel in the first candidate needle image CNI 1 , and detects a start pixel and an end pixel in the second candidate needle image CNI 2 .

The image processor 240 sets an area in at least one candidate needle image based on the detected start pixel and the end pixel (S1008). For example, the image processor 240 sets a region ROI 1 (hereinafter, referred to as a “first region”) based on the start pixel and the end pixel with respect to the first candidate needle image CNI 1 . Also, the image processor 240 sets an area ROI 2 (hereinafter referred to as a “second area”) based on the start pixel and the end pixel for the second candidate needle image CNI 2 .

The image processor 240 compares the sizes of the set regions (S1010) and selects the candidate needle image whose region has the maximum size as the needle image (S1012). For example, the image processor 240 compares the size of the first region ROI 1 with the size of the second region ROI 2 to detect the first candidate needle image CNI 1 , whose region has the largest size, and selects the detected first candidate needle image CNI 1 as the needle image.
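
The region-size criterion above can be sketched as below. The text sets a region from the start and end pixels; treating that region as the bounding box of the candidate's nonzero pixels is one plausible reading, and the function names and toy candidates are assumptions.

```python
import numpy as np

def region_size(candidate):
    """Bounding-box area spanned by the candidate's nonzero pixels."""
    ys, xs = np.nonzero(candidate)
    if ys.size == 0:
        return 0
    return int((ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1))

def select_needle_by_region(candidates):
    # Keep the candidate whose region is largest.
    return max(candidates, key=region_size)

cni1 = np.zeros((5, 5), dtype=np.uint8); cni1[1, 0:4] = 1  # long streak
cni2 = np.zeros((5, 5), dtype=np.uint8); cni2[3, 2] = 1    # small blob
chosen = select_needle_by_region([cni1, cni2])
print(np.array_equal(chosen, cni1))  # True
```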

FIG. 12 is a flowchart illustrating a procedure of identifying a needle image according to another embodiment of the present invention. Referring to FIG. 12, the image processor 240 detects at least one start point of at least one candidate needle in the binary mask image BMI (S1202), and forms at least one candidate needle image for the at least one candidate needle based on the at least one start point (S1204). Since steps S1202 and S1204 are the same as steps S1002 and S1004 of FIG. 10, a detailed description thereof will be omitted.

The image processor 240 selects a needle image from the at least one candidate needle image (S1206 to S1212). More specifically, the image processor 240 detects a start pixel and an end pixel of the at least one candidate needle image (S1206). For example, as illustrated in FIG. 11, the image processor 240 detects a start pixel and an end pixel in the first candidate needle image CNI 1 , and detects a start pixel and an end pixel in the second candidate needle image CNI 2 .

The image processor 240 counts the number of pixels whose pixel value is not 0 between the start pixel and the end pixel detected in the at least one candidate needle image (S1208). For example, the image processor 240 counts the number of pixels whose pixel value is not 0 between the start pixel and the end pixel in the first candidate needle image CNI 1 , and likewise counts the number of such pixels between the start pixel and the end pixel in the second candidate needle image CNI 2 .

The image processor 240 compares the counted numbers of pixels (S1210) and selects the candidate needle image having the maximum number of pixels as the needle image (S1212). For example, the image processor 240 compares the number of pixels in the first candidate needle image CNI 1 with the number of pixels in the second candidate needle image CNI 2 to detect the first candidate needle image CNI 1 , which has the maximum number of pixels, and selects the detected first candidate needle image CNI 1 as the needle image.
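
The pixel-count criterion can be sketched in a few lines; the function name and toy candidates are assumptions, and for simplicity the sketch counts all nonzero pixels of each binary candidate.

```python
import numpy as np

def select_needle_by_pixel_count(candidates):
    # Keep the candidate with the most nonzero pixels.
    return max(candidates, key=lambda c: int(np.count_nonzero(c)))

cni_a = np.zeros((4, 4), dtype=np.uint8); cni_a[0, :] = 1  # 4 pixels
cni_b = np.zeros((4, 4), dtype=np.uint8); cni_b[2, 1] = 1  # 1 pixel
print(np.count_nonzero(select_needle_by_pixel_count([cni_a, cni_b])))  # 4
```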

Optionally, the image processor 240 may perform a filtering process on the binary mask image BMI to remove noise (i.e., pixel values not belonging to the needle) and form a needle image representing only the needle. The filtering process may be performed using a smoothing filter, a median filter, a minimum value filter, or the like.

Referring to FIG. 7 again, the image processor 240 forms a mask image representing the needle based on the identified needle image (S706). For example, the image processor 240 forms a mask image NMI as shown in FIG. 13 based on the identified needle image (first candidate needle image CNI 1 ).

Referring back to FIG. 4, the image processor 240 forms an image of the object based on the mask image and the plurality of ultrasound images (S410).

In one embodiment, the image processor 240 forms an inverted mask image based on the pixel value of the mask image NMI. That is, the image processor 240 inverts the pixel value of the mask image NMI to form an inverted mask image RMI as illustrated in FIG. 14.

As illustrated in FIG. 15, the image processor 240 multiplies the pixel values at each pixel position between the mask image NMI and the pixel mask image PMI (see the upper right of FIG. 15) to form a first image (not shown) representing the needle. That is, the image processor 240 forms a first image representing only the needle based on the mask image NMI and the pixel mask image PMI.

Optionally, the image processor 240 may apply a predetermined weight to the pixel mask image PMI. Here, the weight adjusts the brightness of the needle in the first image.

The image processor 240 forms a second image from which the needle is removed by multiplying the pixel values at each pixel position of the inverted mask image RMI and at least one ultrasound image formed at a preset scanline angle. For example, as illustrated in FIG. 15, the image processor 240 multiplies the pixel values at each pixel position between the inverted mask image RMI and the fifth ultrasound image UI 5 corresponding to the scanline angle (e.g., 0°) (see the upper left of FIG. 15) to form a second image (not shown) from which the needle is removed. That is, the image processor 240 forms a second image, from which only the needle is removed, based on the inverted mask image RMI and the fifth ultrasound image UI 5 .

As illustrated in FIG. 15, the image processor 240 synthesizes a first image and a second image to form an image (ie, a needle visualization image) NIV of an object (see lower left of FIG. 15).
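
The whole compositing stage — needle-only image from NMI × PMI (optionally weighted), needle-removed image from the inverted mask × the reference frame, then their sum — can be sketched as follows. Assumptions of mine: the mask is binary 0/1 so the inverted mask is (1 − NMI), images are uint8 numpy arrays, and the function name is hypothetical.

```python
import numpy as np

def composite_needle_view(nmi, pmi, reference, weight=1.0):
    """Blend the enhanced needle into a reference ultrasound frame.

    nmi:       mask image, 1 where the needle is and 0 elsewhere
    pmi:       pixel mask image (per-pixel max across steered frames)
    reference: ultrasound image at the preset scanline angle (e.g. 0 degrees)
    weight:    optional brightness weight applied to the needle pixels
    """
    nmi = nmi.astype(np.float32)
    first = nmi * pmi.astype(np.float32) * weight        # needle only
    second = (1.0 - nmi) * reference.astype(np.float32)  # needle removed
    return np.clip(first + second, 0, 255).astype(np.uint8)

nmi = np.array([[1, 0], [0, 0]], dtype=np.uint8)
pmi = np.array([[200, 50], [50, 50]], dtype=np.uint8)
ref = np.array([[30, 40], [60, 70]], dtype=np.uint8)
print(composite_needle_view(nmi, pmi, ref).tolist())  # [[200, 40], [60, 70]]
```

The needle pixel takes its bright value from the pixel mask image, while every other pixel keeps the reference frame's value.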

While the invention has been described and illustrated by way of preferred embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the appended claims.

100: ultrasonic system 110: ultrasonic probe
120: control panel 130: processor
140: output unit 210: scan line angle determination unit
220: Ultrasonic data acquisition unit 221: Transmitter
222: receiving unit 223: ultrasonic data forming unit
230: Image forming unit 240: Image processing unit
250: control unit S i : scan line
UI 1 ~ UI 5 : Ultrasound Image PMI: Pixel Mask Image
BMI: binary mask image CNI 1 : first candidate needle image
CNI 2 : Second candidate needle image ROI 1 : First region
ROI 2 : Second area NMI: mask image
RMI: Inverted Mask Image NIV: Image of Object (needle visualization image)

Claims (20)

A method of forming an image of an object in an ultrasound system, the method comprising:
Determining a plurality of scanline angles;
Acquiring a plurality of sets of ultrasound data for an object including a needle at the plurality of scanline angles;
Forming a plurality of ultrasound images based on the plurality of sets of ultrasound data, each of the plurality of ultrasound images including a plurality of pixels, each of the plurality of pixels being characterized by a pixel value and a pixel position;
Determining a plurality of pixel values of a pixel mask image based on the pixel values at the respective pixel positions in the plurality of ultrasound images;
Forming a binary mask image from the pixel mask image based on a threshold pixel value;
Forming a mask image representing the needle based on the binary mask image; and
Forming an image of the object based on the mask image and the plurality of ultrasound images.

The method of claim 1, wherein the forming of the mask image comprises:
Identifying a needle image from the binary mask image; and
Forming the mask image based on the identified needle image.

delete

The method of claim 1, wherein each of the plurality of pixel values in the pixel mask image is selected from among the pixel values at the corresponding pixel position of the plurality of ultrasound images, and the selected pixel value is the maximum pixel value among the pixel values at the corresponding pixel position.

The method of claim 2, wherein the identifying of the needle image comprises:
Detecting at least one start point of at least one candidate needle in the binary mask image;
Forming at least one candidate needle image for the at least one candidate needle based on the at least one start point; and
Selecting the needle image from the at least one candidate needle image.

The method of claim 5, wherein the selecting of the needle image comprises:
Detecting a start pixel and an end pixel of the at least one candidate needle image;
Determining at least one region based on the start pixel and the end pixel of the at least one candidate needle image; and
Selecting, as the needle image, the candidate needle image whose region has the maximum size among the at least one candidate needle image.

The method of claim 5, wherein the selecting of the needle image comprises:
Detecting a start pixel and an end pixel of the at least one candidate needle image;
Counting the number of pixels of the at least one candidate needle image between the start pixel and the end pixel; and
Selecting, as the needle image, the candidate needle image having the maximum number of pixels among the at least one candidate needle image.

The method of claim 1, wherein at least one ultrasound image of the plurality of ultrasound images is formed at a preset scanline angle, and wherein the forming of the image of the object comprises:
Forming a first image representing the needle by multiplying the pixel values at each pixel position of the mask image and the pixel mask image;
Forming an inverted mask image based on the pixel values of the mask image;
Forming a second image from which the needle is removed by multiplying the pixel values at each pixel position of the inverted mask image and the at least one ultrasound image; and
Synthesizing the first image and the second image to form the image of the object.

The method of claim 1, wherein the determining of the plurality of scanline angles comprises:
Receiving input information indicating an insertion angle and an insertion direction of the needle; and
Determining the plurality of scanline angles based on the input information.

The method of claim 1, wherein the determining of the plurality of scanline angles comprises:
Setting a first scanline angle of 0 degrees, a second scanline angle, and a third scanline angle having the same angle value as the second scanline angle but a different angular orientation with respect to the first scanline angle;
Acquiring sample ultrasound data of the object at each of the first to third scanline angles;
Forming a sample ultrasound image for each of the first to third scanline angles based on the acquired sample ultrasound data;
Determining an insertion direction of the needle based on the sample ultrasound images; and
Determining the plurality of scanline angles based on the insertion direction of the needle.
An ultrasound system for forming an image of an object, the system comprising:
A scanline angle determiner operable to determine a plurality of scanline angles;
An ultrasound data acquisition unit operable to acquire a plurality of sets of ultrasound data for an object including a needle at the plurality of scanline angles;
An image forming unit operable to form a plurality of ultrasound images based on the plurality of sets of ultrasound data, each of the plurality of ultrasound images including a plurality of pixels, each of which is characterized by a pixel value and a pixel position; and
An image processor operable to determine a plurality of pixel values of a pixel mask image based on the pixel values at the respective pixel positions in the plurality of ultrasound images, form a binary mask image from the pixel mask image based on a threshold pixel value, form a mask image representing the needle based on the binary mask image, and form an image of the object based on the mask image and the plurality of ultrasound images.

The ultrasound system of claim 11, wherein the image processor is operable to:
Identify a needle image from the binary mask image; and
Form the mask image based on the identified needle image.

delete

The ultrasound system of claim 11, wherein each of the plurality of pixel values in the pixel mask image is selected from among the pixel values at the corresponding pixel position of the plurality of ultrasound images, and the selected pixel value is the maximum pixel value among the pixel values at the corresponding pixel position.

The ultrasound system of claim 12, wherein the image processor is operable to:
Detect at least one start point of at least one candidate needle in the binary mask image;
Form at least one candidate needle image for the at least one candidate needle based on the at least one start point; and
Select the needle image from the at least one candidate needle image.

The ultrasound system of claim 15, wherein the image processor is operable to:
Detect a start pixel and an end pixel of the at least one candidate needle image;
Determine at least one region based on the start pixel and the end pixel of the at least one candidate needle image; and
Select, as the needle image, the candidate needle image whose region has the maximum size among the at least one candidate needle image.

The ultrasound system of claim 15, wherein the image processor is operable to:
Detect a start pixel and an end pixel of the at least one candidate needle image;
Count the number of pixels of the at least one candidate needle image between the start pixel and the end pixel; and
Select, as the needle image, the candidate needle image having the maximum number of pixels among the at least one candidate needle image.

The ultrasound system of claim 11, wherein at least one ultrasound image of the plurality of ultrasound images is formed at a preset scanline angle, and wherein the image processor is operable to:
Form a first image representing the needle by multiplying the pixel values at each pixel position of the mask image and the pixel mask image;
Form an inverted mask image based on the pixel values of the mask image;
Form a second image from which the needle is removed by multiplying the pixel values at each pixel position of the inverted mask image and the at least one ultrasound image; and
Synthesize the first image and the second image to form the image of the object.

The ultrasound system of any one of claims 11, 12 and 14 to 18, wherein the scanline angle determiner is operable to:
Receive input information indicating an insertion angle and an insertion direction of the needle; and
Determine the plurality of scanline angles based on the input information.

The ultrasound system of any one of claims 11, 12 and 14 to 18, wherein the scanline angle determiner is operable to:
Set a first scanline angle of 0 degrees, a second scanline angle, and a third scanline angle having the same angle value as the second scanline angle but a different angular orientation with respect to the first scanline angle;
Acquire sample ultrasound data of the object at each of the first to third scanline angles;
Form a sample ultrasound image for each of the first to third scanline angles based on the acquired sample ultrasound data;
Determine an insertion direction of the needle based on the sample ultrasound images; and
Determine the plurality of scanline angles based on the insertion direction of the needle.
KR1020150007908A 2015-01-16 2015-01-16 Method and ultrasound system for generating image of target object KR102035991B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150007908A KR102035991B1 (en) 2015-01-16 2015-01-16 Method and ultrasound system for generating image of target object


Publications (2)

Publication Number Publication Date
KR20160088616A KR20160088616A (en) 2016-07-26
KR102035991B1 true KR102035991B1 (en) 2019-10-25

Family

ID=56680838

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150007908A KR102035991B1 (en) 2015-01-16 2015-01-16 Method and ultrasound system for generating image of target object

Country Status (1)

Country Link
KR (1) KR102035991B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200091624A (en) * 2019-01-23 2020-07-31 삼성메디슨 주식회사 Ultrasound diagnostic apparatus and method for operating the same

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002521168A (en) 1998-07-30 2002-07-16 ボストン サイエンティフィック リミテッド Method and apparatus for spatially and temporally filtering ultrasound image processing data in a blood vessel
JP2006150069A (en) 2004-10-20 2006-06-15 Toshiba Corp Ultrasonic diagnostic equipment, and control method therefor
JP2006320378A (en) 2005-05-17 2006-11-30 Ge Medical Systems Global Technology Co Llc Ultrasonic diagnostic device, ultrasonic image generation method and ultrasonic image generation program
JP2008012150A (en) 2006-07-07 2008-01-24 Toshiba Corp Ultrasonic diagnostic equipment and control program of ultrasonic diagnostic equipment
JP2010183935A (en) 2009-02-10 2010-08-26 Toshiba Corp Ultrasonic diagnostic apparatus and control program for ultrasonic diagnostic apparatus
JP2012213606A (en) * 2011-04-01 2012-11-08 Toshiba Corp Ultrasonic diagnostic apparatus, and control program
US8348848B1 (en) 2010-11-04 2013-01-08 Hitachi Aloka Medical, Ltd. Methods and apparatus for ultrasound imaging

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140066584A (en) 2012-11-23 2014-06-02 삼성메디슨 주식회사 Ultrasound system and method for providing guide line of needle


Also Published As

Publication number Publication date
KR20160088616A (en) 2016-07-26

Similar Documents

Publication Publication Date Title
US10278670B2 (en) Ultrasound diagnostic apparatus and method of controlling ultrasound diagnostic apparatus
JP4758355B2 (en) System for guiding medical equipment into a patient's body
US10231710B2 (en) Ultrasound diagnosis apparatus and ultrasound imaging method
US20170095226A1 (en) Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
KR101478623B1 (en) Ultrasound system and method for providing guide line of needle
WO2014081006A1 (en) Ultrasonic diagnostic device, image processing device, and image processing method
CN110403681B (en) Ultrasonic diagnostic apparatus and image display method
EP2823766A1 (en) Ultrasound system and method for providing object information
JP4989262B2 (en) Medical diagnostic imaging equipment
JP5797364B1 (en) Ultrasonic observation apparatus, operation method of ultrasonic observation apparatus, and operation program of ultrasonic observation apparatus
WO2007114375A1 (en) Ultrasound diagnostic device and control method for ultrasound diagnostic device
KR20180054360A (en) Ultrasonic diagnostic apparatus and method for controlling the same
KR20180090052A (en) Ultrasonic diagnostic apparatus and operating method for the same
US20150173721A1 (en) Ultrasound diagnostic apparatus, medical image processing apparatus and image processing method
KR20140066584A (en) Ultrasound system and method for providing guide line of needle
US8663110B2 (en) Providing an optimal ultrasound image for interventional treatment in a medical system
JP2018079070A (en) Ultrasonic diagnosis apparatus and scanning support program
JP2006246974A (en) Ultrasonic diagnostic equipment with reference image display function
US20210153847A1 (en) Ultrasonic imaging apparatus and control method thereof
KR101501517B1 (en) The method and apparatus for indicating a medical equipment on an ultrasound image
KR20120046539A (en) Ultrasound system and method for providing body mark
JP2018050655A (en) Ultrasonic diagnostic apparatus and medical image processing program
KR102035991B1 (en) Method and ultrasound system for generating image of target object
JP5468759B2 (en) Method and system for collecting a volume of interest based on position information
KR101143663B1 (en) Medical system and methdo for providing optimal ultrasound image for interventional treatment

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right