US20160174932A1 - Ultrasonic diagnostic device and ultrasonic image generation method - Google Patents
- Publication number
- US20160174932A1 (application US 15/055,143)
- Authority
- US
- United States
- Prior art keywords
- needle
- needle tip
- image
- unit
- ultrasonic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B8/0841—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
- A61B8/14—Echo-tomography
- A61B8/145—Echo-tomography characterised by scanning multiple planes
- A61B8/4488—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
- G01S15/8915—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
- G01S7/52036—Details of receivers using analysis of echo signal for target characterisation
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining for calculating health indices; for individual health risk assessment
- A61B2017/3413—Needle locating or guiding means guided by ultrasound
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
- A61M25/0105—Steering means as part of the catheter or advancing means; Markers for positioning
Definitions
- the present invention relates to an ultrasonic diagnostic device and an ultrasonic image generation method, and in particular, to an ultrasonic diagnostic device that visualizes the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue by visualizing the needle tip of the needle inserted into the subject in an ultrasonic image.
- this kind of ultrasonic diagnostic device includes an ultrasonic probe with built-in ultrasonic transducers and a device body connected to the ultrasonic probe, and generates an ultrasonic image by transmitting an ultrasonic wave toward a subject from the ultrasonic probe, receiving an ultrasonic echo from the subject using the ultrasonic probe, and performing electrical processing on the reception signal in the device body.
- the needle that is inserted so as to be inclined at a predetermined angle with respect to the skin surface of the subject is inclined with respect to the ultrasonic wave transmitting and receiving surface of the ultrasonic probe, as shown in FIG. 16A .
- the specular reflection wave from the needle may deviate from the reception opening.
- since the reflection at the needle tip is not a perfect specular reflection, a slight reflection returns to the reception opening.
- because the received signal strength is low, however, it is difficult to visualize the needle to the extent that the needle can be visually recognized.
- since the visualization depth is limited by tilting the ultrasonic beam, it may not be possible to draw the needle tip or the target tissue even if it is possible to draw the needle. Accordingly, the positional relationship between the needle direction or the needle tip and the target tissue is not known.
- JP2010-183935A focuses on the fact that the amount of high frequency components in the reflection signal from the needle tip portion is smaller than that in the reflection signal from portions other than the needle tip portion, and an ultrasonic image in which the position of the needle tip portion can be easily visually recognized is generated by capturing an image of the low frequency band and an image of the high frequency band, taking a difference therebetween, and superimposing the difference image on an image of another high frequency band.
- JP2012-213606A improves the visibility of both the body tissue and the puncture needle in a displayed image by capturing reflected waves from the puncture needle by performing a plurality of scans while changing the transmission direction of the ultrasonic wave, generating ultrasonic images with improved visibility of the puncture needle, generating a needle image based on the plurality of ultrasonic images with the changed transmission directions and the normal tissue image, and combining the normal tissue image and the needle image.
- in JP2010-183935A, the frequency difference between the reflection signal from the needle tip portion and the reflection signals from portions other than the needle tip portion is small. Therefore, also in the portions other than the needle tip portion, the same frequency may be obtained due to isolated point-like reflection or reflection conditions. For this reason, it is difficult to visualize only the needle tip.
- JP2012-213606A does not describe the visualization of the needle tip even though a plurality of needle images are generated by performing a scan in a plurality of directions.
- the present invention provides an ultrasonic diagnostic device that transmits an ultrasonic wave toward a subject from an ultrasonic probe and generates an ultrasonic image based on obtained reception data.
- the ultrasonic diagnostic device includes: a tissue image generation unit that generates a tissue image of the subject by transmitting a transmission wave in a normal direction of an ultrasonic wave transmitting and receiving surface of the ultrasonic probe and receiving a reception wave from the normal direction of the subject; a needle information generation unit that generates needle information of a needle inserted into the subject by steering at least one of the transmission wave and the reception wave; a needle direction estimation unit that estimates a direction of the needle based on the needle information generated by the needle information generation unit; a search region setting unit that sets a search region of a needle tip in the tissue image based on the needle direction estimated by the needle direction estimation unit; a needle tip search unit that searches for the needle tip in the search region set by the search region setting unit; and a needle tip visualizing unit that visualizes the needle tip on the tissue image based on a search result of the needle tip search unit.
- the needle information generation unit generates a plurality of pieces of the needle information with different steering directions by changing a steering direction to steer at least one of the transmission wave and the reception wave, and the needle direction estimation unit estimates the needle direction based on the plurality of pieces of needle information with different steering directions. It is preferable that the needle information generated by the needle information generation unit is needle image data.
- the needle direction estimation unit can estimate the needle direction by a Hough transform.
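The patent leaves the Hough-transform step abstract. As a minimal, illustrative sketch (not the patented implementation; the function name, threshold, and angular resolution are assumptions), the dominant straight line in a binarized needle image can be found by voting in (rho, theta) space and taking the accumulator peak:

```python
import numpy as np

def estimate_needle_direction(needle_img, threshold=0.5, n_theta=180):
    """Estimate the dominant line (rho, theta) in a needle image with a
    basic Hough transform. Pixels brighter than `threshold` vote for every
    line x*cos(theta) + y*sin(theta) = rho passing through them; the
    accumulator peak gives the needle direction."""
    h, w = needle_img.shape
    ys, xs = np.nonzero(needle_img > threshold)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(h, w)))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    # rho is offset by `diag` so accumulator indices stay non-negative
    rhos = np.round(np.outer(xs, np.cos(thetas))
                    + np.outer(ys, np.sin(thetas))).astype(int) + diag
    for j in range(n_theta):
        np.add.at(acc[:, j], rhos[:, j], 1)
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return r - diag, thetas[t]
```

In practice the needle image would first be thresholded or edge-detected; here any grayscale array in [0, 1] works.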
- the search region setting unit sets the search region that extends to both sides of the needle direction estimated by the needle direction estimation unit with a predetermined width.
- the needle tip search unit searches for a point, at which a brightness value is a maximum, in the search region as the needle tip.
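Combining the two preceding ideas, a sketch of a brightness-maximum search restricted to a band of a predetermined half-width around the estimated line might look as follows (the line parametrization x·cosθ + y·sinθ = ρ is the usual Hough convention; names and the default width are assumptions):

```python
import numpy as np

def find_needle_tip(tissue_img, rho, theta, half_width=5.0):
    """Return (x, y) of the brightest pixel within +/- `half_width` pixels
    of the line x*cos(theta) + y*sin(theta) = rho. A real system would
    also restrict the band to the insertion side of the image."""
    h, w = tissue_img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.abs(xs * np.cos(theta) + ys * np.sin(theta) - rho)
    # exclude everything outside the search band from the argmax
    masked = np.where(dist <= half_width, tissue_img, -np.inf)
    y, x = np.unravel_index(np.argmax(masked), masked.shape)
    return x, y
```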
- the needle tip search unit includes a needle tip pattern of the needle tip and searches for a point, at which a correlation with the needle tip pattern is a maximum, in the search region as the needle tip.
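One way to realize the pattern-based search is normalized cross-correlation between a stored tip template and every window of the search region; this sketch assumes 2-D grayscale arrays and is illustrative rather than the patented method:

```python
import numpy as np

def match_needle_tip(search_img, tip_pattern):
    """Slide `tip_pattern` over `search_img` and return the (x, y) offset
    with the highest normalized cross-correlation score."""
    ph, pw = tip_pattern.shape
    h, w = search_img.shape
    p = (tip_pattern - tip_pattern.mean()) / (tip_pattern.std() + 1e-12)
    best, best_pos = -np.inf, (0, 0)
    for y in range(h - ph + 1):
        for x in range(w - pw + 1):
            win = search_img[y:y + ph, x:x + pw]
            win = (win - win.mean()) / (win.std() + 1e-12)
            score = float((win * p).sum())
            if score > best:
                best, best_pos = score, (x, y)
    return best_pos
```

The brute-force double loop is O(image × pattern); an FFT-based correlation would be used for larger images.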
- the needle tip visualizing unit visualizes a point image having a predetermined size at a position of the needle tip.
- the needle tip visualizing unit visualizes a frame of a predetermined range from a position of the needle tip.
- the needle tip visualizing unit may change a brightness value or a color of the tissue image inside or outside the frame, or the needle tip visualizing unit may apply a translucent mask onto the tissue image inside or outside the frame.
- the needle tip search unit may compare a tissue image before movement of the needle tip with a tissue image after movement of the needle tip and search for the needle tip based on a change between the tissue images.
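The change-based search can be sketched as simple frame differencing: the advancing tip is the strongest new bright structure between the two frames. The threshold value and names below are assumptions:

```python
import numpy as np

def tip_from_motion(img_before, img_after, diff_threshold=0.2):
    """Return (x, y) of the largest positive change between two tissue
    frames, or None if nothing changed above `diff_threshold`."""
    diff = img_after.astype(float) - img_before.astype(float)
    diff[diff < diff_threshold] = 0.0   # ignore noise and darkening
    if diff.max() <= 0.0:
        return None
    y, x = np.unravel_index(np.argmax(diff), diff.shape)
    return x, y
```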
- the present invention provides an ultrasonic image generation method of transmitting an ultrasonic wave toward a subject from an ultrasonic probe and generating an ultrasonic image based on obtained reception data.
- the ultrasonic image generation method includes: generating a tissue image of the subject by transmitting a transmission wave from the ultrasonic probe and receiving a reception wave from the subject; generating needle information of a needle inserted into the subject by steering at least one of the transmission wave and the reception wave; estimating a direction of the needle based on the needle information; setting a search region of a needle tip in the tissue image based on the estimated needle direction; searching for the needle tip in the set search region; and visualizing the needle tip on the tissue image based on the found needle tip.
- according to the present invention, by specifying the position of the needle tip present in a deep portion of the subject during insertion and visualizing the needle tip in the tissue image, it is possible to visualize the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue in the tissue image.
- FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic device according to an embodiment of the present invention.
- FIG. 2 is a flowchart showing the operation of the ultrasonic diagnostic device shown in FIG. 1 .
- FIG. 3A is an explanatory view for explaining a scanning line V_i in a normal direction and a scanning line H_i in a steering direction in the ultrasonic diagnostic device shown in FIG. 1
- FIG. 3B is an explanatory view of a tissue image corresponding to the scanning line V_i in the normal direction
- FIG. 3C is an explanatory view of a needle image corresponding to the scanning line H_i in the steering direction.
- FIG. 4 is an example of a tissue image, which is generated by the ultrasonic diagnostic device shown in FIG. 1 and in which a needle direction L estimated from the needle image is visualized.
- FIG. 5 is an example of a tissue image in which a search region F, which is set based on the needle direction L in the tissue image shown in FIG. 4 , is visualized.
- FIG. 6 is an enlarged extraction image of a region W shown in FIG. 5 .
- FIG. 7 is an example when a needle tip N, which is a point image in FIG. 6 , is visualized.
- FIG. 8 is an example when a needle tip N, a needle tip region NF, a needle body NB, and a search region F are visualized in the tissue image generated by the ultrasonic diagnostic device shown in FIG. 1 .
- FIG. 9 is an example when the brightness value inside or outside the needle tip region NF in the tissue image shown in FIG. 8 is changed.
- FIG. 10 is an example when a translucent mask is applied onto the inside or the outside of the needle tip region NF in the tissue image shown in FIG. 8 .
- FIG. 11 is an explanatory view when performing transmission focus processing in the normal direction and reception focus processing in the needle direction in the ultrasonic diagnostic device shown in FIG. 1 .
- FIG. 12 is an explanatory view when selecting a needle image for estimating the needle direction from a plurality of needle images with different steering directions.
- FIG. 13 is a schematic diagram showing an example of the needle tip pattern.
- FIG. 14 is an explanatory view when searching for the needle tip based on the needle tip pattern.
- FIG. 15A is an example of a tissue image captured before the movement of the needle tip when capturing a plurality of tissue images with the movement of the needle tip
- FIG. 15B is an example of a tissue image captured after the movement of the needle tip.
- FIG. 16A is a diagram showing that the specular reflection of a needle by the ultrasonic beam in the normal direction deviates from the reception opening in the subject into which the needle is inserted
- FIG. 16B is a diagram showing that an ultrasonic echo based on reflection from the needle can be received by transmitting the ultrasonic beam in a state in which the ultrasonic beam is steered in the needle direction in the subject into which the needle is inserted.
- FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic device according to an embodiment of the present invention.
- the ultrasonic diagnostic device includes an ultrasonic probe 1 , and a transmission circuit 2 and a reception circuit 3 are connected to the ultrasonic probe 1 .
- a tissue image generation unit 4 and a needle image generation unit 5 are connected in parallel to the reception circuit 3 .
- a needle tip visualizing unit 9 is connected to the tissue image generation unit 4
- a display unit 11 is connected to the needle tip visualizing unit 9 through a display control unit 10 .
- a needle direction estimation unit 6 is connected to the needle image generation unit 5
- a needle tip search unit 8 is connected to the needle direction estimation unit 6 through a search region setting unit 7
- the needle tip search unit 8 is connected to the needle tip visualizing unit 9 .
- the search region setting unit 7 is connected to the tissue image generation unit 4 .
- a control unit 12 is connected to the transmission circuit 2 , the reception circuit 3 , the tissue image generation unit 4 , the needle image generation unit 5 , the needle tip visualizing unit 9 , the needle direction estimation unit 6 , the search region setting unit 7 , the needle tip search unit 8 , and the display control unit 10 .
- An operation unit 13 and a storage unit 14 are connected to the control unit 12 .
- the tissue image generation unit 4 includes a phasing addition section 15 A, a detection processing section 16 A, a digital scan converter (DSC) 17 A, and an image processing section 18 A, which are connected sequentially from the reception circuit 3 , and an image memory 19 A connected to the DSC 17 A.
- the needle image generation unit 5 includes a phasing addition section 15 B, a detection processing section 16 B, a digital scan converter (DSC) 17 B, and an image processing section 18 B, which are connected sequentially from the reception circuit 3 , and an image memory 19 B connected to the DSC 17 B.
- the ultrasonic probe 1 includes a plurality of elements arranged in a one-dimensional or two-dimensional array, and transmits an ultrasonic beam (transmission wave) based on a transmission signal supplied from the transmission circuit 2 , receives an ultrasonic echo (reception wave) from the subject, and outputs a reception signal.
- each element that forms the ultrasonic probe 1 is formed by a transducer in which electrodes are formed at both ends of the piezoelectric body formed of piezoelectric ceramic represented by lead zirconate titanate (PZT), a polymer piezoelectric element represented by polyvinylidene fluoride (PVDF), piezoelectric single crystal represented by lead magnesium niobate-lead titanate (PMN-PT), or the like.
- when a pulsed or continuous-wave transmission signal voltage is applied to the electrodes of the transducer, the piezoelectric body expands and contracts to generate pulsed or continuous-wave ultrasonic waves from each transducer. By combination of these ultrasonic waves, an ultrasonic beam is formed. In addition, the respective transducers expand and contract by receiving the propagating ultrasonic waves, thereby generating electrical signals. These electrical signals are output as reception signals of the ultrasonic waves.
- the transmission circuit 2 includes a plurality of pulsers, for example.
- the transmission circuit 2 performs transmission focus processing so that ultrasonic waves transmitted from the plurality of elements of the ultrasonic probe 1 form an ultrasonic beam based on the transmission delay pattern selected according to the control signal from the control unit 12 , adjusts the amount of delay of each transmission signal, and supplies the adjusted signals to the plurality of elements.
- the ultrasonic beam from the ultrasonic probe 1 can be steered at a predetermined angle with respect to the normal direction of the ultrasonic wave transmitting and receiving surface.
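The delay law behind such steering is simple geometry: an element at lateral position x fires earlier or later by x·sin(θ)/c. A sketch for a linear array (the function name is illustrative; the default sound speed of 1540 m/s is the usual soft-tissue assumption):

```python
import numpy as np

def steering_delays(n_elements, pitch_m, steer_deg, c=1540.0):
    """Per-element transmit delays (seconds) that steer a plane wave from
    a linear array by `steer_deg` away from the normal of the transmitting
    surface. Delays are shifted so the earliest-firing element has 0."""
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch_m
    tau = x * np.sin(np.radians(steer_deg)) / c
    return tau - tau.min()
```

The same pattern, applied on reception instead of transmission, steers the reception opening toward the needle.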
- the reception circuit 3 performs amplification and A/D conversion of the analog reception signals output from the plurality of elements of the ultrasonic probe 1 , and outputs digital reception signals to the phasing addition section 15 A of the tissue image generation unit 4 or the phasing addition section 15 B of the needle image generation unit 5 or to both of the phasing addition section 15 A of the tissue image generation unit 4 and the phasing addition section 15 B of the needle image generation unit 5 in response to the instruction from the control unit 12 .
- the phasing addition section 15 A of the tissue image generation unit 4 acquires the digital reception signals from the reception circuit 3 in response to the instruction from the control unit 12 , and performs reception focus processing by delaying the reception signals based on the reception delay pattern from the control unit 12 and adding the delayed reception signals.
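Phasing addition (delay-and-sum) can be sketched as follows. Real beamformers apply fractional, depth-dependent delays per sample; integer sample shifts are enough to show the principle (array shapes and names are illustrative):

```python
import numpy as np

def delay_and_sum(rf, delays_samples):
    """Delay each channel's sampled reception signal by an integer number
    of samples and sum across channels. `rf` has shape
    (n_channels, n_samples); echoes from the focal point add coherently."""
    n_ch, n_s = rf.shape
    out = np.zeros(n_s)
    for ch, d in enumerate(delays_samples):
        out[d:] += rf[ch, :n_s - d]   # shift channel ch right by d samples
    return out
```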
- reception focus processing reception data (sound ray signal) based on the ultrasonic echo from the target tissue is generated.
- the detection processing section 16 A generates a B-mode image signal, which is tomographic image information regarding a tissue within the subject, by correcting the attenuation due to the distance according to the depth of the reflection position of the ultrasonic wave and then performing envelope detection processing for the reception data.
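A rough sketch of this pre-processing: a depth-dependent gain compensates attenuation (time-gain compensation), and envelope detection uses an FFT-based analytic signal. The exponential attenuation model and parameter names are assumptions, not the patent's method:

```python
import numpy as np

def envelope_with_tgc(rx, fs, alpha_db_per_s):
    """Apply time-gain compensation to one sampled reception line, then
    detect its envelope via the analytic signal (Hilbert transform by FFT:
    zero negative frequencies, double positive ones)."""
    n = rx.size
    t = np.arange(n) / fs
    gain = 10.0 ** (alpha_db_per_s * t / 20.0)   # deeper echoes amplified more
    x = rx * gain
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))
```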
- the DSC 17 A converts the B-mode image signal generated by the detection processing section 16 A into an image signal according to the normal television signal scanning method (raster conversion). In addition, by converting the B-mode image signal in the DSC 17 A, it is possible to grasp the positional relationship or the distance corresponding to the tissue of the actual subject on the B-mode image.
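For a linear scan, raster conversion amounts to resampling the (scan line, depth sample) grid onto uniform pixels in metres, which is what makes on-screen distances correspond to distances in the subject. A nearest-neighbour sketch (pixel size and names are assumptions):

```python
import numpy as np

def scan_convert_linear(beam_data, pitch_m, fs, c=1540.0, px_m=1e-4):
    """Map (scan line, sample) B-mode data onto a uniform pixel grid by
    nearest-neighbour lookup. Depth comes from round-trip travel time."""
    n_lines, n_samples = beam_data.shape
    depth_m = n_samples * c / (2.0 * fs)   # sample spacing is c / (2 fs)
    width_m = n_lines * pitch_m
    ny = int(round(depth_m / px_m))
    nx = int(round(width_m / px_m))
    out = np.zeros((ny, nx))
    for iy in range(ny):
        s = min(int(iy * px_m * 2.0 * fs / c), n_samples - 1)
        for ix in range(nx):
            li = min(int(ix * px_m / pitch_m), n_lines - 1)
            out[iy, ix] = beam_data[li, s]
    return out
```

Production scan converters interpolate bilinearly, and for steered or sector scans the lookup geometry is angular rather than rectilinear.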
- the image processing section 18 A generates a B-mode image signal of the tissue image by performing various kinds of required image processing, such as gradation processing, on the B-mode image signal input from the DSC 17 A.
- the phasing addition section 15 B of the needle image generation unit 5 acquires the digital reception signals from the reception circuit 3 in response to the instruction from the control unit 12 , and performs reception focus processing by delaying the reception signals based on the reception delay pattern from the control unit 12 and adding the delayed reception signals.
- the phasing addition section 15 B generates reception data (sound ray signal) based on the ultrasonic echo from the needle, which is steered at a predetermined angle with respect to the normal direction of the ultrasonic wave transmitting and receiving surface by adjusting the amount of delay of each reception signal.
- similar to the detection processing section 16 A, the detection processing section 16 B generates a B-mode image signal, which is tomographic image information regarding a tissue within the subject, by correcting the attenuation due to the distance according to the depth of the reflection position of the ultrasonic wave and then performing envelope detection processing for the reception data.
- the DSC 17 B converts the B-mode image signal generated by the detection processing section 16 B into an image signal according to the normal television signal scanning method (raster conversion).
- by converting the B-mode image signal in the DSC 17 B, it is possible to grasp the positional relationship or the distance corresponding to the tissue of the actual subject on the B-mode image.
- the image processing section 18 B generates a B-mode image signal of the needle image from the B-mode image signal input from the DSC 17 B.
- the needle direction estimation unit 6 estimates a needle direction, which indicates a direction in which the needle inserted into the subject is present, from the B-mode image signal of the needle image output from the image processing section 18 B, and generates needle direction information indicating the position of the needle direction.
- the search region setting unit 7 acquires the needle direction information from the needle direction estimation unit 6 and acquires the B-mode image signal of the tissue image from the image processing section 18 A of the tissue image generation unit 4 , visualizes a needle direction on the tissue image based on the needle direction information, and sets a search region for searching for the needle tip based on the needle direction on the tissue image. For example, a region that extends to both sides of the needle direction with a predetermined width may be set as a search region.
- the needle tip search unit 8 generates the position information of the needle tip by searching for the needle tip in the search region, which is set by the search region setting unit 7 , in the tissue image in which the needle direction and the search region are set.
- the needle tip visualizing unit 9 acquires the position information of the needle tip from the needle tip search unit 8 and acquires the B-mode image signal of the tissue image from the image processing section 18 A of the tissue image generation unit 4 , and visualizes the needle tip on the tissue image.
- the needle tip visualizing unit 9 may visualize a needle direction from the needle tip to the base of the needle based on the needle direction information, or may visualize a search region based on the information of the search region.
- the display control unit 10 acquires a B-mode image signal of the tissue image in which the needle tip is visualized by the needle tip visualizing unit 9 , and displays the tissue image in which the needle tip is visualized on the display unit 11 .
- the display unit 11 includes a display device, such as an LCD, and displays a tissue image, which is an ultrasonic image, under the control of the display control unit 10 .
- the control unit 12 controls each unit based on the instruction input from the operation unit by the operator. As described above, the control unit 12 selects and outputs a transmission delay pattern for the transmission circuit 2 or selects and outputs a reception delay pattern for the reception circuit 3 , and outputs an instruction on phasing addition or the correction of attenuation and envelope detection processing, based on the reception delay pattern or the transmission delay pattern, to the phasing addition section 15 A or the detection processing section 16 A of the tissue image generation unit 4 or to the phasing addition section 15 B or the detection processing section 16 B of the needle image generation unit 5 .
- the operation unit 13 is used when the operator performs an input operation, and can be formed by a keyboard, a mouse, a trackball, a touch panel, and the like.
- Various kinds of information input from the operation unit 13, information based on the above-described transmission delay pattern or reception delay pattern, information regarding the sound speed in an inspection target region of the subject, the focal position of the ultrasonic beam, and the transmission opening and the reception opening of the ultrasonic probe 1, an operation program required for the control of each unit, and the like are stored in the storage unit 14.
- Recording media such as a hard disk, a flexible disk, an MO, an MT, a RAM, a CD-ROM, and a DVD-ROM can be used as the storage unit 14.
- FIG. 2 is a flowchart showing the operation of an embodiment.
- Here, i denotes the order of the scanning lines of the ultrasonic probe 1, and the ultrasonic probe 1 acquires a reception signal corresponding to each scanning line.
- In step S2, the ultrasonic probe 1 acquires a reception signal corresponding to the scanning line V_1 in the normal direction by transmitting the ultrasonic beam toward the target tissue T in the normal direction of the ultrasonic wave transmitting and receiving surface S and receiving the ultrasonic echo from the normal direction of the ultrasonic wave transmitting and receiving surface S, and the tissue image generation unit 4 generates a tissue image corresponding to the normal direction scanning line V_1 shown in FIG. 3B and stores the tissue image in the image memory 19A.
- In step S3, the ultrasonic probe 1 acquires a reception signal corresponding to the scanning line H_1 in the steering direction by transmitting the ultrasonic beam in the steering direction, which is steered by the predetermined angle θ toward the needle direction from the normal direction of the ultrasonic wave transmitting and receiving surface S, and receiving the ultrasonic echo from the steering direction, and the needle image generation unit 5 generates a needle image corresponding to the steering direction scanning line H_1 shown in FIG. 3C and stores the needle image in the image memory 19B.
- The predetermined angle θ may be a fixed value set in advance, or may be acquired from a device (not shown) for calculating the angle formed by the normal direction of the probe and the insertion direction of the needle. Alternatively, a direction in which a strong signal is returned after transmitting and receiving signals in a plurality of directions in advance may be set as the predetermined angle.
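The text does not give the delay formula behind such steering, but steering a beam by the predetermined angle θ is conventionally done with a linear transmit delay ramp across the array elements. The sketch below assumes a uniform-pitch linear array and a nominal tissue sound speed; the function name and parameters are illustrative, not from the source.

```python
import numpy as np

def steering_delays(n_elements, pitch_m, angle_rad, c=1540.0):
    """Per-element transmit delays (in seconds) that steer a plane wave
    by angle_rad from the normal of a uniform linear array.
    c is an assumed average speed of sound in tissue (m/s)."""
    x = np.arange(n_elements) * pitch_m   # element positions along the array
    delays = x * np.sin(angle_rad) / c    # linear delay ramp
    return delays - delays.min()          # shift so all delays are non-negative

# e.g. an array with 0.3 mm element pitch steered by 30 degrees
d = steering_delays(8, 0.0003, np.radians(30.0))
```

Adjacent elements then fire a constant time apart (pitch·sin θ / c), which is how the transmission circuit 2 described later realizes a steered beam from a transmission delay pattern.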
- In step S5, i is increased by 1, that is, the process moves to the second scanning line, and steps S2 to S4 are repeated to generate B-mode image signals of the corresponding tissue image and needle image.
- Steps S2 and S3 are repeated.
- When B-mode image signals of tissue images for all of the "n" scanning lines V_1 to V_n and B-mode image signals of needle images for all of the "n" scanning lines H_1 to H_n are generated, the process proceeds from step S4 to step S6.
- In step S6, the needle direction estimation unit 6 estimates a needle direction L based on the B-mode image signal obtained by scan-converting the needle image stored in the image memory 19B and performing image processing.
- The estimation of the needle direction is performed by calculating the brightness distribution in the entire needle image or in a predetermined region in which it is assumed that a needle is included, detecting a straight line in the entire needle image or in the predetermined region by Hough conversion, setting the straight line as the needle direction, and setting the position information of the needle direction as the needle direction information.
- When a straight line is detected by Hough conversion, the brightness value may be multiplied as a weighting factor when converting each pixel to a curve in the ρθ coordinate system and superimposing the curves on each other. Through this method, a high-brightness straight line, such as a needle, can be easily detected.
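The brightness-weighted Hough voting described above can be sketched as follows. This is a minimal NumPy illustration, not the device's implementation; the image size, bin resolution, and function name are illustrative.

```python
import numpy as np

def weighted_hough_line(img, n_theta=180):
    """Detect the dominant straight line in a brightness image.

    Each pixel votes for the (rho, theta) curves passing through it,
    weighted by its brightness, so a high-brightness line such as a
    needle dominates the accumulator."""
    h, w = img.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta))
    ys, xs = np.nonzero(img)
    for x, y, wgt in zip(xs, ys, img[ys, xs]):
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        rho_idx = np.round(rhos).astype(int) + diag
        acc[rho_idx, np.arange(n_theta)] += wgt   # brightness-weighted vote
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return i - diag, thetas[j]                    # (rho, theta) of the line

# synthetic needle image: a bright diagonal line y = x on a dim background
img = np.full((64, 64), 10.0)
for t in range(64):
    img[t, t] = 255.0
rho, theta = weighted_hough_line(img)   # theta close to 3*pi/4 for y = x
```

The weighting means the dim background contributes little to any accumulator bin, while the 255-level needle pixels concentrate their votes in a single (ρ, θ) bin.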
- the needle direction information of the needle direction L estimated by the needle direction estimation unit 6 is output to the search region setting unit 7 .
- In step S7, the search region setting unit 7 superimposes a signal in the needle direction L on the B-mode image signal, which is obtained by scan-converting the tissue image stored in the image memory 19A and performing image processing, based on the needle direction information output from the needle direction estimation unit 6 as shown in FIG. 4, and sets a search region F extending from the needle direction L of the tissue image to both sides of the needle direction L with a predetermined width r as shown in FIG. 5.
- the B-mode image signal of the tissue image in which the needle direction L and the search region F are set is output to the needle tip search unit 8 .
- the predetermined width r may be set to three to five times the width of the needle based on the width of the needle inserted into the body.
- In step S8, the needle tip search unit 8 calculates the brightness distribution of the tissue image, and determines a maximum brightness point B in the search region F as the needle tip, as shown in FIG. 6, which is an enlarged extraction of the region W in FIG. 5.
- the needle tip search unit 8 may have a needle tip pattern, such as an image of the needle tip, in advance, take a correlation with the needle tip pattern in the tissue image in the search region F, and determine a point at which the correlation is the maximum as the needle tip.
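The maximum-brightness variant of this search can be sketched in a few lines; pixels outside the search region are excluded so that a bright artifact elsewhere in the tissue image cannot be mistaken for the tip. The image and mask below are illustrative.

```python
import numpy as np

def find_needle_tip(image, region_mask):
    """Return (row, col) of the maximum-brightness point inside the
    search region; pixels outside the region are excluded."""
    masked = np.where(region_mask, image.astype(float), -np.inf)
    return np.unravel_index(np.argmax(masked), image.shape)

# a bright echo inside the search region, and a brighter artifact outside
img = np.zeros((10, 10))
img[4, 7] = 200.0            # needle tip candidate inside the region
img[0, 0] = 255.0            # bright artifact outside the region
mask = np.zeros((10, 10), bool)
mask[3:6, :] = True          # search region around the needle line
tip = find_needle_tip(img, mask)
```

The pattern-correlation alternative mentioned above would replace the `argmax` over raw brightness with an `argmax` over a correlation map between the tissue image and a stored needle tip template.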
- the position information of the needle tip found by the needle tip search unit 8 is output to the needle tip visualizing unit 9 .
- the needle tip search unit 8 may output needle direction information or the information of the search region together with the position information of the needle tip.
- In step S9, the needle tip visualizing unit 9 visualizes a needle tip N, which is a point image having a predetermined size, in the tissue image from the position information of the needle tip found by the needle tip search unit 8.
- the B-mode image signal of the tissue image in which the needle tip is visualized is output to the display control unit 10 , and is displayed as a tissue image in which the needle tip is visualized on the display unit 11 .
- In the above description, the needle tip visualizing unit 9 visualizes the needle tip N in the tissue image, but it is also possible to adopt various kinds of display methods for making the needle tip clear in the tissue image.
- As shown in FIG. 8, a circular frame showing a needle tip region NF that extends by a predetermined radius from the position of the needle tip may be displayed; the search region F may be displayed based on the information of the above-described search region; or a needle body NB, which is obtained by visualizing the needle direction L from the needle tip N or by connecting the needle tip N to the base portion of the needle direction L with a straight line, may be visualized based on the needle direction L.
- the needle tip visualizing unit 9 may change the brightness value or the color of a tissue image inside or outside the needle tip region NF surrounded by the circular frame as shown in FIG. 9 , or may apply a translucent mask onto the tissue image of the inside or the outside of the needle tip region NF surrounded by the circular frame as shown in FIG. 10 .
- Alternatively, a frame having a predetermined shape, for example, a rectangular frame or a rhombic frame centered on the position of the needle tip, may be displayed.
- By emphasizing the needle tip through visualizing it in the tissue image as described above, the needle tip can be easily visually recognized in the tissue image. Therefore, it is possible to clearly grasp the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue.
- As shown in FIG. 11, a tissue image can be generated by performing transmission focus processing on the ultrasonic wave toward a predetermined focal point in the normal direction of the ultrasonic wave transmitting and receiving surface and performing reception focus processing on the ultrasonic echo from the target tissue in the normal direction of the ultrasonic wave transmitting and receiving surface, and a needle image can be generated by performing reception focus processing on the ultrasonic echo from the needle in the R direction indicated by the dotted arrow.
- In the above description, the needle direction is estimated based on one needle image.
- a plurality of needle images with different steering directions may be generated by changing the steering direction for steering at least one of the direction of transmission focus processing, which is the transmission direction of the ultrasonic beam, and the direction of reception focus processing of the ultrasonic echo, the sharpest needle image among the plurality of needle images may be selected, and the above-described needle direction may be estimated based on the selected sharpest needle image.
- the needle direction estimation unit 6 acquires a plurality of needle images with different steering directions from the needle image generation unit 5 , and selects a needle image in which the needle is visualized best as shown in FIG. 12 .
- the brightness distribution in the entire needle image or a predetermined region in which it is assumed that a needle is included may be calculated for each needle image, and a needle image including a point of the highest brightness value may be selected or a needle image having a maximum average brightness value may be selected, for example.
- it is possible to estimate the needle direction by selecting a needle image in which the needle is visualized best. That is, a direction perpendicular to the steering direction in which the needle is visualized best can be estimated to be the needle direction.
- the brightness distribution in the entire needle image or a predetermined region in which it is assumed that a needle is included may be calculated for a plurality of needle images with different steering directions, a straight line may be detected by the Hough conversion or the like, and a needle image in which the average brightness value of the straight line is the maximum may be selected.
- a needle image having a point of the maximum brightness value on the straight line, which is higher than points of the maximum brightness values on the straight lines in the other needle images may also be selected.
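The selection among steering directions described above can be sketched with a simple per-image score. The sketch uses the brightest-point criterion; the average brightness, or the brightness along a detected straight line, could be substituted as the score. The images and function name are illustrative.

```python
import numpy as np

def select_best_needle_image(images):
    """Select the steering direction whose needle image visualizes the
    needle best, here scored by the brightest point in each image."""
    scores = [float(img.max()) for img in images]
    return int(np.argmax(scores))

# three hypothetical needle images with different steering directions;
# the second contains the strongest needle reflection
imgs = [np.full((8, 8), 20.0) for _ in range(3)]
imgs[1][4, 4] = 250.0
best = select_best_needle_image(imgs)
```

The index of the winning image identifies the steering direction that is closest to perpendicular to the needle, from which the needle direction can be estimated as described.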
- The predetermined region in which it is assumed that a needle is included is determined from the approximate insertion angle, for example.
- the needle direction estimation unit 6 estimates the needle direction based on the selected needle image.
- the needle direction estimation unit 6 may calculate the brightness distribution in the entire needle image or a predetermined region in which it is assumed that a needle is included using all of a plurality of needle images with different steering directions, detect a straight line based on the brightness distribution in the entire needle image or the predetermined region by the Hough conversion or the like, and set the straight line as a needle direction.
- the needle tip search unit 8 determines the maximum brightness point B in the search region F as a needle tip.
- the needle tip search unit 8 may have a needle tip pattern in advance and search for the needle tip based on the needle tip pattern.
- As the needle tip pattern, for example, as shown in FIG. 13, an image may be used that consists of a line segment of a predetermined length d connecting the needle tip and the end of the cut surface of the needle, with a high-brightness point at each end of the line segment.
- the needle tip search unit 8 may have the above-described needle tip pattern. Then, as shown in FIG. 14 , the needle tip search unit 8 may search for a high-brightness point B 1 and a high-brightness point B 2 , which are considered to be the most correlated with the needle tip pattern in the search region F, and determine the high-brightness point B 1 located in a deep portion of the subject, between the high-brightness point B 1 and the high-brightness point B 2 , as a needle tip.
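The two-point rule above (take the deeper of the two strong echoes on the needle line) can be sketched directly, here by taking the two brightest points in the search region instead of a full template correlation. The image values and helper name are illustrative.

```python
import numpy as np

def deeper_high_brightness_point(image, region_mask, n_candidates=2):
    """Take the two brightest points in the search region (assumed to be
    the needle tip and the end of the needle's cut surface) and return
    the one lying deeper in the subject, i.e. with the larger row index."""
    masked = np.where(region_mask, image.astype(float), -np.inf)
    order = np.argsort(masked, axis=None)[-n_candidates:]
    points = [np.unravel_index(i, image.shape) for i in order]
    return max(points, key=lambda p: p[0])   # row index = depth

# two bright echoes on the needle line; the deeper one is taken as the tip
img = np.zeros((10, 10))
img[2, 3] = 250.0   # shallower end of the cut surface
img[7, 4] = 240.0   # deeper point, corresponding to the needle tip
tip = deeper_high_brightness_point(img, np.ones((10, 10), bool))
```

In a full implementation the two candidates would come from correlating the tissue image with the stored needle tip pattern rather than from raw brightness alone.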
- the needle tip search unit 8 may search for the needle tip by comparing the tissue image before the movement with the tissue image after the movement.
- the needle tip search unit 8 may calculate the brightness distribution in each of the tissue image before movement and the tissue image after movement and search for the needle tip based on the change in the brightness value.
- For example, comparing FIG. 15A, which is a tissue image before movement, with FIG. 15B, which is a tissue image after movement, a point P2 where the brightness value suddenly becomes large in FIG. 15B may be determined as the needle tip, or a point P1 where the brightness value suddenly becomes small may be determined as the needle tip.
- Alternatively, the point P2 may be determined as the needle tip based on the fact that the point P2 where the brightness value suddenly becomes large and the point P1 where the brightness value suddenly becomes small are adjacent to each other.
- a needle tip pattern image of the brightness change including the point P 2 where the brightness value becomes suddenly large and the point P 1 where the brightness value becomes suddenly small may be prepared in advance, and a point considered to be the most correlated with the needle tip pattern in the search region F in the image of the brightness change between the tissue image before movement and the tissue image after movement may be searched for and determined to be the needle tip.
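The simple increase/decrease variant of this change-based search can be sketched as a frame difference restricted to the search region. The frames below are illustrative; a real implementation would work on the scan-converted B-mode images.

```python
import numpy as np

def tip_from_frame_change(before, after, region_mask):
    """Locate the moving needle tip from inter-frame brightness change:
    P2 = point of the largest increase (tip at its new position),
    P1 = point of the largest decrease (tip's previous position)."""
    diff = np.where(region_mask, after.astype(float) - before, 0.0)
    p2 = np.unravel_index(np.argmax(diff), diff.shape)   # sudden increase
    p1 = np.unravel_index(np.argmin(diff), diff.shape)   # sudden decrease
    return p1, p2

# the tip echo disappears at (3, 3) and reappears at (5, 5)
before = np.zeros((12, 12)); before[3, 3] = 200.0
after = np.zeros((12, 12)); after[5, 5] = 200.0
p1, p2 = tip_from_frame_change(before, after, np.ones((12, 12), bool))
```

The adjacency check mentioned in the text would then accept P2 as the tip only if P1 lies within some small distance of it.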
- the needle tip search unit 8 may compare the tissue image before movement with the tissue image after movement, calculate the amount of movement and the movement direction between the images at each point in a predetermined region including the needle tip by the two-dimensional correlation operation or the like, and determine a point of the largest amount of movement or a point of the largest spatial change in the amount of movement or the movement direction as the needle tip.
- the needle tip search unit 8 may compare the tissue image before movement with the tissue image after movement, calculate a change before and after movement in the image pattern near each point in a predetermined region including the needle tip by the two-dimensional correlation operation or the like, and determine a point of the largest image pattern change or a point of the largest spatial change of the image pattern change as the needle tip.
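The two-dimensional correlation operation mentioned above amounts to estimating, for each point, how far its local image patch moved between the frames. A minimal block-matching sketch (an exhaustive small-window search, not the device's implementation; frame contents are illustrative):

```python
import numpy as np

def largest_motion_point(before, after, block=3, search=2):
    """For each interior point, estimate the displacement of its
    (2*block+1)-square patch between two frames by exhaustive matching,
    and return the point with the largest displacement magnitude."""
    h, w = before.shape
    # try small displacements first so a static region scores zero motion
    disps = sorted(((dy, dx)
                    for dy in range(-search, search + 1)
                    for dx in range(-search, search + 1)),
                   key=lambda d: d[0] * d[0] + d[1] * d[1])
    best_pt, best_mag = None, -1.0
    for y in range(search + block, h - search - block):
        for x in range(search + block, w - search - block):
            ref = before[y - block:y + block + 1, x - block:x + block + 1]
            best_err, best_d = None, (0, 0)
            for dy, dx in disps:
                cand = after[y + dy - block:y + dy + block + 1,
                             x + dx - block:x + dx + block + 1]
                err = float(np.sum((ref - cand) ** 2))
                if best_err is None or err < best_err:
                    best_err, best_d = err, (dy, dx)
            mag = float(np.hypot(*best_d))
            if mag > best_mag:
                best_mag, best_pt = mag, (y, x)
    return best_pt, best_mag

# a single bright reflector moves 2 pixels to the right between frames
before = np.zeros((16, 16)); before[7, 7] = 200.0
after = np.zeros((16, 16)); after[7, 9] = 200.0
pt, mag = largest_motion_point(before, after)
```

Since the tip moves with the needle while the surrounding tissue stays nearly static, the point of largest estimated motion (or the largest spatial change in the motion field) marks the tip.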
- In the above-described embodiment, the needle image generation unit 5 generates a needle image, and the needle direction estimation unit 6 estimates a needle direction based on the needle image.
- However, the needle direction may instead be estimated based on the reception signal from each element of the ultrasonic probe 1, or based on the reception data (sound ray signal) after phasing addition.
- 12 control unit
- 19 A, 19 B image memory
- V_i normal direction scanning line
Description
- This application is a continuation application of International Application PCT/JP2014/062064 filed on May 1, 2014, which claims priority under 35 U.S.C. 119(a) to Application No. 2013-179830 filed in Japan on Aug. 30, 2013, all of which are hereby expressly incorporated by reference into the present application.
- 1. Field of the Invention
- The present invention relates to an ultrasonic diagnostic device and an ultrasonic image generation method, and in particular, to an ultrasonic diagnostic device that visualizes the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue by visualizing the needle tip of the needle inserted into the subject in an ultrasonic image.
- 2. Description of the Related Art
- Conventionally, in the medical field, an ultrasonic diagnostic device using an ultrasonic image has been put into practical use. In general, this kind of ultrasonic diagnostic device includes an ultrasonic probe with built-in ultrasonic transducers and a device body connected to the ultrasonic probe, and generates an ultrasonic image by transmitting an ultrasonic wave toward a subject from the ultrasonic probe, receiving an ultrasonic echo from the subject using the ultrasonic probe, and performing electrical processing on the reception signal in the device body.
- When visualizing the needle inserted into the subject in an ultrasonic image, the needle, which is inserted so as to be inclined at a predetermined angle with respect to the skin surface of the subject, is inclined with respect to the ultrasonic wave transmitting and receiving surface of the ultrasonic probe, as shown in FIG. 16A. Accordingly, when transmitting the ultrasonic beam toward the target tissue from the transmission and reception opening, the specular reflection wave from the needle may deviate from the reception opening. In this case, it is known that a needle image cannot be visualized since the reception opening cannot receive the reflected wave from the needle. In addition, since the reflection at the needle tip is not a perfect specular reflection, a slight reflection returns to the reception opening. However, since the received signal strength is low, it is difficult to visualize the needle to the extent that it can be visually recognized.
- In contrast, as shown in FIG. 16B, measures for receiving the reflected wave from the needle by tilting the ultrasonic beam so as to be perpendicular to the needle have been taken.
- However, since the visualization depth is limited by tilting the ultrasonic beam, it is not possible to draw either the needle tip or the target tissue even if it is possible to draw the needle. Accordingly, the positional relationship between the needle direction or the needle tip and the target tissue is not known.
- JP2010-183935A focuses on the fact that the amount of high frequency components in the reflection signal from the needle tip portion is smaller than that in the reflection signal from portions other than the needle tip portion, and an ultrasonic image in which the position of the needle tip portion can be easily visually recognized is generated by capturing an image of the low frequency band and an image of the high frequency band, taking a difference therebetween, and superimposing the difference image on an image of another high frequency band.
- In addition, JP2012-213606A improves the visibility of both the body tissue and the puncture needle in a displayed image by capturing reflected waves from the puncture needle by performing a plurality of scans while changing the transmission direction of the ultrasonic wave, generating ultrasonic images with improved visibility of the puncture needle, generating a needle image based on the plurality of ultrasonic images with the changed transmission directions and the normal tissue image, and combining the normal tissue image and the needle image.
- In JP2010-183935A, however, the frequency difference between the reflection signal from the needle tip portion and reflection signals from portions other than the needle tip portion is small. Therefore, also in the portions other than the needle tip portion, the same frequency may be obtained due to isolated point-like reflection or reflection conditions. For this reason, it is difficult to visualize only the needle tip.
- In addition, JP2012-213606A does not describe the visualization of the needle tip even though a plurality of needle images are generated by performing a scan in a plurality of directions.
- It is an object of the present invention to provide an ultrasonic diagnostic device and an ultrasonic image generation method for specifying the position of the needle tip present in a deep portion of the subject and visualizing the needle tip in a tissue image in the ultrasonic diagnosis with the insertion of the needle into the subject.
- In order to solve the aforementioned problem, the present invention provides an ultrasonic diagnostic device that transmits an ultrasonic wave toward a subject from an ultrasonic probe and generates an ultrasonic image based on obtained reception data. The ultrasonic diagnostic device includes: a tissue image generation unit that generates a tissue image of the subject by transmitting a transmission wave in a normal direction of an ultrasonic wave transmitting and receiving surface of the ultrasonic probe and receiving a reception wave from the normal direction of the subject; a needle information generation unit that generates needle information of a needle inserted into the subject by steering at least one of the transmission wave and the reception wave; a needle direction estimation unit that estimates a direction of the needle based on the needle information generated by the needle information generation unit; a search region setting unit that sets a search region of a needle tip in the tissue image based on the needle direction estimated by the needle direction estimation unit; a needle tip search unit that searches for the needle tip in the search region set by the search region setting unit; and a needle tip visualizing unit that visualizes the needle tip on the tissue image based on the needle tip found by the needle tip search unit.
- Preferably, the needle information generation unit generates a plurality of pieces of the needle information with different steering directions by changing a steering direction to steer at least one of the transmission wave and the reception wave, and the needle direction estimation unit estimates the needle direction based on the plurality of pieces of needle information with different steering directions. It is preferable that the needle information generated by the needle information generation unit is needle image data.
- The needle direction estimation unit can estimate the needle direction by Hough conversion.
- It is preferable that the search region setting unit sets the search region that extends to both sides of the needle direction estimated by the needle direction estimation unit with a predetermined width.
- It is preferable that the needle tip search unit searches for a point, at which a brightness value is a maximum, in the search region as the needle tip.
- It is preferable that the needle tip search unit includes a needle tip pattern of the needle tip and searches for a point, at which a correlation with the needle tip pattern is a maximum, in the search region as the needle tip.
- It is preferable that the needle tip visualizing unit visualizes a point image having a predetermined size at a position of the needle tip.
- It is preferable that the needle tip visualizing unit visualizes a frame of a predetermined range from a position of the needle tip. The needle tip visualizing unit may change a brightness value or a color of the tissue image inside or outside the frame, or the needle tip visualizing unit may apply a translucent mask onto the tissue image inside or outside the frame.
- When generating a series of plural tissue images with movement of the needle tip, the needle tip search unit may compare a tissue image before movement of the needle tip with a tissue image after movement of the needle tip and search for the needle tip based on a change between the tissue images.
- In addition, the present invention provides an ultrasonic image generation method of transmitting an ultrasonic wave toward a subject from an ultrasonic probe and generating an ultrasonic image based on obtained reception data. The ultrasonic image generation method includes: generating a tissue image of the subject by transmitting a transmission wave from the ultrasonic probe and receiving a reception wave from the subject; generating needle information of a needle inserted into the subject by steering at least one of the transmission wave and the reception wave; estimating a direction of the needle based on the needle information; setting a search region of a needle tip in the tissue image based on the estimated needle direction; searching for the needle tip in the set search region; and visualizing the needle tip on the tissue image based on the found needle tip.
- According to the present invention, during the insertion, by specifying the position of the needle tip present in a deep portion of the subject and visualizing the needle tip in the tissue image, it is possible to visualize the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue in the tissue image.
- FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic device according to an embodiment of the present invention.
- FIG. 2 is a flowchart showing the operation of the ultrasonic diagnostic device shown in FIG. 1.
- FIG. 3A is an explanatory view for explaining a scanning line V_i in a normal direction and a scanning line H_i in a steering direction in the ultrasonic diagnostic device shown in FIG. 1, FIG. 3B is an explanatory view of a tissue image corresponding to the scanning line V_i in the normal direction, and FIG. 3C is an explanatory view of a needle image corresponding to the scanning line H_i in the steering direction.
- FIG. 4 is an example of a tissue image, which is generated by the ultrasonic diagnostic device shown in FIG. 1 and in which a needle direction L estimated from the needle image is visualized.
- FIG. 5 is an example of a tissue image in which a search region F, which is set based on the needle direction L in the tissue image shown in FIG. 4, is visualized.
- FIG. 6 is an enlarged extraction image of a region W shown in FIG. 5.
- FIG. 7 is an example when a needle tip N, which is a point image in FIG. 6, is visualized.
- FIG. 8 is an example when a needle tip N, a needle tip region NF, a needle body NB, and a search region F are visualized in the tissue image generated by the ultrasonic diagnostic device shown in FIG. 1.
- FIG. 9 is an example when the brightness value inside or outside the needle tip region NF in the tissue image shown in FIG. 8 is changed.
- FIG. 10 is an example when a translucent mask is applied onto the inside or the outside of the needle tip region NF in the tissue image shown in FIG. 8.
- FIG. 11 is an explanatory view when performing transmission focus processing in the normal direction and reception focus processing in the needle direction in the ultrasonic diagnostic device shown in FIG. 1.
- FIG. 12 is an explanatory view when selecting a needle image for estimating the needle direction from a plurality of needle images with different steering directions.
- FIG. 13 is a schematic diagram showing an example of the needle tip pattern.
- FIG. 14 is an explanatory view when searching for the needle tip based on the needle tip pattern.
- FIG. 15A is an example of a tissue image captured before the movement of the needle tip when capturing a plurality of tissue images with the movement of the needle tip, and FIG. 15B is an example of a tissue image captured after the movement of the needle tip.
- FIG. 16A is a diagram showing that the specular reflection of a needle by the ultrasonic beam in the normal direction deviates from the reception opening in the subject into which the needle is inserted, and FIG. 16B is a diagram showing that an ultrasonic echo based on reflection from the needle can be received by transmitting the ultrasonic beam in a state in which the ultrasonic beam is steered in the needle direction in the subject into which the needle is inserted.
- Hereinafter, an ultrasonic diagnostic device and an ultrasonic image generation method according to the present invention will be described in detail with reference to the accompanying diagrams.
-
FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic device according to an embodiment of the present invention. - The ultrasonic diagnostic device includes an
ultrasonic probe 1, and atransmission circuit 2 and areception circuit 3 are connected to theultrasonic probe 1. A tissue image generation unit 4 and a needleimage generation unit 5 are connected in parallel to thereception circuit 3. A needletip visualizing unit 9 is connected to the tissue image generation unit 4, and adisplay unit 11 is connected to the needletip visualizing unit 9 through adisplay control unit 10. A needledirection estimation unit 6 is connected to the needleimage generation unit 5, a needletip search unit 8 is connected to the needledirection estimation unit 6 through a search region setting unit 7, and the needletip search unit 8 is connected to the needletip visualizing unit 9. The search region setting unit 7 is connected to the tissue image generation unit 4. - A
control unit 12 is connected to thetransmission circuit 2, thereception circuit 3, the tissue image generation unit 4, the needleimage generation unit 5, the needletip visualizing unit 9, the needledirection estimation unit 6, the search region setting unit 7, the needletip search unit 8, and thedisplay control unit 10. Anoperation unit 13 and astorage unit 14 are connected to thecontrol unit 12. - The tissue image generation unit 4 includes a
phasing addition section 15A, adetection processing section 16A, a digital scan converter (DSC) 17A, and animage processing section 18A, which are connected sequentially from thereception circuit 3, and animage memory 19A connected to theDSC 17A. - Similarly, the needle
image generation unit 5 includes aphasing addition section 15B, adetection processing section 16B, a digital scan converter (DSC) 17B, and animage processing section 18B, which are connected sequentially from thereception circuit 3, and animage memory 19B connected to theDSC 17B. - The
ultrasonic probe 1 includes a plurality of elements arranged in a one-dimensional or two-dimensional array, and transmits an ultrasonic beam (transmission wave) based on a transmission signal supplied from thetransmission circuit 2, receives an ultrasonic echo (reception wave) from the subject, and outputs a reception signal. For example, each element that forms theultrasonic probe 1 is formed by a transducer in which electrodes are formed at both ends of the piezoelectric body formed of piezoelectric ceramic represented by lead zirconate titanate (PZT), a polymer piezoelectric element represented by polyvinylidene fluoride (PVDF), piezoelectric single crystal represented by lead magnesium niobate-lead titanate (PMN-PT), or the like. - When a pulsed or continuous-wave transmission signal voltage is applied to the electrodes of the transducer, the piezoelectric body expands and contracts to generate pulsed or continuous-wave ultrasonic waves from each transducer. By combination of these ultrasonic waves, an ultrasonic beam is formed. In addition, the respective transducers expand and contract by receiving the propagating ultrasonic waves, thereby generating electrical signals. These electrical signals are output as reception signals of the ultrasonic waves.
- The
transmission circuit 2 includes a plurality of pulsers, for example. The transmission circuit 2 performs transmission focus processing so that the ultrasonic waves transmitted from the plurality of elements of the ultrasonic probe 1 form an ultrasonic beam based on the transmission delay pattern selected according to the control signal from the control unit 12, adjusts the amount of delay of each transmission signal, and supplies the adjusted signals to the plurality of elements. By adjusting the amount of delay of each transmission signal in the transmission circuit 2, the ultrasonic beam from the ultrasonic probe 1 can be steered at a predetermined angle with respect to the normal direction of the ultrasonic wave transmitting and receiving surface. - The
reception circuit 3 performs amplification and A/D conversion of the analog reception signals output from the plurality of elements of the ultrasonic probe 1, and outputs digital reception signals to the phasing addition section 15A of the tissue image generation unit 4, to the phasing addition section 15B of the needle image generation unit 5, or to both, in response to the instruction from the control unit 12. - The
phasing addition section 15A of the tissue image generation unit 4 acquires the digital reception signals from the reception circuit 3 in response to the instruction from the control unit 12, and performs reception focus processing by delaying the reception signals based on the reception delay pattern from the control unit 12 and adding the delayed reception signals. By the reception focus processing, reception data (sound ray signal) based on the ultrasonic echo from the target tissue is generated. - The
detection processing section 16A generates a B-mode image signal, which is tomographic image information regarding a tissue within the subject, by correcting the attenuation due to the distance according to the depth of the reflection position of the ultrasonic wave and then performing envelope detection processing for the reception data. - The
DSC 17A converts the B-mode image signal generated by the detection processing section 16A into an image signal according to the normal television signal scanning method (raster conversion). This conversion makes it possible to grasp, on the B-mode image, the positional relationships and distances corresponding to the actual tissue of the subject. - The
image processing section 18A generates a B-mode image signal of the tissue image by performing various kinds of required image processing, such as gradation processing, on the B-mode image signal input from the DSC 17A. - The
phasing addition section 15B of the needle image generation unit 5 acquires the digital reception signals from the reception circuit 3 in response to the instruction from the control unit 12, and performs reception focus processing by delaying the reception signals based on the reception delay pattern from the control unit 12 and adding the delayed reception signals. By adjusting the amount of delay of each reception signal, the phasing addition section 15B generates reception data (sound ray signal) based on the ultrasonic echo from the needle, received from a direction steered at a predetermined angle with respect to the normal direction of the ultrasonic wave transmitting and receiving surface. - Similar to the
detection processing section 16A, the detection processing section 16B generates a B-mode image signal, which is tomographic image information regarding a tissue within the subject, by correcting the attenuation due to the distance according to the depth of the reflection position of the ultrasonic wave and then performing envelope detection processing for the reception data. - Similar to the
DSC 17A, the DSC 17B converts the B-mode image signal generated by the detection processing section 16B into an image signal according to the normal television signal scanning method (raster conversion). This conversion likewise makes it possible to grasp, on the B-mode image, the positional relationships and distances corresponding to the actual tissue of the subject. - The
image processing section 18B generates a B-mode image signal of the needle image from the B-mode image signal input from the DSC 17B. - The needle
direction estimation unit 6 estimates a needle direction, which indicates a direction in which the needle inserted into the subject is present, from the B-mode image signal of the needle image output from the image processing section 18B, and generates needle direction information indicating the position of the needle direction. - The search region setting unit 7 acquires the needle direction information from the needle
direction estimation unit 6 and acquires the B-mode image signal of the tissue image from the image processing section 18A of the tissue image generation unit 4, visualizes a needle direction on the tissue image based on the needle direction information, and sets a search region for searching for the needle tip based on the needle direction on the tissue image. For example, a region that extends to both sides of the needle direction with a predetermined width may be set as a search region. - The needle
tip search unit 8 generates the position information of the needle tip by searching for the needle tip in the search region, which is set by the search region setting unit 7, in the tissue image in which the needle direction and the search region are set. - The needle
tip visualizing unit 9 acquires the position information of the needle tip from the needle tip search unit 8 and acquires the B-mode image signal of the tissue image from the image processing section 18A of the tissue image generation unit 4, and visualizes the needle tip on the tissue image. - Instead of visualizing the needle tip in the tissue image, for example, the needle
tip visualizing unit 9 may visualize a needle direction from the needle tip to the base of the needle based on the needle direction information, or may visualize a search region based on the information of the search region. - The
display control unit 10 acquires a B-mode image signal of the tissue image in which the needle tip is visualized by the needle tip visualizing unit 9, and displays the tissue image in which the needle tip is visualized on the display unit 11. - For example, the
display unit 11 includes a display device, such as an LCD, and displays a tissue image, which is an ultrasonic image, under the control of the display control unit 10. - The
control unit 12 controls each unit based on the instruction input from the operation unit 13 by the operator. As described above, the control unit 12 selects and outputs a transmission delay pattern for the transmission circuit 2 or a reception delay pattern for the reception circuit 3, and outputs instructions on phasing addition or on the correction of attenuation and envelope detection processing, based on the reception delay pattern or the transmission delay pattern, to the phasing addition section 15A or the detection processing section 16A of the tissue image generation unit 4, or to the phasing addition section 15B or the detection processing section 16B of the needle image generation unit 5. - The
operation unit 13 is used when the operator performs an input operation, and can be formed by a keyboard, a mouse, a trackball, a touch panel, and the like. - Various kinds of information input from the
operation unit 13, information based on the above-described transmission delay pattern or reception delay pattern, information regarding the sound speed in an inspection target region of the subject, the focal position of the ultrasonic beam, and the transmission opening and the reception opening of the ultrasonic probe 1, an operation program required for the control of each unit, and the like are stored in the storage unit 14. Recording media, such as a hard disk, a flexible disk, an MO, an MT, a RAM, a CD-ROM, and a DVD-ROM, can be used as the storage unit 14. - Next, the operation of the ultrasonic diagnostic device according to an embodiment of the present invention to generate an ultrasonic image, in which a target tissue to be observed by the user is clearly imaged and the needle tip of the inserted needle is visualized, will be described.
-
FIG. 2 is a flowchart showing the operation of an embodiment. - First, in step S1, i=1 is set in a scanning line V_i (i=1 to n) in a normal direction with respect to an ultrasonic wave transmitting and receiving surface S of the
ultrasonic probe 1 and a scanning line H_i (i=1 to n) in a steering direction that is steered by a predetermined angle θ in the needle direction from the normal direction of the ultrasonic wave transmitting and receiving surface S, which are shown in FIG. 3A. Here, i is the order of the scanning line of the ultrasonic probe 1, and the ultrasonic probe 1 acquires a reception signal corresponding to each scanning line. - Then, in step S2, corresponding to the scanning line V_1 in the normal direction, the
ultrasonic probe 1 acquires a reception signal corresponding to the scanning line V_1 in the normal direction by transmitting the ultrasonic beam toward the target tissue T in the normal direction of the ultrasonic wave transmitting and receiving surface S and receiving the ultrasonic echo from the normal direction of the ultrasonic wave transmitting and receiving surface S, and the tissue image generation unit 4 generates a tissue image corresponding to the normal direction scanning line V_1 shown in FIG. 3B and stores the tissue image in the image memory 19A. - Then, in step S3, corresponding to the scanning line H_1 in the steering direction, the
ultrasonic probe 1 acquires a reception signal corresponding to the scanning line H_1 in the steering direction by transmitting the ultrasonic beam in the steering direction, which is steered by the predetermined angle θ toward the needle direction from the normal direction of the ultrasonic wave transmitting and receiving surface S, and receiving the ultrasonic echo from the steering direction, and the needle image generation unit 5 generates a needle image corresponding to the steering direction scanning line H_1 shown in FIG. 3C and stores the needle image in the image memory 19B. The predetermined angle θ may be a fixed value set in advance, or may be acquired from a device (not shown) for calculating the angle formed between the normal direction of the probe and the insertion direction of the needle. Alternatively, a direction in which a strong signal is returned after transmitting and receiving signals in a plurality of directions in advance may be set as the predetermined angle. - Thus, when a B-mode image signal of the tissue image corresponding to the first normal direction scanning line V_1 and a B-mode image signal of the needle image corresponding to the first steering direction scanning line H_1 are stored in the
image memories 19A and 19B, it is determined in step S4 whether or not scanning has been completed for all of the scanning lines of the ultrasonic probe 1.
- In this manner, when B-mode image signals of tissue images for all of the “n” scanning lines V_1 to V_n and B-mode image signals of needle images for all of the “n” scanning lines H_1 to H_n are generated, the process proceeds to step S6 from step S4.
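The acquisition loop of steps S1 to S5 can be sketched as follows; `transmit_receive` is a hypothetical stand-in for the probe and transmission/reception circuits, not a function named in the embodiment:

```python
def interleaved_scan(n, transmit_receive):
    """Steps S1 to S5: for each scanning-line index i (1..n), acquire one
    normal-direction line V_i for the tissue image (step S2) and one
    steered line H_i for the needle image (step S3), then advance i
    (step S5) until the step S4 check finds all n lines are done."""
    tissue_lines, needle_lines = [], []
    for i in range(1, n + 1):
        tissue_lines.append(transmit_receive('V', i))  # step S2
        needle_lines.append(transmit_receive('H', i))  # step S3
    return tissue_lines, needle_lines                  # proceed to step S6

# toy stand-in for the hardware: returns a labelled sample per scanning line
tissue, needle = interleaved_scan(3, lambda direction, i: (direction, i))
```

Each pass of the loop alternates one normal-direction line with one steered line, so the tissue image and the needle image are built up in step with each other.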
- In step S6, the needle
direction estimation unit 6 estimates a needle direction L based on the B-mode image signal obtained by performing image processing by scan-converting the needle image stored in the image memory 19B. For example, the needle direction is estimated by calculating the brightness distribution in the entire needle image, or in a predetermined region in which it is assumed that the needle is included, detecting a straight line in that image or region by the Hough transform, setting the detected straight line as the needle direction, and setting the position information of the needle direction as the needle direction information. When a straight line is detected by the Hough transform, the brightness value may be multiplied as a weighting factor when converting each pixel to its curve in the ρθ coordinate system and superimposing the curves on each other. Through this method, a high-brightness straight line, such as a needle, can be easily detected. The needle direction information of the needle direction L estimated by the needle direction estimation unit 6 is output to the search region setting unit 7.
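The brightness-weighted Hough accumulation of step S6 can be sketched as follows; the synthetic needle image, the angle grid, and the one-pixel rho bins are illustrative assumptions, not parameters from the embodiment:

```python
import numpy as np

def weighted_hough_line(img, n_theta=90):
    """Brightness-weighted Hough transform: each pixel votes for every
    line rho = x*cos(theta) + y*sin(theta) passing through it, with the
    pixel brightness as the vote weight, so a bright straight needle
    dominates the accumulator."""
    h, w = img.shape
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(h, w)))
    acc = np.zeros((2 * diag, n_theta))
    ys, xs = np.nonzero(img > 0)
    for y, x in zip(ys, xs):
        rhos = (x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += img[y, x]   # weighted vote
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return r - diag, thetas[t]   # (rho, theta) of the brightest line

# synthetic needle image: a bright 45-degree line on a dark background
img = np.zeros((64, 64))
for k in range(64):
    img[k, k] = 1.0
rho, theta = weighted_hough_line(img)
```

For the diagonal y = x the peak lands at rho ≈ 0 and theta ≈ 3π/4, i.e. the parameters of the line through the bright pixels, which would then be taken as the needle direction L.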
image memory 19B, based on the needle direction information output from the needledirection estimation unit 6 as shown inFIG. 4 , and sets a search region F extending from the needle direction L of the tissue image to both sides of the needle direction L with a predetermined width r as shown inFIG. 5 . The B-mode image signal of the tissue image in which the needle direction L and the search region F are set is output to the needletip search unit 8. For example, the predetermined width r may be set to three to five times the width of the needle based on the width of the needle inserted into the body. - In step S8, the needle
tip search unit 8 calculates the brightness distribution of the tissue image, and determines a maximum brightness point B in the search region F as the needle tip, as shown in FIG. 6, which is an enlarged view of the region W in FIG. 5. Alternatively, the needle tip search unit 8 may hold a needle tip pattern, such as an image of the needle tip, in advance, take a correlation with the needle tip pattern in the tissue image in the search region F, and determine the point at which the correlation is maximum as the needle tip. The position information of the needle tip found by the needle tip search unit 8 is output to the needle tip visualizing unit 9. In addition, the needle tip search unit 8 may output the needle direction information or the information of the search region together with the position information of the needle tip. - In step S9, as shown in
FIG. 7, the needle tip visualizing unit 9 visualizes a needle tip N, which is a point image having a predetermined size, in the tissue image from the position information of the needle tip found by the needle tip search unit 8. The B-mode image signal of the tissue image in which the needle tip is visualized is output to the display control unit 10, and is displayed as a tissue image in which the needle tip is visualized on the display unit 11.
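The search region of step S7 and the maximum-brightness search of step S8 can be sketched together: a band of half-width r around the estimated line L is built as a mask, and the brightest pixel inside it is taken as the tip. The synthetic image and the parameter values are illustrative assumptions:

```python
import numpy as np

def search_region_mask(shape, rho, theta, r):
    """Search region F: pixels within distance r of the estimated needle
    direction L, expressed as the line rho = x*cos(theta) + y*sin(theta)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    dist = np.abs(xs * np.cos(theta) + ys * np.sin(theta) - rho)
    return dist <= r

def find_needle_tip(tissue_img, region_mask):
    """Maximum brightness point B inside F; pixels outside F are set to
    -inf so they can never win the argmax."""
    masked = np.where(region_mask, tissue_img, -np.inf)
    return np.unravel_index(np.argmax(masked), masked.shape)

img = np.zeros((32, 32))
img[20, 20] = 0.9          # candidate on the diagonal needle path
img[5, 30] = 1.0           # brighter speckle, but far from the line
mask = search_region_mask(img.shape, 0.0, 3 * np.pi / 4, 2.0)
tip = find_needle_tip(img, mask)
```

Restricting the argmax to F is what keeps the even brighter off-line speckle from being mistaken for the tip.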
- Not only does the needle
tip visualizing unit 9 visualize the needle tip N in the tissue image, but it is also possible to adopt various display methods for making the needle tip clear in the tissue image. For example, as shown in FIG. 8, a circular frame showing a needle tip region NF that extends by a predetermined radius from the position of the needle tip may be displayed, the search region F may be displayed based on the information of the above-described search region, or a needle body NB, obtained by visualizing the needle direction L from the needle tip N or by connecting the needle tip N to the base portion of the needle direction L in a straight line, may be visualized based on the needle direction L. - The needle
tip visualizing unit 9 may change the brightness value or the color of the tissue image inside or outside the needle tip region NF surrounded by the circular frame as shown in FIG. 9, or may apply a translucent mask onto the tissue image inside or outside the needle tip region NF surrounded by the circular frame as shown in FIG. 10.
- By emphasizing the needle tip by visualizing the needle tip in the tissue image as described above, the needle tip can be easily visually recognized in the tissue image. Therefore, it is possible to clearly grasp the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue.
- In the ultrasonic diagnostic device according to the embodiment described above, when generating a needle image, transmission focus processing is performed by steering the ultrasonic beam by a predetermined angle in the needle direction and reception focus processing is performed by steering the ultrasonic echo by the predetermined angle θ in the needle direction, thereby generating a needle image. For example, as shown in
FIG. 11 , a tissue image can be generated by performing transmission focus processing on the ultrasonic wave toward a predetermined focal point in the normal direction of the ultrasonic wave receiving surface and performing reception focus processing on the ultrasonic echo from the target tissue in the normal direction of the ultrasonic wave transmitting and receiving surface, and a needle image can be generated by performing reception focus processing on the ultrasonic echo from the needle in the R direction indicated by the dotted arrow. - According to the reception focus processing in the modification example 1, in addition to the effect of the embodiment described above, it is possible to improve the refresh rate of the displayed image since the tissue image and the needle image can be generated at the same time by one transmission of the ultrasonic wave.
- In the ultrasonic diagnostic device according to the embodiment described above, the needle direction is estimated based on one needle image. However, for example, a plurality of needle images with different steering directions may be generated by changing the steering direction for steering at least one of the direction of transmission focus processing, which is the transmission direction of the ultrasonic beam, and the direction of reception focus processing of the ultrasonic echo, the sharpest needle image among the plurality of needle images may be selected, and the above-described needle direction may be estimated based on the selected sharpest needle image.
- The needle
direction estimation unit 6 acquires a plurality of needle images with different steering directions from the needle image generation unit 5, and selects the needle image in which the needle is visualized best, as shown in FIG. 12. For this selection, the brightness distribution in the entire needle image, or in a predetermined region in which it is assumed that the needle is included, may be calculated for each needle image, and, for example, the needle image including the point of the highest brightness value or the needle image having the maximum average brightness value may be selected. As described above, it is possible to estimate the needle direction by selecting the needle image in which the needle is visualized best. That is, a direction perpendicular to the steering direction in which the needle is visualized best can be estimated to be the needle direction.
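The selection of the best-visualized needle image among the steering angles can be sketched as follows; the stand-in images and both scoring criteria are illustrative:

```python
import numpy as np

def best_steering_index(needle_images, criterion="peak"):
    """Index of the needle image in which the needle shows best:
    'peak' scores each image by its maximum brightness value,
    'mean' scores it by its average brightness value."""
    if criterion == "peak":
        scores = [img.max() for img in needle_images]
    else:
        scores = [img.mean() for img in needle_images]
    return int(np.argmax(scores))

# three steering angles; the middle one returns the strongest needle echo
imgs = [np.full((8, 8), 0.1), np.full((8, 8), 0.1), np.full((8, 8), 0.1)]
imgs[1][4, 4] = 0.9
best = best_steering_index(imgs, "peak")
```

Once the winning index is known, the needle direction follows from the corresponding steering angle as described above.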
- The predetermined region in which it is assumed that a needle is included is assumed from the approximate angle of the insertion, for example.
- The needle
direction estimation unit 6 estimates the needle direction based on the selected needle image. Alternatively, the needle direction estimation unit 6 may calculate the brightness distribution in the entire needle image, or in a predetermined region in which it is assumed that the needle is included, using all of the plurality of needle images with different steering directions, detect a straight line based on that brightness distribution by the Hough transform or the like, and set the straight line as the needle direction. - In the ultrasonic diagnostic device according to the embodiment described above, as shown in
FIGS. 6 and 7, the needle tip search unit 8 determines the maximum brightness point B in the search region F as the needle tip. However, the needle tip search unit 8 may hold a needle tip pattern in advance and search for the needle tip based on the needle tip pattern. One example of a needle tip pattern, shown in FIG. 13, is an image of a line segment of predetermined length d that connects the needle tip to the end of the cut surface of the needle and has a high-brightness point at each end of the segment. - A portion at which the reflection angle changes, such as the tip of the needle or the end of the cut surface of the needle, tends to appear with high brightness in a tissue image. Therefore, for example, the needle
tip search unit 8 may hold the above-described needle tip pattern. Then, as shown in FIG. 14, the needle tip search unit 8 may search for a high-brightness point B1 and a high-brightness point B2 that are the most correlated with the needle tip pattern in the search region F, and determine the high-brightness point B1, the one of the two located deeper in the subject, as the needle tip. - In the ultrasonic diagnostic device according to the embodiment described above, when the needle moves in the subject and a plurality of tissue images are captured at least before and after the movement, or when tissue images of a plurality of frames are captured as a moving image as the needle moves, the needle
tip search unit 8 may search for the needle tip by comparing the tissue image before the movement with the tissue image after the movement. - For example, the needle
tip search unit 8 may calculate the brightness distribution in each of the tissue image before movement and the tissue image after movement and search for the needle tip based on the change in the brightness value. By comparing FIG. 15A, the tissue image before movement, with FIG. 15B, the tissue image after movement, a point P2 where the brightness value suddenly becomes large in FIG. 15B may be determined as the needle tip, or a point P1 where the brightness value suddenly becomes small may be determined as the needle tip. In addition, the point P2 may be determined as the needle tip based on the fact that the point P2, where the brightness value suddenly becomes large, and the point P1, where the brightness value suddenly becomes small, are adjacent to each other. A needle tip pattern image of the brightness change, including the point P2 where the brightness value suddenly becomes large and the point P1 where the brightness value suddenly becomes small, may also be prepared in advance, and the point most correlated with this pattern in the search region F of the brightness-change image between the tissue images before and after movement may be searched for and determined to be the needle tip. - Instead of the brightness value change described above, for example, the needle
tip search unit 8 may compare the tissue image before movement with the tissue image after movement, calculate the amount of movement and the movement direction between the images at each point in a predetermined region including the needle tip by the two-dimensional correlation operation or the like, and determine a point of the largest amount of movement or a point of the largest spatial change in the amount of movement or the movement direction as the needle tip. - For example, the needle
tip search unit 8 may compare the tissue image before movement with the tissue image after movement, calculate a change before and after movement in the image pattern near each point in a predetermined region including the needle tip by the two-dimensional correlation operation or the like, and determine a point of the largest image pattern change or a point of the largest spatial change of the image pattern change as the needle tip. - In the ultrasonic diagnostic device according to the embodiment described above, the needle
image generation unit 5 generates a needle image, and the needle direction estimation unit 6 estimates a needle direction based on the needle image. However, it is possible to estimate the needle direction even if a needle image is not generated. For example, the needle direction may be estimated based on the reception signal from each element of the ultrasonic probe 1, or based on the reception data (sound ray signal) after phasing addition.
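The frame-comparison search described above, in which the needle tip is located from the sudden brightness change between the tissue images before and after movement, can be sketched as a frame difference; the synthetic frames are illustrative:

```python
import numpy as np

def brightness_change_points(before, after):
    """From the frame difference after - before, P2 is the point where
    the brightness suddenly becomes large (tip position after movement)
    and P1 the point where it suddenly becomes small (tip position
    before movement)."""
    diff = after - before
    p2 = np.unravel_index(np.argmax(diff), diff.shape)
    p1 = np.unravel_index(np.argmin(diff), diff.shape)
    return p1, p2

before = np.zeros((16, 16)); before[8, 8] = 1.0   # bright tip at (8, 8)
after = np.zeros((16, 16));  after[8, 11] = 1.0   # tip moved to (8, 11)
p1, p2 = brightness_change_points(before, after)
```

P2 gives the new tip position directly; the adjacency of P1 and P2 can serve as the consistency check mentioned above, since a moving tip should leave a paired disappearance and appearance close together.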
- While the ultrasonic diagnostic device and the ultrasonic image generation method of the present invention have been described in detail, the present invention is not limited to the embodiments described above, and various modifications or changes may be made without departing from the scope and spirit of the present invention.
- 1: ultrasonic probe
- 2: transmission circuit
- 3: reception circuit
- 4: tissue image generation unit
- 5: needle image generation unit
- 6: needle direction estimation unit
- 7: search region setting unit
- 8: needle tip search unit
- 9: needle tip visualizing unit
- 10: display control unit
- 11: display unit
- 12: control unit
- 13: operation unit
- 14: storage unit
- 15A, 15B: phasing addition section
- 16A, 16B: detection processing section
- 17A, 17B: DSC
- 18A, 18B: image processing section
- 19A, 19B: image memory
- V_i: normal direction scanning line
- H_i: steering direction scanning line
- L: needle direction
- r: predetermined width
- F: search region
- W: region
- B: maximum brightness point
- N: needle tip
- NF: needle tip region
- NB: needle body
- d: predetermined length
- B1, B2: high-brightness point
- θ: steering angle
Claims (13)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/681,492 US20220175343A1 (en) | 2013-08-30 | 2022-02-25 | Ultrasonic diagnostic device and ultrasonic image generation method |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013179830 | 2013-08-30 | ||
JP2013-179830 | 2013-08-30 | ||
PCT/JP2014/062064 WO2015029499A1 (en) | 2013-08-30 | 2014-05-01 | Ultrasonic diagnostic device and ultrasonic image generation method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/062064 Continuation WO2015029499A1 (en) | 2013-08-30 | 2014-05-01 | Ultrasonic diagnostic device and ultrasonic image generation method |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/681,492 Division US20220175343A1 (en) | 2013-08-30 | 2022-02-25 | Ultrasonic diagnostic device and ultrasonic image generation method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160174932A1 true US20160174932A1 (en) | 2016-06-23 |
Family
ID=52586081
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/055,143 Pending US20160174932A1 (en) | 2013-08-30 | 2016-02-26 | Ultrasonic diagnostic device and ultrasonic image generation method |
US17/681,492 Pending US20220175343A1 (en) | 2013-08-30 | 2022-02-25 | Ultrasonic diagnostic device and ultrasonic image generation method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/681,492 Pending US20220175343A1 (en) | 2013-08-30 | 2022-02-25 | Ultrasonic diagnostic device and ultrasonic image generation method |
Country Status (4)
Country | Link |
---|---|
US (2) | US20160174932A1 (en) |
JP (1) | JP6097258B2 (en) |
CN (1) | CN105491955B (en) |
WO (1) | WO2015029499A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3278738A4 (en) * | 2015-04-03 | 2018-05-02 | Fujifilm Corporation | Acoustic wave image generation device and method |
JP2020506005A (en) * | 2017-02-14 | 2020-02-27 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Path Tracking in Ultrasound System for Device Tracking |
US10646198B2 (en) * | 2015-05-17 | 2020-05-12 | Lightlab Imaging, Inc. | Intravascular imaging and guide catheter detection methods and systems |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6589619B2 (en) * | 2015-01-09 | 2019-10-16 | コニカミノルタ株式会社 | Ultrasonic diagnostic equipment |
JP6746895B2 (en) * | 2015-11-06 | 2020-08-26 | コニカミノルタ株式会社 | Ultrasonic diagnostic device and ultrasonic signal processing method |
JP6668817B2 (en) * | 2016-02-26 | 2020-03-18 | コニカミノルタ株式会社 | Ultrasound diagnostic apparatus and control program |
WO2020038766A1 (en) * | 2018-08-22 | 2020-02-27 | Koninklijke Philips N.V. | System, device and method for constraining sensor tracking estimates in interventional acoustic imaging |
CN110251210B (en) * | 2019-05-28 | 2021-01-01 | 聚融医疗科技(杭州)有限公司 | Puncture enhancement method and device based on block RHT |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6048312A (en) * | 1998-04-23 | 2000-04-11 | Ishrak; Syed Omar | Method and apparatus for three-dimensional ultrasound imaging of biopsy needle |
JP2004208859A (en) * | 2002-12-27 | 2004-07-29 | Toshiba Corp | Ultrasonic diagnostic equipment |
JP2006346477A (en) * | 2006-08-21 | 2006-12-28 | Olympus Corp | Ultrasonic diagnostic apparatus |
US20070270687A1 (en) * | 2004-01-13 | 2007-11-22 | Gardi Lori A | Ultrasound Imaging System and Methods Of Imaging Using the Same |
US20100056917A1 (en) * | 2008-08-26 | 2010-03-04 | Fujifilm Corporation | Ultrasonic diagnostic apparatus |
US20100298705A1 (en) * | 2009-05-20 | 2010-11-25 | Laurent Pelissier | Freehand ultrasound imaging systems and methods for guiding fine elongate instruments |
US20120078103A1 (en) * | 2010-09-28 | 2012-03-29 | Fujifilm Corporation | Ultrasound diagnostic system, ultrasound image generation apparatus, and ultrasound image generation method |
US20120253181A1 (en) * | 2011-04-01 | 2012-10-04 | Toshiba Medical Systems Corporation | Ultrasound diagnosis apparatus and controlling method |
US8343054B1 (en) * | 2010-09-30 | 2013-01-01 | Hitachi Aloka Medical, Ltd. | Methods and apparatus for ultrasound imaging |
US20140031673A1 (en) * | 2012-07-26 | 2014-01-30 | Ge Medical Systems Global Technology Company, Llc | Ultrasonic diagnostic apparatus and control program thereof |
US20150320386A9 (en) * | 2013-06-27 | 2015-11-12 | Ge Medical Systems Global Technology Company, Llc | Ultrasonic diagnostic device and control program for the same |
US9642592B2 (en) * | 2013-01-03 | 2017-05-09 | Siemens Medical Solutions Usa, Inc. | Needle enhancement in diagnostic ultrasound imaging |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6951542B2 (en) * | 2002-06-26 | 2005-10-04 | Esaote S.P.A. | Method and apparatus for ultrasound imaging of a biopsy needle or the like during an ultrasound imaging examination |
JP5416900B2 (en) * | 2007-11-22 | 2014-02-12 | 株式会社東芝 | Ultrasonic diagnostic apparatus and puncture support control program |
CN101744639A (en) * | 2008-12-19 | 2010-06-23 | Ge医疗系统环球技术有限公司 | Ultrasonic imaging method and device |
JP5495593B2 (en) * | 2009-03-23 | 2014-05-21 | 株式会社東芝 | Ultrasonic diagnostic apparatus and puncture support control program |
EP2394582B9 (en) * | 2009-11-16 | 2013-04-10 | Olympus Medical Systems Corp. | Ultrasound observation apparatus |
US8861822B2 (en) * | 2010-04-07 | 2014-10-14 | Fujifilm Sonosite, Inc. | Systems and methods for enhanced imaging of objects within an image |
CN102869308B (en) * | 2010-05-03 | 2015-04-29 | 皇家飞利浦电子股份有限公司 | Apparatus and method for ultrasonic tracking of ultrasound transducer(s) aboard an interventional tool |
JP5486449B2 (en) * | 2010-09-28 | 2014-05-07 | 富士フイルム株式会社 | Ultrasonic image generating apparatus and method of operating ultrasonic image generating apparatus |
JP5645628B2 (en) * | 2010-12-09 | 2014-12-24 | 富士フイルム株式会社 | Ultrasonic diagnostic equipment |
EP2454996A1 (en) * | 2010-11-17 | 2012-05-23 | Samsung Medison Co., Ltd. | Providing an optimal ultrasound image for interventional treatment in a medical system |
JP5435751B2 (en) * | 2011-03-03 | 2014-03-05 | 富士フイルム株式会社 | Ultrasonic diagnostic apparatus, ultrasonic transmission / reception method, and ultrasonic transmission / reception program |
CN103732152B (en) * | 2012-06-25 | 2016-04-27 | 株式会社东芝 | Diagnostic ultrasound equipment and image processing method |
WO2015025183A1 (en) * | 2013-08-19 | 2015-02-26 | Ultrasonix Medical Corporation | Ultrasound imaging instrument visualization |
2014
- 2014-05-01 CN CN201480047268.9A patent/CN105491955B/en active Active
- 2014-05-01 WO PCT/JP2014/062064 patent/WO2015029499A1/en active Application Filing
- 2014-09-01 JP JP2014177374A patent/JP6097258B2/en active Active

2016
- 2016-02-26 US US15/055,143 patent/US20160174932A1/en active Pending

2022
- 2022-02-25 US US17/681,492 patent/US20220175343A1/en active Pending
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3278738A4 (en) * | 2015-04-03 | 2018-05-02 | Fujifilm Corporation | Acoustic wave image generation device and method |
US10646198B2 (en) * | 2015-05-17 | 2020-05-12 | Lightlab Imaging, Inc. | Intravascular imaging and guide catheter detection methods and systems |
US11850089B2 (en) | 2015-11-19 | 2023-12-26 | Lightlab Imaging, Inc. | Intravascular imaging and guide catheter detection methods and systems |
JP2020506005A (en) * | 2017-02-14 | 2020-02-27 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Path Tracking in Ultrasound System for Device Tracking |
US11357473B2 (en) | 2017-02-14 | 2022-06-14 | Koninklijke Philips N.V. | Path tracking in ultrasound system for device tracking |
Also Published As
Publication number | Publication date |
---|---|
JP6097258B2 (en) | 2017-03-15 |
US20220175343A1 (en) | 2022-06-09 |
CN105491955A (en) | 2016-04-13 |
WO2015029499A1 (en) | 2015-03-05 |
JP2015062668A (en) | 2015-04-09 |
CN105491955B (en) | 2018-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220175343A1 (en) | Ultrasonic diagnostic device and ultrasonic image generation method | |
US10588598B2 (en) | Ultrasonic inspection apparatus | |
US10687786B2 (en) | Ultrasound inspection apparatus, ultrasound inspection method and recording medium | |
US11439368B2 (en) | Acoustic wave processing device, signal processing method for acoustic wave processing device, and program | |
US11278262B2 (en) | Ultrasonic diagnostic device and ultrasonic image generation method | |
US11096665B2 (en) | Ultrasound diagnostic device, ultrasound diagnostic method, and ultrasound diagnostic program | |
US11666310B2 (en) | Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus using predetermined imaging conditions for B-mode image generation | |
US11116475B2 (en) | Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus | |
US20160157830A1 (en) | Ultrasonic diagnostic device and ultrasonic image generation method | |
US20140031687A1 (en) | Ultrasonic diagnostic apparatus | |
US10980515B2 (en) | Acoustic wave processing apparatus, signal processing method, and program for acoustic wave processing apparatus | |
US9907532B2 (en) | Ultrasound inspection apparatus, signal processing method for ultrasound inspection apparatus, and recording medium | |
US11812920B2 (en) | Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus | |
CN115103634A (en) | Ultrasonic diagnostic apparatus, method of controlling ultrasonic diagnostic apparatus, and processor for ultrasonic diagnostic apparatus | |
US10788459B2 (en) | Ultrasound diagnostic apparatus, ultrasound image generation method, and recording medium | |
US10792014B2 (en) | Ultrasound inspection apparatus, signal processing method for ultrasound inspection apparatus, and recording medium | |
US20150025382A1 (en) | Ultrasound diagnostic apparatus and method of producing ultrasound image | |
JP5829198B2 (en) | Ultrasonic inspection apparatus, signal processing method and program for ultrasonic inspection apparatus | |
US20130060142A1 (en) | Ultrasound diagnostic apparatus and method of producing ultrasound image | |
US20160139252A1 (en) | Ultrasound diagnostic device, method for generating acoustic ray signal of ultrasound diagnostic device, and program for generating acoustic ray signal of ultrasound diagnostic device | |
JP5836241B2 (en) | Ultrasonic inspection apparatus, signal processing method and program for ultrasonic inspection apparatus | |
JP2008048951A (en) | Ultrasonic diagnostic system | |
JP6275960B2 (en) | Image display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJIFILM CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KATSUYAMA, KIMITO;REEL/FRAME:037854/0137 Effective date: 20160106 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
STCV | Information on status: appeal procedure |
Free format text: NOTICE OF APPEAL FILED |
STCV | Information on status: appeal procedure |
Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
STCV | Information on status: appeal procedure |
Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
STCV | Information on status: appeal procedure |
Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |