WO2016098929A1 - Ultrasound imaging apparatus and control method thereof - Google Patents
Ultrasound imaging apparatus and control method thereof
- Publication number
- WO2016098929A1 (PCT application PCT/KR2014/012581)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- signal
- group
- sampling
- beamforming
- motion
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
- A61B8/5276—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
- A61B8/145—Echo-tomography characterised by scanning multiple planes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
- A61B8/4488—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/54—Control of the diagnostic device
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8909—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
- G01S15/8915—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8909—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
- G01S15/8915—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
- G01S15/8927—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array using simultaneously or sequentially two or more subarrays or subapertures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
- G01S7/52034—Data rate converters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52046—Techniques for image enhancement involving transmitter or receiver
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52085—Details related to the ultrasound signal acquisition, e.g. scan sequences
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10K—SOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
- G10K11/00—Methods or devices for transmitting, conducting or directing sound in general; Methods or devices for protecting against, or for damping, noise or other acoustic waves in general
- G10K11/18—Methods or devices for transmitting, conducting or directing sound
- G10K11/26—Sound-focusing or directing, e.g. scanning
- G10K11/34—Sound-focusing or directing, e.g. scanning using electrical steering of transducer arrays, e.g. beam steering
- G10K11/341—Circuits therefor
- G10K11/346—Circuits therefor using phase variation
Definitions
- the present invention relates to an ultrasonic imaging apparatus and a method of controlling the same, in which motion of a target portion is detected and the sampling of beamforming is adjusted according to that motion.
- the ultrasound imaging apparatus irradiates ultrasound signals from the body surface of a subject toward a desired part of the body and non-invasively acquires tomographic images of soft tissue or images of blood flow using information from the reflected ultrasound signals (ultrasound echo signals).
- compared to other imaging apparatuses such as X-ray imaging devices, X-ray CT scanners, magnetic resonance imaging (MRI) systems, and nuclear medicine devices, the ultrasound imaging apparatus is compact, inexpensive, and capable of real-time display. Since it involves no radiation exposure, it offers high safety and is widely used for diagnosis of the heart and abdomen, and in urology, obstetrics, and gynecology.
- the ultrasound imaging apparatus includes an ultrasound probe that transmits an ultrasound signal to the subject and receives the ultrasound echo signal reflected from the subject, in order to obtain an ultrasound image of the subject.
- the ultrasonic probe includes an acoustic module.
- the acoustic module may include a transducer in which a piezoelectric material vibrates to convert between electrical and acoustic signals, a matching layer that reduces the acoustic impedance difference between the transducer and the subject so that as much of the ultrasound generated by the transducer as possible is transmitted into the subject, a lens layer that focuses the ultrasound traveling in front of the transducer onto a specific point, and a sound-absorbing layer that blocks ultrasound from traveling behind the transducer, thereby preventing image distortion.
- the ultrasound imaging apparatus may perform beamforming to estimate the magnitude of reflections from a specific location in space from the plurality of channel data collected by the ultrasound probe.
- beamforming corrects the time differences of the echo signals input through the plurality of transducers and applies a predetermined weight to each input echo signal so as to emphasize the signal from a specific position or relatively attenuate signals from other positions.
- the ultrasound imaging apparatus may generate an ultrasound image suitable for identifying an internal structure of the object and display the ultrasound image to a user.
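As a minimal sketch of the delay-and-sum scheme just described (delay correction plus per-channel weighting), the following illustrates the idea in Python; the function name, the integer sample delays, and the weights are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples, weights):
    """Align each channel's echo by its delay, weight it, and sum.

    channel_data   : (n_channels, n_samples) echo signals, one row per transducer
    delays_samples : per-channel integer delays (in samples) that align echoes
                     arriving from the focal point
    weights        : per-channel apodization weights that emphasize the signal
                     at the focal position and attenuate signals from elsewhere
    """
    n_channels, n_samples = channel_data.shape
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        # shift the channel so its echo lines up with the others
        # (np.roll wraps around, which is acceptable for a short illustrative trace)
        aligned = np.roll(channel_data[ch], -delays_samples[ch])
        out += weights[ch] * aligned
    return out
```

When the delays match the true arrival-time differences, the echoes add coherently and the signal from the focal point is emphasized relative to off-axis signals.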
- a motion vector is detected for each group; for groups of the target site with more motion, the sampling frequency of the beamforming is increased, and for groups with less motion, it is decreased.
- An embodiment of the ultrasound imaging apparatus may include an ultrasound probe that transmits ultrasound to an object and receives the ultrasound reflected from the object, a beamforming unit that beamforms the received ultrasound and outputs a beamforming signal, a sampling unit that controls the sampling frequency of the beamforming signal differently according to the amount of motion of the object, and an image processor that matches and synthesizes the sampled signals.
- the apparatus may include a motion detector configured to divide the beamforming signal into groups and to compare the beamforming signals of one group with those of another group to calculate and store a motion vector.
- the motion vector may be calculated by the motion detector by comparing the beamforming signal of each divided group with the previous beamforming signal.
- if the motion vector of a divided group exceeds a preset value, the sampling unit adjusts the sampling period of the beamforming signal to be less than or equal to a preset period; otherwise, the sampling period of the beamforming signal may be adjusted to exceed the preset period.
- the sampling unit may adjust the sampling period and the sampling time point differently for groups with less motion.
- portions located at adjacent elevations may be divided into different groups.
- at time points where a group with much motion is sampled but a group with little motion is not, the image processor may substitute a signal interpolated using linear interpolation for the beamforming output signal.
- a control method of the ultrasound imaging apparatus may include transmitting ultrasound to an object, receiving the ultrasound reflected from the object, beamforming the received ultrasound to output a beamforming signal, sampling the beamforming signal by controlling its sampling frequency differently according to the motion of the object, and matching and synthesizing the sampled signals.
- Ultrasound images may be obtained by dividing a target portion into a plurality of groups, detecting the motion vectors of the divided groups, and then sampling the beamforming output signal with a different sampling period and sampling time point according to the motion. Therefore, when diagnosing a target part whose regions move differently, distortion of the ultrasound image caused by the motion of the target part can be reduced.
- FIG. 1 is a perspective view of an ultrasound diagnostic system to which an ultrasound imaging apparatus is applied, according to an exemplary embodiment.
- FIG. 2 is a block diagram illustrating a configuration of an ultrasound imaging apparatus, according to an exemplary embodiment.
- FIG. 3 is a cross-sectional view of a plurality of transducer arrays corresponding to a plurality of channels in an ultrasonic probe, according to an exemplary embodiment.
- FIG. 4 is a block diagram illustrating a configuration of a transmission beamformer.
- FIG. 5 is a block diagram illustrating a configuration of a reception beamformer.
- FIG. 6 is a conceptual diagram illustrating matching and synthesis of sampling signals by differently sampling beamforming output signals for each group according to an exemplary embodiment.
- FIG. 7A is a conceptual diagram of sampling a beamforming output signal to have a different period for each group, according to an embodiment.
- FIG. 7B is a conceptual diagram of interpolating the sampled beamforming output signal having different periods for each group, according to an embodiment.
- FIG. 7C is a conceptual diagram of synthesizing an interpolated sampling signal, according to an embodiment.
- FIG. 8A is a conceptual diagram of sampling a beamforming output signal to have different periods and sampling time points for each group, according to an embodiment.
- FIG. 8B is a conceptual diagram of interpolating the sampled beamforming output signal having different periods and sampling time points for each group, according to an embodiment.
- FIG. 8C is a conceptual diagram of synthesizing an interpolated sampling signal, according to an embodiment.
- FIG. 9A is a conceptual diagram of sampling a beamforming output signal such that groups at adjacent elevations have different periods, according to an embodiment.
- FIG. 9B is a conceptual diagram of interpolating the sampled beamforming output signal such that groups at adjacent elevations have different periods, according to an embodiment.
- FIG. 10A illustrates an example of an ultrasound image in a case in which there is no motion through the ultrasound imaging apparatus, according to an exemplary embodiment.
- FIG. 10B illustrates an example of an ultrasound image when there is motion in an axial direction through the ultrasound imaging apparatus, according to an exemplary embodiment.
- FIG. 10C illustrates an example of an ultrasound image interpolated when there is motion in an axial direction through an ultrasound imaging apparatus, according to an exemplary embodiment.
- FIG. 11 is a conceptual diagram illustrating a 3D volume and observation information according to an exemplary embodiment.
- FIG. 12 is a diagram illustrating an example of a configuration of a 3D mode processor, according to an exemplary embodiment.
- FIG. 13 is a conceptual diagram for describing creating a 3D volume according to an embodiment.
- FIG. 14 is a conceptual diagram illustrating a 3D volume conversion according to an embodiment.
- FIG. 15 is a conceptual diagram for describing rendering processed by an ultrasound imaging apparatus, according to an exemplary embodiment.
- FIG. 16 illustrates a method of dividing a target part into groups, detecting the motion vector of each group, adjusting the sampling of the beamforming output signal according to the detected motion vectors, and matching and synthesizing the sampling signals of each group, according to an exemplary embodiment.
- FIG. 1 illustrates the appearance of an ultrasound diagnostic system to which an ultrasound imaging apparatus is applied.
- the ultrasound diagnostic system 1 may include a main body 11, an ultrasonic probe 10, an input unit 17, a sub display unit 18, and a main display unit 19.
- the main body 11 may accommodate a transmission signal generator of the ultrasonic diagnostic system 1.
- the transmission signal generator may generate a transmission signal and transmit the transmission signal to the ultrasound probe 10.
- One or more female connectors 15 may be provided at one side of the main body 11.
- the female connector 15 may be physically coupled to a male connector 14 connected to the cable 13.
- the transmission signal generated by the transmission signal generator may be transmitted to the ultrasonic probe 10 via the male connector 14 and the cable 13 connected to the female connector 15 of the main body 11.
- a plurality of casters 16 for mobility of the ultrasound diagnosis system 1 may be provided below the main body 11.
- the plurality of casters 16 may fix the ultrasonic diagnostic system 1 at a specific place or move it in a specific direction.
- the ultrasound probe 10 is a part that contacts the body surface of the object and may transmit or receive ultrasound. Specifically, the ultrasound probe 10 converts the transmission signal provided from the main body 11 into an ultrasound signal, irradiates the converted ultrasound signal into the body of the object, receives the ultrasound echo signal reflected from a specific part of the object's body, and transmits it to the main body 11.
- one end of the ultrasonic probe 10 may be provided with a plurality of acoustic modules for generating ultrasonic waves in accordance with the electrical signal.
- the acoustic module may generate ultrasonic waves according to the applied AC power.
- the AC power may be supplied from a power supply device outside the acoustic module or a power storage device inside it.
- the transducer of the acoustic module may generate ultrasonic waves by vibrating according to the supplied AC power.
- a plurality of acoustic modules may be arranged in a matrix array, in a linear array, or along a convex curve (convex array).
- the acoustic modules may also form phased arrays or concave arrays.
- a cover for covering the acoustic module may be provided at an upper portion of the acoustic module.
- a cable 13 is connected to the other end of the ultrasonic probe 10, and a male connector 14 may be connected to the end of the cable 13.
- the male connector 14 may be physically coupled to the female connector 15 of the main body 11.
- the input unit 17 is a part capable of receiving commands related to the operation of the ultrasound diagnosis system 1. For example, through the input unit 17, a mode selection command such as A-mode (Amplitude mode), B-mode (Brightness mode), D-mode (Doppler mode), M-mode (Motion mode), or 3D mode, or an ultrasound diagnosis start command may be input.
- the command input through the input unit 17 may be transmitted to the main body 11 by wire or wireless communication.
- the input unit 17 may include at least one of a touch pad, a keyboard, a foot switch, and a foot pedal.
- the touch pad or keyboard may be implemented in hardware and may be positioned on the upper part of the main body 11.
- the keyboard may include at least one of a switch, a key, a wheel, a joystick, a trackball, and a knob.
- the keyboard may be implemented in software such as a graphical user interface. In this case, the keyboard may be displayed through the sub display unit 18 or the main display unit 19.
- a foot switch or a foot pedal may be provided below the main body 11, and an operator may control the operation of the ultrasound diagnosis system 1 using the foot pedal.
- Probe holder 12 for mounting the ultrasonic probe 10 may be provided around the input unit 17.
- the user may store the ultrasonic probe 10 on the probe holder 12. FIG. 1 illustrates a case in which one probe holder 12 is provided around the input unit 17, but the disclosed invention is not limited thereto; the position or number of probe holders 12 may be changed in various ways depending on the overall design of the ultrasound diagnostic system 1 or the design or position of some of its components.
- the sub display unit 18 may be provided in the main body 11. FIG. 1 illustrates a case in which the sub display unit 18 is provided above the input unit 17.
- the sub display unit 18 may be implemented by a cathode ray tube (CRT), a liquid crystal display (LCD), or the like.
- the sub display unit 18 may display menus or guides necessary for ultrasound diagnosis.
- the main display unit 19 may be provided in the main body 11. FIG. 1 illustrates a case where the main display unit 19 is provided above the sub display unit 18.
- the main display unit 19 may be implemented as a CRT or a liquid crystal display.
- the main display unit 19 may display the ultrasound image acquired during the ultrasound diagnosis process.
- the ultrasound image displayed through the main display unit 19 may include at least one of a 2D black and white ultrasound image, a 2D color ultrasound image, a 3D black and white ultrasound image, and a 3D color ultrasound image.
- FIG. 1 illustrates a case where both the sub display unit 18 and the main display unit 19 are provided in the ultrasound diagnosis system 1, but the sub display unit 18 may be omitted in some cases. In this case, an application or a menu displayed through the sub display unit 18 may be displayed through the main display unit 19.
- At least one of the sub display unit 18 and the main display unit 19 may be implemented to be detachable from the main body 11.
- FIG. 2 illustrates a configuration of an ultrasound imaging apparatus.
- the ultrasound imaging apparatus 2 may include a probe 10, a beamforming unit 100, a motion detector 200, a sampling unit 130, an image processor 300, a communication unit 400, a memory 500, a display unit 600, a control unit 700, and an input unit 800.
- the above-described components may be connected to each other through a bus 900.
- the ultrasound imaging apparatus 2 may be implemented not only in a cart type but also in a portable type.
- Examples of the portable ultrasound diagnostic apparatus may include, but are not limited to, a PACS viewer, a smart phone, a laptop computer, a PDA, a tablet PC, and the like.
- the probe 10 transmits an ultrasonic signal to the object 20 according to a driving signal applied from the beamformer 100 and receives an echo signal reflected from the object 20.
- the probe 10 includes a plurality of transducers, and the plurality of transducers vibrate according to an electrical signal transmitted and generate ultrasonic waves which are acoustic energy.
- the probe 10 may be connected to the main body of the ultrasound imaging apparatus 2 in a wired or wireless manner, and the ultrasound imaging apparatus 2 may include a plurality of probes 10 according to an implementation form.
- the transmission beamformer 110 supplies a driving signal to the probe 10, and may include a transmission signal generator 112, a time delay unit 114, and a pulser 116.
- the transmission signal generator 112 may generate pulses for forming the transmission ultrasound according to a predetermined pulse repetition frequency (PRF), and the time delay unit 114 may apply, to each pulse, a delay time for determining transmission directionality.
- Each pulse to which the delay time is applied may correspond to a plurality of piezoelectric vibrators included in the probe 10.
- the pulser 116 may apply a driving signal (or a driving pulse) to the probe 10 at a timing corresponding to each pulse to which a delay time is applied.
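The transmit chain above (pulse generation, per-element delay, pulser firing) relies on geometric focusing delays: elements farther from the focal point must fire earlier so that all wavefronts arrive at the focus together. A hedged sketch follows, where the function name, the element positions, and the assumed sound speed of 1540 m/s are illustrative assumptions:

```python
import numpy as np

def transmit_delays(element_x, focus, c=1540.0):
    """Per-element transmit delays (seconds) for focusing at `focus`.

    element_x : x-positions of the array elements (m)
    focus     : (x, z) focal point (m), z being depth
    c         : assumed speed of sound in tissue (m/s)
    """
    fx, fz = focus
    # time of flight from each element to the focal point
    tof = np.sqrt((element_x - fx) ** 2 + fz ** 2) / c
    # the farthest element fires first (delay 0); nearer elements wait
    return tof.max() - tof
```

For a symmetric aperture focused on-axis, the edge elements fire first and the center element receives the largest delay, since it is closest to the focus.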
- the reception beamformer 120 may generate ultrasonic data by processing the echo signals received from the probe 10, and may include an amplifier 122, an analog-to-digital converter (ADC) 124, a parallax corrector 126, and a focusing unit 128.
- the amplifier 122 amplifies the echo signal of each channel, the ADC 124 converts the amplified echo signals from analog to digital, the parallax corrector 126 applies, to the digitally converted echo signals, delay times for determining reception directionality, and the focusing unit 128 may generate ultrasonic data by summing the echo signals processed by the parallax corrector 126.
- the reception beamformer 120 may not include the amplifier 122 according to its implementation form. That is, when the sensitivity of the probe 10 is improved or the number of processing bits of the ADC 124 is improved, the amplifier 122 may be omitted.
- the motion detector 200 may receive the beamforming output signals, divide the received signals, and calculate a motion vector.
- the motion detector 200 may include a divider 210 and a motion vector calculator 220.
- the divider 210 may receive the beamforming output signal output from the focusing unit 128 of the reception beamformer and divide it into a plurality of groups.
- the divider 210 may divide the signal into a portion with much motion and a portion without, according to a previously detected motion vector, or may divide it into a preset number of groups or a number of divisions input by the user.
- the motion vector calculator 220 may calculate motion vectors for the plurality of divided beamforming output signals. For example, the motion vector calculator 220 may cross-correlate the beamforming signal of one group with the beamforming signals of the remaining groups to calculate a motion vector. In addition, the motion vector calculator 220 may calculate a motion vector by cross-correlating the currently output beamforming signal with the previously output beamforming signal. When a motion vector is calculated by cross-correlating beamforming signals, a lower value indicates more severe motion and a higher value indicates less motion. However, the calculation by the motion vector calculator 220 is not limited to comparison with the previous beamforming signal through cross-correlation.
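As a sketch of the comparison described above, where a lower cross-correlation value indicates more severe motion, one could use a normalized zero-lag cross-correlation between a group's current and previous beamforming signals; the function name and the normalization choice are assumptions, not the patent's exact formula:

```python
import numpy as np

def motion_score(current, previous):
    """Normalized cross-correlation of two beamforming signals.

    Returns a value near 1.0 when the signals are nearly identical
    (little motion) and a lower value when they differ (more motion).
    """
    a = current - current.mean()
    b = previous - previous.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 1.0  # two constant signals: treat as "no motion"
    return float((a * b).sum() / denom)
```

A group whose signal merely repeats frame to frame scores near 1.0, while a group whose echoes shift between frames scores lower, matching the rule that lower correlation means more motion.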
- the sampling unit 130 may change the sampling of the beamforming output signal according to the motion of the target portion.
- the sampling unit 130 may analyze the motion vector calculated by the motion vector calculator 220 and sample the beamforming output signal with a shorter sampling period for groups with much motion and a longer sampling period for groups with little motion.
- the image processor 300 may generate and display an ultrasound image through a scan conversion process on the ultrasound data generated by the sampling unit 130.
- the ultrasound image may be a gray-scale image obtained by scanning an object in A mode, B mode, or M mode, or a 3D image, and may also include a Doppler image representing a moving object using the Doppler effect.
- the Doppler image may include a blood-flow Doppler image (also referred to as a color Doppler image) representing blood flow, a tissue Doppler image representing tissue movement, and a spectral Doppler image displaying the moving speed of an object as a waveform.
- the image processor 300 may include a data processor 310 and an image generator 360.
- the data processor 310 may include an image matching processor 320, a 3D mode processor 350, a B mode processor 330, and a D mode processor 340.
- the image matching processor 320 may estimate, by linear interpolation, the signal at time points that were sampled in a high-motion group but not in a low-motion group, and may estimate values between the sampling signals of the plurality of groups by interpolating the intervals between them. In addition, the image matching processor 320 may substitute the estimated beamforming signals into the unsampled time regions to match a continuous signal. In addition, the image matching processor 320 may transmit the linearly interpolated beamforming signal to the motion vector calculator so as to assist in calculating the motion vector by comparing it with the previously interpolated signal.
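A hedged sketch of that interpolation step: for the frame times a low-motion group skipped, its value can be estimated linearly from its actual samples so that every group shares a common time grid before matching. `np.interp` performs the piecewise-linear estimate; the function name and the example grids are illustrative:

```python
import numpy as np

def fill_missing_frames(sample_times, sample_values, all_times):
    """Linearly interpolate a sparsely sampled group onto the full frame grid.

    sample_times  : times at which the low-motion group was actually sampled
    sample_values : the group's beamformed values at those times
    all_times     : the full grid of frame times used by the high-motion groups
    """
    return np.interp(all_times, sample_times, sample_values)
```

After this step, all groups have a value at every frame time and can be matched and synthesized into one continuous image sequence.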
- the 3D mode processor 350 may analyze the output signals of different depths and elevations output from the beamforming unit to generate a 3D volume and render combinations thereof.
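A minimal sketch of the volume-building step: frames beamformed at adjacent elevations are stacked into a 3D array, from which a simple rendering such as a maximum-intensity projection can be taken. The stacking axis and the MIP rendering are illustrative choices, not the patent's rendering method:

```python
import numpy as np

def build_volume(frames):
    """Stack 2D frames (depth x lateral) acquired at adjacent elevations
    into a 3D volume, and compute one simple rendering of it."""
    vol = np.stack(frames, axis=0)  # shape: (n_elevations, depth, lateral)
    mip = vol.max(axis=0)           # maximum-intensity projection along elevation
    return vol, mip
```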
- the B mode processor 330 may extract and process the B mode component from the ultrasound data.
- the image generator 360 may generate an ultrasound image in which the intensity of the signal is expressed as brightness based on the B mode component extracted by the B mode processor 330.
- the D mode processor 340 may extract the Doppler component from the ultrasound data, and the image generator 360 may generate a Doppler image representing the movement of the object in color or waveform based on the extracted Doppler component.
- the image generator 360 may generate an ultrasound image by temporally or spatially combining the data processed by the data processor 310.
- the image generator 360 may include an image synthesizer 370 and an image reconstructor 380.
- the image synthesizer 370 may synthesize the temporally or spatially divided, sampled beamforming output signals after the image matching process, and then recombine them temporally or spatially to express the ultrasound image.
- the image reconstructor 380 may reconstruct the distorted ultrasound image signal by using an estimation function or an interpolation function for a data processing process, a diagnosis process, or other reasons.
- the image generator 360 may generate an elastic image that images the deformation degree of the object 20 according to the pressure.
- the image generator 360 may express various additional information in text or graphics on the ultrasound image.
- the generated ultrasound image may be stored in the memory 500.
- the communication unit 400 may be connected to the network 30 by wire or wirelessly to communicate with an external device or a server.
- the communicator 400 may exchange data with a hospital server or another medical device in the hospital connected through a PACS (Picture Archiving and Communication System).
- the communication unit 400 may perform data communication according to a digital imaging and communications in medicine (DICOM) standard.
- the communication unit 400 may transmit and receive data related to diagnosis of the object, such as an ultrasound image, ultrasound data, and Doppler data of the object 20 through the network 30, and may be photographed by another medical device such as CT, MRI, or X-ray. A medical image can also be sent and received.
- the communication unit 400 may receive information on a diagnosis history, a treatment schedule, and the like of a patient from a server and use the same to diagnose the object 20.
- the communication unit 400 may perform data communication with a portable terminal of a doctor or a patient, as well as a server or a medical device in a hospital.
- the communication unit 400 may be connected to the network 30 by wire or wirelessly to exchange data with the server 32, the medical device 34, or the portable terminal 36.
- the communication unit 400 may include one or more components that enable communication with an external device. For example, it may include a short range communication module 410, a wired communication module 420, and a mobile communication module 430.
- the short range communication module 410 may be a module for short range communication within a predetermined distance.
- Short-range communication technologies include wireless LAN, Wi-Fi, Bluetooth, Zigbee, Wi-Fi Direct (WFD), ultra wideband (UWB), infrared communication (IrDA, Infrared Data Association), Bluetooth Low Energy (BLE), Near Field Communication (NFC), and the like, but are not limited thereto.
- the wired communication module 420 refers to a module for communication using an electrical signal or an optical signal.
- Wired communication technologies include a twisted-pair cable, a coaxial cable, an optical fiber cable, an Ethernet cable, and the like.
- the mobile communication module 430 may transmit / receive a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
- the wireless signal may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
- the memory 500 may store various types of information processed by the ultrasound imaging apparatus 2.
- the memory 500 may store medical data related to the diagnosis of an object, such as input/output ultrasound data and ultrasound images, or may store algorithms or programs executed in the ultrasound imaging apparatus 2.
- the memory 500 may be implemented as various types of storage media such as a flash memory, a hard disk, and an EEPROM. Also, the ultrasound imaging apparatus 2 may use web storage or a cloud server that performs the storage function of the memory 500 on the web.
- the display unit 600 may display and output the generated ultrasound image.
- the display unit 600 may display and output not only an ultrasound image but also various information processed by the ultrasound imaging apparatus 2 on a screen through a graphical user interface (GUI). Meanwhile, the ultrasound imaging apparatus 2 may include two or more display units 600 according to an implementation form.
- the controller 700 may control the overall operation of the ultrasound imaging apparatus 2. That is, the controller 700 may control the operations of the probe 10, the beamformer 100, the motion detector 200, the sampling unit 130, the image processor 300, the communicator 400, the memory 500, the display unit 600, and the input unit 800 illustrated in FIG. 2.
- the input unit 800 may be a means for receiving data for controlling the ultrasound imaging apparatus 2 from a user.
- the input unit 800 may include a hardware configuration such as a keypad, a mouse, a touch panel, a touch screen, a trackball, and a jog switch, but is not limited thereto.
- Various input means, such as an ECG module, a respiration module, a voice recognition sensor, a gesture recognition sensor, a fingerprint sensor, an iris recognition sensor, a depth sensor, and a distance sensor, may be further included.
- Some or all of the probe 10, the beamforming unit 100, the image processing unit 300, the communication unit 400, the memory 500, the input unit 800, and the control unit 700 may be operated by software modules.
- the present invention is not limited thereto, and some of the above-described configurations may be operated by hardware.
- at least some of the beamformer 100, the image processor 300, and the communicator 400 may be included in the controller 700, but are not limited thereto.
- FIG. 3 shows a cross section of a plurality of transducer arrays corresponding to a plurality of channels in the ultrasonic probe according to an embodiment.
- the ultrasound imaging apparatus may include an ultrasound probe 10, and the ultrasound probe 10 may collect information about a target portion using ultrasound.
- the ultrasound probe 10 may have a structure capable of detecting a 3D volume.
- the ultrasonic probe 10 may be a probe having a plurality of transducers arranged in a matrix form.
- a plurality of transducers arranged in a matrix form may output a plurality of echo signals, and a three-dimensional volume may be generated by accumulating the output echo signals.
- the ultrasound probe 10 may include transducers arranged in a row and a configuration for moving them. More specifically, rails may be provided at both ends of the plurality of transducers arranged in a line, in a direction perpendicular to the direction in which the transducers are arranged. The plurality of transducers arranged in a row may be moved along the rails in the scanning direction to obtain a plurality of echo signals, and the obtained echo signals may be accumulated to generate a three-dimensional volume.
- In the following, the ultrasonic probe 10 will be described as having transducers arranged in a matrix form.
- the ultrasonic probe 10 may have a plurality of ultrasonic transducers t1 to t10 installed at one end thereof.
- the ultrasonic transducers t1 to t10 may generate a corresponding ultrasonic wave according to a signal or power applied thereto, irradiate the ultrasonic wave to the object, receive echo ultrasonic waves reflected from the object, and generate and output an echo signal.
- the ultrasonic transducers t1 to t10 may be supplied with power from an external power supply device or an internal power storage device such as a battery; according to the applied power, a piezoelectric vibrator, a thin film, or the like vibrates and generates ultrasonic waves.
- each ultrasonic transducer may convert received echo ultrasound into an echo signal: as the piezoelectric material or thin film vibrates upon receiving the ultrasound, it generates an alternating current with a frequency corresponding to the vibration frequency.
- the generated echo signal may be transmitted to the main body through the plurality of channels c1 to c10.
- the above-described ultrasonic transducers t1 to t10 may be magnetostrictive ultrasonic transducers using the magnetostrictive effect of a magnetic material, piezoelectric ultrasonic transducers using the piezoelectric effect of a piezoelectric material, or capacitive micromachined ultrasonic transducers (cMUTs) that transmit and receive ultrasonic waves using the vibration of hundreds or thousands of microfabricated thin films.
- Other types of transducers capable of generating ultrasonic waves from electrical signals, or electrical signals from ultrasonic waves, may also be used as the ultrasonic transducers t1 to t10 described above.
- the beamformer 100 may include a transmission beamformer 110 and a reception beamformer 120.
- the transmission beamformer 110 may perform transmit beamforming by using the transmission signal generator 112 and the time delay unit 114.
- Transmission beamforming refers to focusing the ultrasonic waves generated from at least one transducer T at a focal point. That is, the transducers T are driven to generate ultrasonic waves in an appropriate order so as to compensate for the differences in the time at which the waves from each transducer reach the focal point.
- the transmission signal generator 112 of the transmission beamformer 110 may generate a transmission signal for at least one transducer T according to a control signal of the controller 700. The transmission signals, in the form of high-frequency alternating currents, may be generated in a number corresponding to the number of transducers. The transmission signals generated by the transmission signal generator 112 may be transmitted to the time delay unit 114.
- the time delay unit 114 may adjust the time to reach the corresponding transducer T by applying a time delay to each transmission signal.
- When the transmission signal delayed by the time delay unit 114 is applied to a transducer T, the transducer T generates an ultrasonic wave corresponding to the frequency of the transmission signal.
- Ultrasound generated at each transducer T is focused at a focal point. The position of the focal point where the ultrasound generated by the transducer T is focused may vary depending on what type of delay pattern is applied to the transmission signal.
- FIG. 4 illustrates five transducers t1 to t5 and three delay patterns that may be applied to their transmission signals, drawn as a thick solid line, a medium solid line, and a thin solid line.
- When the delay pattern drawn as the thick solid line is applied to the transmission signals, the ultrasonic waves generated by the transducers t1 to t5 may be focused at a first focal point F1.
- When the delay pattern drawn as the medium solid line is applied, the ultrasonic waves generated by the transducers t1 to t5 may be focused at a second focal point F2, which is farther than the first focal point F1.
- When the delay pattern drawn as the thin solid line is applied, the ultrasonic waves generated by the transducers t1 to t5 may be focused at a third focal point F3, which is farther than the second focal point F2.
- the position of the focus may vary according to the delay pattern applied to the transmission signal generated by the transmission signal generator 112.
- When one delay pattern is applied, the ultrasound irradiated to the object may be fixed-focused at a single focal point; when other delay patterns are applied, the ultrasound irradiated to the object may be multi-focused at several focal points.
- the ultrasonic waves generated by each transducer T may be fixedly focused at one point or may be focused at multiple points.
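- The relationship between a delay pattern and the focal position can be sketched as follows. This is a minimal illustration assuming a linear array with uniform element pitch and a nominal speed of sound in tissue; the function name and parameter values are assumptions, not taken from the patent.

```python
import math

def transmit_delays(num_elements, pitch_m, focus_m, c=1540.0):
    """Per-element transmit delays (in seconds) so that the waves from a
    linear array arrive at an on-axis focal point at the same instant.
    c is an assumed nominal speed of sound in tissue (m/s)."""
    center = (num_elements - 1) / 2.0
    # Distance from each element to the focal point.
    dists = [math.hypot((i - center) * pitch_m, focus_m)
             for i in range(num_elements)]
    d_max = max(dists)
    # Elements farther from the focus fire first; the farthest gets zero delay.
    return [(d_max - d) / c for d in dists]

# A shallower focus needs a more strongly curved delay pattern, which is
# how different patterns select different focal points such as F1, F2, F3.
delays_near = transmit_delays(5, 0.3e-3, 20e-3)
delays_far = transmit_delays(5, 0.3e-3, 60e-3)
```

Changing `focus_m` changes the delay pattern applied to the transmission signals, which corresponds to moving the focal point, as described for F1, F2, and F3 above.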
- Ultrasound irradiated into the object may be reflected at a target site within the object, and the reflected echo ultrasound may be received by the transducer T.
- the transducer T may convert the received echo ultrasound into an electrical signal and output the electrical signal.
- the signal output from the transducer T may be amplified and filtered, and then converted into a digital signal and provided to the reception beamformer 120.
- the reception beamformer 120 may include a parallax correction unit 126 and a focusing unit 128 to perform receive beamforming on the reception signals S converted into digital signals. Receive beamforming corrects the time differences existing between the reception signals S output from the transducers T and focuses the corrected signals.
- the parallax correction unit 126 delays the received signals S output from the transducers T for a predetermined time so that the received signals are transmitted to the focusing unit 128 at the same time.
- the focusing unit 128 may focus the received signals S whose parallaxes are corrected by the parallax correction unit 126 into one.
- the focusing unit 128 may apply a predetermined weight, that is, a beamforming coefficient, to each input signal so as to emphasize a particular reception signal S relative to the other reception signals, or to attenuate it.
- the focused receive signal S may be provided to the motion detector 200 and the sampling unit 130.
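- The parallax correction and focusing steps above amount to a delay-and-sum operation. The sketch below is a simplified illustration assuming integer-sample delays and optional apodization weights (the beamforming coefficients); it is not the patent's implementation.

```python
def delay_and_sum(channel_signals, delays_samples, weights=None):
    """Receive beamforming sketch: shift each channel's signal by its
    per-channel delay (in samples) so the echoes line up, scale it by a
    beamforming coefficient, and sum the aligned channels into one signal."""
    n_ch = len(channel_signals)
    n = len(channel_signals[0])
    if weights is None:
        weights = [1.0] * n_ch
    out = [0.0] * n
    for ch in range(n_ch):
        d = delays_samples[ch]
        w = weights[ch]
        for i in range(n):
            j = i + d  # channel ch received this echo d samples late
            if 0 <= j < n:
                out[i] += w * channel_signals[ch][j]
    return out

# Channel 1 is a copy of channel 0 arriving one sample later; with the
# right delays the echoes add coherently.
focused = delay_and_sum([[0, 1, 2, 3, 0, 0],
                         [0, 0, 1, 2, 3, 0]], [0, 1])
```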
- With reference to FIGS. 6 to 10, the following describes how the motion detection unit divides the target part into a plurality of groups, detects a motion vector for each group, and samples the beamforming output signal according to the motion vectors.
- the divider may divide the target portion into a plurality of groups and assign a unique identification number.
- the division unit may divide the target region into a plurality of groups by the number of divisions previously stored in the memory, or divide the target region into a plurality of groups by the division number input by the user through the input unit.
- the divider may divide the target region into a plurality of groups by grouping regions of the target region having similar motion vectors based on the previously calculated motion vector.
- various methods of dividing the target portion into a plurality of groups and calculating a motion vector for each group may be used by the divider.
- the motion vector calculator may calculate a motion vector for each group by inputting a plurality of groups of beamforming output signals or signals interpolated from sampling signals.
- the motion vector calculator may determine the amount of motion of the target region by comparing the interpolation signals of one group with the interpolation signals of the other group.
- An example of calculating the motion vector may be expressed by Equation 1 below.
- [Equation 1] MV_r = min_n [ ( I_r'(p_r, t_r) * I_n'(p_n, t_n) ) / ( |I_r'(p_r, t_r)| × |I_n'(p_n, t_n)| ) ]
- Equation 1 is an equation for calculating the motion vector of one group, where:
- p_r is a pixel of the one group, and t_r is the sampling time of the one group;
- I_r(p_r, t_r) is the sampling signal of the one group, and I_r'(p_r, t_r) is the signal obtained by interpolating the sampling signal of the one group;
- p_n is a pixel of the nth group, and t_n is the sampling time of the nth group;
- I_n(p_n, t_n) is the sampling signal of the nth group, and I_n'(p_n, t_n) is the signal obtained by interpolating the sampling signal of the nth group;
- * is the cross-correlation operator, and × is the multiplication operator.
- the motion vector calculator cross-correlates I_r'(p_r, t_r), the interpolated sampling signal of one group, with I_1'(p_1, t_1), the interpolated sampling signal of the first of the remaining groups, and computes the first result value by dividing the cross-correlation value by the product of the absolute value of I_r'(p_r, t_r) and the absolute value of I_1'(p_1, t_1).
- computing in the same way up to the nth group yields the nth result value, and the motion vector calculator may determine the smallest of the first through nth result values as the motion vector of the one group.
- the motion detector may detect the motion of the target part by calculating a motion vector of one group by using a motion vector calculation equation such as Equation 1.
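- A minimal sketch of the Equation 1 computation. It assumes the cross-correlation is evaluated at zero lag and that the absolute values are signal magnitudes (Euclidean norms); the function names are hypothetical.

```python
import math

def norm_xcorr(a, b):
    """Zero-lag normalized cross-correlation of two equal-length signals:
    sum(a * b) divided by the product of the two signal magnitudes."""
    num = sum(x * y for x, y in zip(a, b))
    den = (math.sqrt(sum(x * x for x in a))
           * math.sqrt(sum(y * y for y in b)))
    return num / den if den else 0.0

def group_motion_metric(ref_signal, other_group_signals):
    """Per Equation 1: correlate one group's interpolated sampling signal
    with each remaining group's signal and keep the smallest result value."""
    return min(norm_xcorr(ref_signal, s) for s in other_group_signals)
```

A result close to 1 means the group closely resembles a neighboring group; the smallest result value over the remaining groups is taken as the group's motion value.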
- the motion vector calculator may determine the amount of motion of the target region by comparing the current interpolation signal of one group with the previous interpolation signal.
- An example of calculating the motion vector may be expressed by Equation 2 below.
- [Equation 2] MV_r = min [ ( I_r'(p_r, t_r) * I_p'(p_p, t_p) ) / ( |I_r'(p_r, t_r)| × |I_p'(p_p, t_p)| ) ]
- Equation 2 is an equation for calculating the motion vector of one group, where:
- p_r is a pixel of the one group, and t_r is the sampling time of the one group;
- I_r(p_r, t_r) is the sampling signal of the one group, and I_r'(p_r, t_r) is the signal obtained by interpolating the sampling signal of the one group;
- p_p is a pixel of the one group at the previous time, and t_p is the previous sampling time;
- I_p(p_p, t_p) is the previous sampling signal of the one group, and I_p'(p_p, t_p) is the signal obtained by interpolating the previous sampling signal;
- * is the cross-correlation operator, and × is the multiplication operator.
- the motion vector calculator cross-correlates I_r'(p_r, t_r), the interpolated current sampling signal of one group, with I_p'(p_p, t_p), the interpolated previous sampling signal of the same group, and obtains a matrix for the motion vector by dividing the cross-correlation values by the product of the absolute value of I_r'(p_r, t_r) and the absolute value of I_p'(p_p, t_p). Thereafter, the motion vector calculator may determine the entry with the smallest value in this matrix as the current motion vector of the one group.
- the motion vector calculator may calculate motion vectors of other groups through the same calculation for other groups.
- the motion detector may detect the motion of the target part by calculating a motion vector of one group by using a motion vector calculation equation such as Equation 2.
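- Equation 2 compares a group against its own previous frame. The sketch below uses the same zero-lag correlation to label groups as high- or low-motion; the threshold value and function names are illustrative assumptions, not values from the patent.

```python
import math

def norm_xcorr(a, b):
    """Zero-lag normalized cross-correlation, as in Equation 2."""
    num = sum(x * y for x, y in zip(a, b))
    den = (math.sqrt(sum(x * x for x in a))
           * math.sqrt(sum(y * y for y in b)))
    return num / den if den else 0.0

def classify_groups(current, previous, threshold=0.9):
    """Correlate each group's current interpolated signal with its
    previous one; a low correlation indicates a high-motion group."""
    return {gid: ("high_motion" if norm_xcorr(sig, previous[gid]) < threshold
                  else "low_motion")
            for gid, sig in current.items()}

labels = classify_groups({"g1": [1, 2, 3], "g2": [1, 2, 3]},
                         {"g1": [1, 2, 3], "g2": [3, 2, 1]})
```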
- the motion vector calculated by the above-mentioned method may be transferred to the memory, the controller and the sampling unit.
- FIG. 6 illustrates the concept of sampling the beamforming output signals differently for each group and then matching and combining the sampling signals.
- Based on the motion vector, the sampling unit may increase the sampling period of the low-motion group 131 beyond a preset period; that is, when sampling the low-motion group 131, the sampling unit may sample the beamforming signal at a lower sampling frequency.
- Based on the motion vector, the sampling unit may reduce the sampling period of the high-motion group 132 below the preset period; that is, when sampling the high-motion group 132, the sampling unit may sample the beamforming signal at a higher sampling frequency.
- the preset motion vector and the preset sampling period of the sampling unit may be values input by the user through the input unit or may be values stored in the memory.
- various values for reducing the ultrasound image distortion due to the motion of the target portion may be used as examples of the preset motion vector and the preset sampling period of the sampling unit.
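- The period adjustment described above can be sketched as a simple rule that halves or doubles a preset base period depending on a motion threshold. The factor of two and the threshold are illustrative assumptions, not values from the patent.

```python
def sampling_period(motion, base_period, motion_threshold):
    """Adapt a group's sampling period to its detected motion: groups
    with much motion are sampled more often (shorter period), groups
    with little motion less often (longer period)."""
    if motion > motion_threshold:
        return base_period / 2.0   # high motion -> sample twice as often
    return base_period * 2.0       # low motion -> sample half as often
```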
- the image generating unit synthesizes the sampling signal of the low-motion group 131 and the sampling signal of the high-motion group 132 to form the whole ultrasound image 133, and may generate the final whole ultrasound image 134 as a single ultrasound image by spatially synthesizing the image signals through matching.
- FIG. 7A illustrates the concept of sampling the beamforming output signal with a different period for each group,
- FIG. 7B illustrates the concept of interpolating the beamforming output signals sampled with a different period for each group,
- and FIG. 7C illustrates the concept of synthesizing the interpolated sampling signals.
- at time t, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 141 and the low-motion group 142, so that the ultrasound image signal 143 may be sampled over the x-axis, which is the lateral direction of the target portion, and the z-axis, which is its axial direction.
- at time t + 1, the sampling unit samples only the beamforming output signal of the high-motion group 144 and does not sample the beamforming output signal of the low-motion group 145. Therefore, the ultrasound image signal 146 may be sampled only for the high-motion group 144, over the x-axis (lateral) and the z-axis (axial).
- at time t + 2, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 147 and the low-motion group 148, so that the ultrasound image signal 149 may be sampled over the x-axis (lateral) and the z-axis (axial) of the target region.
- at time t, the image processing unit may output the ultrasound image signal 153 by performing only spatial matching on the sampled beamforming signals of the high-motion group 151 and the low-motion group 152.
- at time t + 2, the image processor may likewise perform only spatial matching on the ultrasound image signal 159, in which the beamforming output signals of both the high-motion group 157 and the low-motion group 158 were sampled.
- Since the sampling unit did not sample the low-motion group 145 at time t + 1, the image processor matches the beamforming output signal of the low-motion group 155 through temporal matching, based on the sampling signals of the low-motion groups 142 and 148 at times t and t + 2.
- the image processor may then output the temporally matched signal of the low-motion group 155 and the sampling signal of the high-motion group 154 as a single ultrasound image signal 156 through spatial matching.
- one ultrasound image signal 250 may be obtained through temporal or spatial matching and synthesis of the ultrasound image signals of the high-motion group 251 and the low-motion group 252.
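- The temporal and spatial matching above can be sketched in two steps: reconstruct the low-motion group's missing frame at t + 1 by linear interpolation of its frames at t and t + 2, then compose it with the high-motion group's sample. The composition here is a simple concatenation for illustration; real spatial matching would use the groups' spatial coordinates.

```python
def interpolate_missing(frame_t, frame_t2):
    """Temporal matching sketch: estimate the un-sampled frame at t + 1
    as the midpoint of the frames sampled at t and t + 2."""
    return [(a + b) / 2.0 for a, b in zip(frame_t, frame_t2)]

def compose(high_motion_frame, low_motion_frame):
    """Spatial matching sketch: place both groups' pixels back into a
    single image signal (here, simple concatenation)."""
    return high_motion_frame + low_motion_frame
```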
- FIG. 8A illustrates the concept of sampling the beamforming output signal with different periods and time points for each group,
- FIG. 8B illustrates the concept of interpolating the beamforming output signals sampled with different periods and time points for each group,
- and FIG. 8C illustrates the concept of synthesizing the interpolated sampling signals.
- the motion detection unit divides the target region into four groups, calculates a motion vector for each of the four groups, and classifies the target region into a high-motion group 161 and first, second, and third low-motion groups.
- at time t, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 161 and the first low-motion group 162, so that the ultrasound image signal 163 may be sampled over the x-axis (lateral) and the z-axis (axial) of the target site.
- at time t + 1, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 164 and the second low-motion group 165, so that the ultrasound image signal 166 may be sampled over the x-axis (lateral) and the z-axis (axial) of the target part.
- at time t + 2, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 167 and the third low-motion group 168, so that the ultrasound image signal 169 may be sampled over the x-axis (lateral) and the z-axis (axial) of the target region.
- Since the sampling unit sampled each of the three low-motion groups only once across times t, t + 1, and t + 2 (the first group 172, the second group 175, and the third group 178), the image processor matches the beamforming signals of the low-motion groups that were not sampled at each time through temporal matching, based on the sampling signals of the first group 172, the second group 175, and the third group 178.
- By spatially matching the temporally matched signals of the low-motion groups 172, 175, and 178 with the sampling signals of the high-motion groups 171, 174, and 177, single ultrasound image signals 173, 176, and 179 may be output at times t, t + 1, and t + 2, respectively.
- one ultrasound image signal 260 may be obtained by temporally or spatially matching and synthesizing the ultrasound image signals of the high-motion group 261 and the low-motion group 262.
- FIG. 9A illustrates the concept of sampling the beamforming output signal such that groups adjacent in the height direction have different periods,
- and FIG. 9B illustrates the concept of interpolating the beamforming output signals sampled such that groups adjacent in the height direction have different periods.
- at time t, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 181 and the low-motion group 182, so that the ultrasound image signal 183 may be sampled over the x-axis, which is the lateral direction of the target portion, the y-axis, which is the height direction of the target site, and the z-axis, which is the axial direction of the target site.
- the sampling unit may sample the beamforming output signal by dividing the groups adjacent to the y-axis, which is the height direction of the target portion, into different groups.
- at time t + 1, the sampling unit samples only the beamforming output signal of the high-motion group 184 and does not sample the beamforming output signal of the low-motion group 185. Therefore, the ultrasound image signal 186 may be sampled only for the high-motion group 184, over the x-axis (lateral), the y-axis (height direction), and the z-axis (axial).
- at time t + 2, the sampling unit simultaneously samples the beamforming output signals of the high-motion group 187 and the low-motion group 188, so that the ultrasound image signal 189 may be sampled over the x-axis (lateral), the y-axis (height direction), and the z-axis (axial) of the target region.
- at time t, the image processor may output the ultrasound image signal 193, obtained by sampling the beamforming output signals of the high-motion group 191 and the low-motion group 192, by performing only spatial matching.
- at time t + 2, the image processing unit may likewise perform only spatial matching on the ultrasound image signal 199, in which the beamforming output signals of both the high-motion group 197 and the low-motion group 198 were sampled.
- at time t + 1, the image processing unit may perform spatial matching on the signal of the low-motion group 185, reconstructed through temporal matching, and the sampling signal of the high-motion group 194 to output one ultrasound image signal 196.
- FIG. 10A illustrates an ultrasound image acquired by the ultrasound imaging apparatus when there is no motion,
- FIG. 10B illustrates an ultrasound image when there is motion in the axial direction,
- and FIG. 10C illustrates an ultrasound image interpolated when there is motion in the axial direction.
- when the target part has no motion, the measured ultrasound image 361 is focused at the desired position, parallel to the lateral axis.
- when the target part has motion in the axial direction, the measured ultrasound image 362 is focused at an unintended position that is not parallel to the lateral axis.
- the ultrasound imaging apparatus may reduce this distortion by dividing the target region with motion in the axial direction into a plurality of groups and sampling the beamforming output signal of each group based on the motion vectors of the divided groups.
- the ultrasound imaging apparatus may acquire the ultrasound image 363 focused on a desired position of the object in parallel to the lateral axis as illustrated in FIG. 10C through the control of the above-mentioned ultrasound imaging apparatus.
- FIG. 11 illustrates a concept for explaining a three-dimensional volume and observation information.
- the 3D volume may be composed of elements called voxels.
- a voxel is a compound word of volume and pixel. If a pixel defines a point in a two-dimensional plane, the voxel may define a point in three-dimensional space. That is, unlike pixels that can be represented by x and y coordinates, the voxels may be represented by x, y, and z coordinates.
- the ultrasound probe 10 may probe a three-dimensional volume.
- the three-dimensional volume is composed of a plurality of voxels.
- the reception beamformer 120 may detect, as the observation center, the voxel VP1 at the center of the voxels closest to the probe along the axial direction, which is the probe direction, and may detect the depth direction of the probe as the observation direction VD1.
- observation information may be changed according to a user input. For example, when the user changes the observation center VP1 to VP2 and changes the viewing direction from VD1 to VD2, the observation information may be changed even if the probe direction of the probe does not change.
- the 3D mode processor 350 may generate a 3D volume by combining one or more output signals output from the reception beamformer 120, and render and output the generated 3D volume.
- the 3D mode processor 350 may include a volume generator 351, a volume converter 352, a renderer 353, and an image corrector 354.
- the volume generator 351 may generate a 3D volume by combining one or more 2D images. Although the 3D volume may be generated in various ways, for convenience of description it will be described below as being generated by data interpolation.
- FIG. 13 illustrates a concept for explaining creating a three-dimensional volume.
- a plurality of two-dimensional cross-sectional images 355, 356, 357, and 358 may be obtained based on one or more output signals received from the reception beamformer 120.
- the volume generator 351 arranges the acquired two-dimensional cross-sectional images 355, 356, 357, and 358 in three dimensions according to their positions, and then generates a three-dimensional volume 359 by interpolating the values between the cross-sectional images.
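- The data-interpolation step can be sketched as follows: stack the acquired 2D cross-sections along z and linearly interpolate intermediate slices between each adjacent pair. Slices are represented as lists of rows; the `upsample` factor and function name are illustrative assumptions.

```python
def build_volume(slices, upsample=2):
    """Generate a 3D volume from 2D cross-sectional images by arranging
    them along z and linearly interpolating (upsample - 1) intermediate
    slices between each acquired pair."""
    volume = []
    for k in range(len(slices) - 1):
        a, b = slices[k], slices[k + 1]
        for step in range(upsample):
            t = step / upsample
            volume.append([[(1 - t) * a[i][j] + t * b[i][j]
                            for j in range(len(a[0]))]
                           for i in range(len(a))])
    volume.append(slices[-1])  # keep the last acquired slice
    return volume

# Two 1x1 slices with values 0 and 2 yield an interpolated middle slice of 1.
vol = build_volume([[[0.0]], [[2.0]]], upsample=2)
```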
- the 3D volume may be generated in the form of a matrix. That is, each voxel may be represented by the XYZ axis. Meanwhile, each voxel may be represented by a scalar value or a vector value.
- a three-dimensional volume may be generated in a binary volume data format, or in a multi-volume data format in which each voxel value can express a measurable quantity such as density or temperature.
- the voxel value can be used to obtain values of optical elements of the voxel, such as opacity and color values.
- the color value can be calculated by a color transfer function that defines the relationship between the voxel value and the color value.
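- A transfer function can be sketched as a mapping from a normalized voxel value to opacity and color. The piecewise-linear shape and the `lo`/`hi` breakpoints below are illustrative assumptions, not values from the patent.

```python
def opacity_transfer(voxel, lo=0.2, hi=0.8):
    """Piecewise-linear opacity transfer function: values at or below lo
    are fully transparent, values at or above hi fully opaque."""
    if voxel <= lo:
        return 0.0
    if voxel >= hi:
        return 1.0
    return (voxel - lo) / (hi - lo)

def color_transfer(voxel):
    """Grayscale color transfer function mapping a normalized voxel
    value to an (r, g, b) triple."""
    g = max(0.0, min(1.0, voxel))
    return (g, g, g)
```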
- the volume converter 352 may perform a scan conversion of the 3D volume. According to an embodiment, when the ultrasonic probe 10 is linear, separate volume conversion may not be necessary; however, when the probe has another form, for example a convex form, the volume needs to be converted into a Cartesian coordinate system.
- since the display screen uses a Cartesian coordinate system, the volume must also be expressed in Cartesian coordinates in order to visualize the volume of the object on the display screen in three dimensions.
- if the volume generated by the volume generator 351 is in the form of a concentric spherical coordinate system, as shown on the left side of FIG. 14, a coordinate transformation is required in the process of visualizing the volume on the display screen.
- the volume converter 352 performs a three-dimensional scan conversion that maps each voxel of the concentric spherical volume shown on the left side of FIG. 14 to the corresponding position in the Cartesian coordinate system, yielding the Cartesian-coordinate volume shown on the right side of FIG. 14.
- FIG. 15 illustrates a concept for describing rendering processed by an ultrasound imaging apparatus.
- the renderer 353 may perform volume rendering based on the 3D volume and generate a projection image of the object. More specifically, volume rendering is the process of visualizing the 3D volume as a 3D image, and volume rendering methods may be broadly divided into surface rendering methods and direct rendering methods.
- the surface rendering method estimates surface information from the volume based on a scalar value and a spatial variation set by the user, and may visualize it by converting it into geometric elements such as polygons or curved patches.
- a representative surface rendering method is the marching cubes algorithm.
- Direct rendering is a way of directly visualizing a volume without the intermediate step of turning a surface into a geometric element.
- Direct rendering may be divided into an image-order algorithm and an object-order algorithm according to a method of searching a volume.
- the object ordering algorithm searches a volume according to a storage order and synthesizes each voxel to a corresponding pixel.
- a representative example is a splatting method.
- the image order algorithm is a method of determining each pixel value in the order of scan lines of an image, and sequentially determining pixel values corresponding to a volume along a ray starting from each pixel.
- Representative methods of image ordering algorithms are ray casting and ray tracing.
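- The per-ray work of the ray-casting algorithm can be sketched as front-to-back compositing of (color, opacity) samples taken along a ray, with early termination once the ray is effectively opaque. This is a generic illustration of the technique, not the patent's renderer.

```python
def cast_ray(samples):
    """Front-to-back compositing along one ray: samples is a sequence of
    (color, opacity) pairs ordered from the eye into the volume."""
    color, alpha = 0.0, 0.0
    for c, a in samples:
        color += (1.0 - alpha) * a * c
        alpha += (1.0 - alpha) * a
        if alpha >= 0.99:  # early ray termination
            break
    return color, alpha
```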
- Hereinafter, an ultrasound imaging apparatus that divides the target portion into groups, samples the beamforming output signal for each group, and then matches and synthesizes the sampled signals will be described.
- FIG. 16 is a flowchart of a control method for an ultrasound imaging apparatus that divides the target portion into groups to detect a motion vector for each group, adjusts the sampling of the beamforming output signals according to the detected motion vectors, and matches and synthesizes the sampled signals of each group.
- The ultrasound imaging apparatus may radiate the generated ultrasound to the target site through the ultrasound probe and receive the echo ultrasound signal reflected from the target site (S10).
- The reception beamformer of the beamforming unit may convert the received echo signal into a digital signal, compensate for the delay times, and focus the echo signals to output a beamforming signal (S20).
- The motion detection unit may divide the output beamforming signal into n groups according to the division number set through the user input unit or a division number set in the system (S30).
- The motion detection unit calculates a motion vector by cross-correlating the beamforming output signal of each of the n divided groups, or a signal interpolated from its sampling signal, with the signal of another group or with the signal of a previous time (S40).
- The sampling unit may determine the sampling period and sampling time point of the corresponding group n_p according to the calculated motion vector (S50), and sample and store the beamforming output signal of the corresponding group n_p (S60).
- The image processor may match and synthesize the sampling signals of the n groups by using linear interpolation (S80).
- The motion detection unit determines whether any of the current n groups has motion (S90); if even one group has motion, the operations of S30 to S80 are repeated, and if no group has motion, the operation may be terminated.
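The flowchart steps S30 to S80 can be sketched as follows, assuming one-dimensional beamforming signals per group. The cross-correlation motion estimate, the threshold-based choice of sampling period, and the linear-interpolation resampling are simplified stand-ins for the patent's processing; all function names and parameters are hypothetical.

```python
import numpy as np

def motion_vector(cur, prev):
    """Estimate per-group motion as the lag that maximises the
    cross-correlation of two 1-D beamforming signals (S40)."""
    corr = np.correlate(cur - cur.mean(), prev - prev.mean(), mode="full")
    return np.argmax(corr) - (len(prev) - 1)

def control_loop(groups, prev_groups, threshold=2, fast=1, slow=4):
    """Sketch of S30-S80: for each group, pick a sampling period from
    the motion magnitude (large motion -> dense sampling), sample the
    beamforming output, then resample back onto a common grid by
    linear interpolation so the groups can be matched and synthesized."""
    out = []
    for cur, prev in zip(groups, prev_groups):
        mv = motion_vector(cur, prev)                         # S40
        period = fast if abs(mv) > threshold else slow        # S50
        samples = cur[::period]                               # S60
        grid = np.arange(len(cur))
        out.append(np.interp(grid, grid[::period], samples))  # S80
    return np.stack(out)
```

The loop would be repeated while any group still shows motion (S90), mirroring the termination condition in the flowchart.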
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Acoustics & Sound (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Veterinary Medicine (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Multimedia (AREA)
- Gynecology & Obstetrics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
Description
Claims (16)
- An ultrasound imaging apparatus comprising: an ultrasound probe that transmits ultrasound to an object and receives ultrasound reflected from the object; a beamforming unit that beamforms the received ultrasound and outputs a beamforming signal; a sampling unit that differently adjusts the number of samplings of the beamforming signal according to the amount of motion of the object; and an image processing unit that matches and synthesizes the sampled signals.
- The ultrasound imaging apparatus of claim 1, further comprising a motion detection unit that divides the beamforming signal into groups and calculates and stores a motion vector by comparing the beamforming signals of one group with those of the remaining groups.
- The ultrasound imaging apparatus of claim 2, wherein the motion detection unit calculates and stores a motion vector by comparing the beamforming signal of each divided group with a previous beamforming signal.
- The ultrasound imaging apparatus of claim 2, wherein the sampling unit adjusts the sampling period of the beamforming signal to be equal to or shorter than a preset period when the motion vector of the divided group is equal to or less than a preset value, and adjusts the sampling period of the beamforming signal to exceed the preset period when the motion vector of the divided group exceeds the preset value.
- The ultrasound imaging apparatus of claim 4, wherein the sampling unit differently adjusts the sampling period and the sampling time point for each group when the motion vectors of the plurality of groups exceed a preset value.
- The ultrasound imaging apparatus of claim 2, wherein the motion detection unit divides groups located at adjacent elevations into different groups.
- The ultrasound imaging apparatus of claim 4, wherein the image processing unit interpolates the beamforming signal using linear interpolation, and the beamforming signal of the motion detection unit is the signal interpolated by the image processing unit.
- The ultrasound imaging apparatus of claim 4, wherein the image processing unit interpolates the beamforming signal using linear interpolation, and replaces a sampling signal with the interpolated signal of the same group when the motion vector of the divided group is equal to or less than a preset value.
- A control method of an ultrasound imaging apparatus, comprising: transmitting ultrasound to an object and receiving ultrasound reflected from the object; beamforming the received ultrasound and outputting a beamforming signal; sampling the beamforming signal while differently adjusting the number of samplings according to the amount of motion of the object; and matching and synthesizing the sampled signals.
- The control method of claim 9, further comprising: dividing the beamforming signal into groups; and calculating and storing a motion vector by comparing the beamforming signals of one group with those of the remaining groups.
- The control method of claim 10, wherein the calculating and storing of the motion vector comprises calculating and storing a motion vector by comparing the beamforming signal of each divided group with a previous beamforming signal.
- The control method of claim 10, wherein the sampling comprises adjusting the sampling period of the beamforming signal to be equal to or shorter than a preset period for a group whose calculated and stored motion vector is equal to or less than a preset value, and adjusting the sampling period of the beamforming signal to exceed the preset period for a group whose motion vector exceeds the preset value.
- The control method of claim 12, wherein groups whose motion vectors exceed the preset value in the sampling are each sampled with differently adjusted sampling periods and sampling time points.
- The control method of claim 10, wherein the dividing of the signal into groups comprises dividing groups located at adjacent elevations into different groups.
- The control method of claim 12, further comprising interpolating the beamforming signal divided into groups using linear interpolation, wherein the beamforming signal divided into groups in the calculating and storing of the motion vector is the interpolated signal.
- The control method of claim 12, further comprising interpolating the beamforming signal divided into groups using linear interpolation, wherein the matching and synthesizing of the signals comprises replacing a sampling signal of a group whose motion vector is equal to or less than a preset value with the interpolated signal of the group whose motion vector is equal to or less than the preset value.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/537,717 US11272906B2 (en) | 2014-12-19 | 2014-12-19 | Ultrasonic imaging device and method for controlling same |
KR1020177006745A KR102336172B1 (ko) | 2014-12-19 | 2014-12-19 | 초음파 영상 장치 및 그 제어방법 |
PCT/KR2014/012581 WO2016098929A1 (ko) | 2014-12-19 | 2014-12-19 | 초음파 영상 장치 및 그 제어방법 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2014/012581 WO2016098929A1 (ko) | 2014-12-19 | 2014-12-19 | 초음파 영상 장치 및 그 제어방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016098929A1 true WO2016098929A1 (ko) | 2016-06-23 |
Family
ID=56126801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2014/012581 WO2016098929A1 (ko) | 2014-12-19 | 2014-12-19 | 초음파 영상 장치 및 그 제어방법 |
Country Status (3)
Country | Link |
---|---|
US (1) | US11272906B2 (ko) |
KR (1) | KR102336172B1 (ko) |
WO (1) | WO2016098929A1 (ko) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018226918A1 (en) * | 2017-06-09 | 2018-12-13 | Tokitae Llc | Ultrasound systems and methods of identifying fluids in body regions using the same |
KR101975462B1 (ko) * | 2018-06-14 | 2019-05-07 | 엘아이지넥스원 주식회사 | 다중 펄스 압축 기법 기반의 레이더 수신신호 처리 방법 및 그를 위한 장치 |
KR20210014284A (ko) * | 2019-07-30 | 2021-02-09 | 한국과학기술원 | 다양한 센서 조건에서의 초음파 영상 처리 장치 및 그 방법 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022018932A (ja) * | 2020-07-16 | 2022-01-27 | コニカミノルタ株式会社 | 超音波診断装置、超音波信号処理方法、及びプログラム |
JP7449879B2 (ja) * | 2021-01-18 | 2024-03-14 | 富士フイルムヘルスケア株式会社 | 超音波診断装置及びその制御方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011024719A (ja) * | 2009-07-23 | 2011-02-10 | Olympus Corp | 画像形成装置 |
KR20120095731A (ko) * | 2011-02-21 | 2012-08-29 | 삼성전자주식회사 | 초음파 영상 생성 방법 및 장치 |
JP2014050648A (ja) * | 2012-09-10 | 2014-03-20 | Toshiba Corp | 超音波診断装置 |
KR20140098843A (ko) * | 2011-12-01 | 2014-08-08 | 마우이 이미징, 인코포레이티드 | 핑-기반 및 다수 개구부 도플러 초음파를 이용한 모션 검출 |
US20140288876A1 (en) * | 2013-03-15 | 2014-09-25 | Aliphcom | Dynamic control of sampling rate of motion to modify power consumption |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5899861A (en) * | 1995-03-31 | 1999-05-04 | Siemens Medical Systems, Inc. | 3-dimensional volume by aggregating ultrasound fields of view |
US6468216B1 (en) * | 2000-08-24 | 2002-10-22 | Kininklijke Philips Electronics N.V. | Ultrasonic diagnostic imaging of the coronary arteries |
US20050096538A1 (en) * | 2003-10-29 | 2005-05-05 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US20070276236A1 (en) * | 2003-12-16 | 2007-11-29 | Koninklijke Philips Electronics N.V. | Ultrasonic diagnostic imaging system with automatic control of penetration, resolution and frame rate |
JP5209213B2 (ja) * | 2006-01-10 | 2013-06-12 | 株式会社東芝 | 超音波診断装置及び超音波画像生成プログラム |
EP1872724B1 (en) * | 2006-01-10 | 2019-08-28 | Toshiba Medical Systems Corporation | Ultrasonograph and ultrasonogram creating method |
JP4975098B2 (ja) * | 2006-05-12 | 2012-07-11 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 動き補償による超音波合成送信フォーカシング |
FR2907301A1 (fr) * | 2006-10-12 | 2008-04-18 | Thomson Licensing Sas | Procede d'interpolation d'une image compensee en mouvement et dispositif pour la mise en oeuvre dudit procede |
US9320491B2 (en) * | 2011-04-18 | 2016-04-26 | The Trustees Of Columbia University In The City Of New York | Ultrasound devices methods and systems |
KR20140132821A (ko) * | 2013-05-07 | 2014-11-19 | 삼성전자주식회사 | 영상 처리 유닛, 초음파 영상 장치 및 영상 생성 방법 |
US10034657B2 (en) * | 2013-07-26 | 2018-07-31 | Siemens Medical Solutions Usa, Inc. | Motion artifact suppression for three-dimensional parametric ultrasound imaging |
US20150272547A1 (en) * | 2014-03-31 | 2015-10-01 | Siemens Medical Solutions Usa, Inc. | Acquisition control for elasticity ultrasound imaging |
-
2014
- 2014-12-19 US US15/537,717 patent/US11272906B2/en active Active
- 2014-12-19 WO PCT/KR2014/012581 patent/WO2016098929A1/ko active Application Filing
- 2014-12-19 KR KR1020177006745A patent/KR102336172B1/ko active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011024719A (ja) * | 2009-07-23 | 2011-02-10 | Olympus Corp | 画像形成装置 |
KR20120095731A (ko) * | 2011-02-21 | 2012-08-29 | 삼성전자주식회사 | 초음파 영상 생성 방법 및 장치 |
KR20140098843A (ko) * | 2011-12-01 | 2014-08-08 | 마우이 이미징, 인코포레이티드 | 핑-기반 및 다수 개구부 도플러 초음파를 이용한 모션 검출 |
JP2014050648A (ja) * | 2012-09-10 | 2014-03-20 | Toshiba Corp | 超音波診断装置 |
US20140288876A1 (en) * | 2013-03-15 | 2014-09-25 | Aliphcom | Dynamic control of sampling rate of motion to modify power consumption |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018226918A1 (en) * | 2017-06-09 | 2018-12-13 | Tokitae Llc | Ultrasound systems and methods of identifying fluids in body regions using the same |
US11446004B2 (en) | 2017-06-09 | 2022-09-20 | Tokitae Llc | Ultrasound systems and methods of identifying fluids in body regions using the same |
KR101975462B1 (ko) * | 2018-06-14 | 2019-05-07 | 엘아이지넥스원 주식회사 | 다중 펄스 압축 기법 기반의 레이더 수신신호 처리 방법 및 그를 위한 장치 |
KR20210014284A (ko) * | 2019-07-30 | 2021-02-09 | 한국과학기술원 | 다양한 센서 조건에서의 초음파 영상 처리 장치 및 그 방법 |
KR102317337B1 (ko) | 2019-07-30 | 2021-10-26 | 한국과학기술원 | 다양한 센서 조건에서의 초음파 영상 처리 장치 및 그 방법 |
Also Published As
Publication number | Publication date |
---|---|
KR20170095799A (ko) | 2017-08-23 |
KR102336172B1 (ko) | 2021-12-08 |
US11272906B2 (en) | 2022-03-15 |
US20180000458A1 (en) | 2018-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015102474A1 (en) | Ultrasound diagnostic apparatus, ultrasound image capturing method, and computer-readable recording medium | |
WO2016098929A1 (ko) | 초음파 영상 장치 및 그 제어방법 | |
WO2016032298A1 (en) | Ultrasound diagnosis apparatus for self-diagnosis and remote-diagnosis, and method of operating the ultrasound diagnosis apparatus | |
US9486182B2 (en) | Ultrasound image generating device, ultrasound image generating method, and program | |
WO2016182166A1 (en) | Method of displaying elastography image and ultrasound diagnosis apparatus performing the method | |
WO2015130070A2 (en) | Ultrasound diagnostic apparatus and method of operating the same | |
WO2015076508A1 (en) | Method and apparatus for displaying ultrasound image | |
WO2016186279A1 (en) | Method and apparatus for synthesizing medical images | |
EP3071113A1 (en) | Method and apparatus for displaying ultrasound image | |
WO2018182308A1 (ko) | 초음파 진단 장치 및 그 동작 방법 | |
WO2020036321A1 (ko) | 빔포밍 장치, 빔포밍 장치의 제어방법 및 초음파 진단 장치 | |
WO2016047895A1 (en) | Ultrasound imaging apparatus and method using synthetic aperture focusing | |
WO2015084113A1 (en) | Ultrasonic imaging apparatus and control method therefor | |
WO2015160047A1 (en) | Medical imaging apparatus and method of operating the same | |
WO2018056572A1 (en) | Ultrasound probe, ultrasound imaging apparatus, ultrasound imaging system, and method for controlling thereof | |
WO2015002400A1 (en) | Ultrasonic diagnostic apparatus and method of operating the same | |
WO2017179782A1 (ko) | 초음파 진단 장치 및 그 제어 방법 | |
EP3229693A1 (en) | Ultrasound diagnostic apparatus and method of operating the same | |
WO2015072808A1 (en) | Ultrasonic imaging apparatus and method of controlling the same | |
WO2015137616A1 (en) | Medical diagnostic apparatus and operating method thereof | |
WO2016047892A1 (en) | Ultrasound diagnostic apparatus and method of generating ultrasound image | |
WO2017082625A1 (ko) | 프로브 장치 및 그 제어 방법 | |
WO2017135698A1 (ko) | 초음파 진단장치 및 그 제어방법 | |
WO2018139707A1 (ko) | 대상체에 관한 횡파 탄성 데이터를 표시하는 초음파 진단 장치 그 동작 방법 | |
EP3515315A1 (en) | Ultrasound probe, ultrasound imaging apparatus, ultrasound imaging system, and method for controlling thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14908495 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20177006745 Country of ref document: KR Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 15537717 Country of ref document: US |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14908495 Country of ref document: EP Kind code of ref document: A1 |