US20150025385A1 - Ultrasonic imaging device - Google Patents

Info

Publication number
US20150025385A1
Authority
US
United States
Prior art keywords
plural
beamforming
peripheral information
operator
delay
Prior art date
Legal status
Abandoned
Application number
US14/378,507
Other languages
English (en)
Inventor
Teiichiro Ikeda
Hiroshi Masuzawa
Marie Tabaru
Shinta Takano
Kunio Hashiba
Current Assignee
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Assigned to HITACHI, LTD. reassignment HITACHI, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TABARU, MARIE, HASHIBA, KUNIO, IKEDA, TEIICHIRO, MASUZAWA, HIROSHI, TAKANO, Shinta
Publication of US20150025385A1 publication Critical patent/US20150025385A1/en

Classifications

    • A61B 8/5269: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves, involving detection or reduction of artifacts
    • A61B 8/14: Echo-tomography
    • A61B 8/4483: Constructional features of the diagnostic device characterised by features of the ultrasound transducer
    • A61B 8/465: Displaying means adapted to display user selection data, e.g. icons or menus
    • A61B 8/5207: Devices using data or image processing involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • G01S 15/8915: Short-range pulse-echo imaging systems using a static transducer array
    • G01S 7/52049: Techniques for image enhancement using correction of medium-induced phase aberration
    • G10K 11/346: Sound-focusing or directing using electrical steering of transducer arrays; circuits therefor using phase variation
    • G01S 15/8993: Three-dimensional imaging systems

Definitions

  • the present invention relates to an ultrasound imaging technique for imaging the inside of a test subject, through the use of ultrasound waves.
  • Ultrasound imaging is a technique for non-invasively creating an image of the inside of a test subject, such as a human body, through the use of ultrasound waves (sound waves not intended for hearing, generally high-frequency sound waves of 20 kHz or higher).
  • a medical ultrasound imaging apparatus will be briefly explained.
  • An ultrasound probe transmits the ultrasound waves to the inside of a patient body, and receives echo signals reflected from the inside of the patient.
  • the received signals are subjected to signal processing in one or both of the ultrasound probe and the main unit of the ultrasound imaging apparatus, and thereafter transferred to a monitor, and then, an ultrasound image is displayed thereon.
  • a transmit beamformer in the main unit of the ultrasound imaging apparatus generates signals of a transmission beam, allowing the signals to pass through the transmit-receive separation circuit (T/R), and thereafter transfers the signals to the ultrasound probe.
  • the ultrasound probe sends out ultrasound waves.
  • after receiving the echo signals from the internal body, the ultrasound probe transmits the signals to the main unit of the imaging apparatus.
  • the received signals pass through the transmit-receive separation circuit again, are subjected to the beamforming process in the receive beamformer, and are then transmitted to an image processor.
  • the image processor executes various image processing through the use of various filters, a scan converter, and the like. Finally, the monitor displays an ultrasound image.
  • a general ultrasound diagnostic apparatus is built on three techniques: transmit beamforming, receive beamforming, and backend image processing.
  • the beamformers for transmitting and receiving perform signal processing at the RF (high-frequency) level.
  • the algorithms and implementation architecture of the beamformers determine the basic image quality of the ultrasound image; the beamformers therefore serve as major parts of the apparatus.
  • the receive beamformer assigns a delay time to the received signal (received data) of each of the plural elements that constitute the ultrasound probe. The delay times distribute the amount of delay in a concave profile, in accordance with the relation between the focal position and the element positions; after focus is thus virtually achieved at a certain point in space, the received data items are summed up.
  • This method is referred to as beamforming according to the delay-and-sum method.
  • the received data items received by the plural elements in the ultrasound diagnostic apparatus are multiplied by a fixed weight vector stored in the diagnostic apparatus, and the weighted data items are summed up. This process is performed not only in the receive beamformer but also, in a similar manner, in the transmit beamformer.
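The fixed-weight delay-and-sum described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, array shapes, and the use of integer sample delays are all assumptions.

```python
import numpy as np

def delay_and_sum(rf_data, delays_samples, apodization):
    """Fixed-weight delay-and-sum beamforming along one raster.

    rf_data:        (num_elements, num_samples) received RF signals
    delays_samples: (num_elements,) integer delay (in samples) per element,
                    chosen so that all element signals align at the focus
    apodization:    (num_elements,) fixed weight vector (e.g. a window)
    """
    num_elements, num_samples = rf_data.shape
    aligned = np.zeros_like(rf_data)
    for ch in range(num_elements):
        d = delays_samples[ch]
        # shift each channel right by its delay so echoes line up in time
        aligned[ch, d:] = rf_data[ch, :num_samples - d]
    # weighted sum across the aperture yields the beamformed sample stream
    return apodization @ aligned
```

With uniform weights, three impulses that arrive one sample apart per element add coherently into a single peak after alignment.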
  • lateral resolution, however, is subject to constraints. Since transmission and reception of the ultrasound waves are performed by an array having a finite aperture size, diffraction at the edges of the aperture has an impact. If an infinitely long array could be prepared, the resolution could be enhanced without limit, in the same manner as in the depth direction. In practice, however, a physical restriction in designing the apparatus, namely the length of the array, has hampered enhancement of the lateral resolution.
  • improved lateral-resolution techniques have been reported, achieved by applying adaptive signal processing techniques, including the MVDR (Minimum Variance Distortionless Response; Capon) method developed in the field of mobile communication, to the beamformer of the received data (Non Patent Documents 1 to 6).
  • these adaptive methods are implemented by adaptively varying the complex components of the weight vector used for the delay-and-sum, on the basis of the correlation matrix of the received data.
  • in the conventional method the weight vector is a fixed value; in the adaptive method, the weight vector is computed from the received signal at each sample point in the time direction of the received signal, and the received signal is multiplied by the weight vector thus obtained.
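The per-sample adaptive weight can be illustrated with the standard MVDR (Capon) formula, w = R⁻¹a / (aᴴ R⁻¹a), which minimizes output power subject to the distortionless constraint wᴴa = 1. The diagonal loading term and all names below are assumptions made for a self-contained sketch, not the patent's formulation.

```python
import numpy as np

def mvdr_weight(x_snapshots, steering, diagonal_loading=1e-3):
    """MVDR (Capon) weight for one sample point.

    x_snapshots: (num_elements, num_snapshots) post-delay received data
                 used to estimate the spatial covariance matrix R
    steering:    (num_elements,) steering vector a for the look direction
    """
    num_el, num_snap = x_snapshots.shape
    # sample spatial covariance matrix
    R = (x_snapshots @ x_snapshots.conj().T) / num_snap
    # diagonal loading regularizes R against ill-conditioning
    R = R + diagonal_loading * (np.trace(R).real / num_el) * np.eye(num_el)
    Ri_a = np.linalg.solve(R, steering)          # R^-1 a
    return Ri_a / (steering.conj() @ Ri_a)       # normalize: w^H a = 1
```

By construction, the resulting weight satisfies the distortionless constraint exactly, whatever the data.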
  • the apparatus sets the focus of the receive beamformer on the assumption that the sound velocity is constant and the medium is homogeneous. Therefore, when distortion exists in the sound-wave propagation, the image appears blurred, or is formed at a position different from the actual position.
  • correction of wavefront distortion (wave aberration) is a long-standing problem, and aberration correction techniques utilizing cross-correlation processing have been studied.
  • an adaptive beamforming technique is therefore demanded that can reduce the number of ultrasound signals that fail to be picked up after being scattered around the focus, that is highly robust against the correlated noise caused by wavefront distortion, and that offers an image-quality compensation method with a relatively small processing load.
  • An object of the present invention is to provide an ultrasound imaging apparatus that is capable of compensating for the image-quality deterioration caused by inhomogeneity of the test subject medium.
  • the ultrasound imaging apparatus of the present invention is provided with plural elements configured to receive ultrasound signals from a test subject, a delay unit configured to delay each of the received signals on the plural elements, in association with a position of a predetermined receive focus, and generate post-delay received signals, a peripheral information operator configured to acquire information items as to plural points within a region including the receive focus and a peripheral region thereof, respectively from the post-delay received signals, a peripheral information combiner configured to combine the information items acquired as to the plural points respectively, and generate a beamforming output by using the information being combined, and an image processor configured to use the beamforming output to generate image data.
  • according to the present invention, it is possible to acquire the ultrasound signals scattered around the focus, reduce the number of ultrasound signals that fail to be picked up, achieve high robustness against the correlated noise caused by wavefront distortion, and compensate for image-quality deterioration without significantly increasing the processing load.
  • FIG. 1( a ) is a perspective view illustrating a schematic configuration of the ultrasound imaging apparatus of the present embodiment, and FIG. 1( b ) is a block diagram thereof;
  • FIG. 2 is a block diagram illustrating a configuration of the receive beamformer according to the first embodiment
  • FIG. 3 is a block diagram illustrating a configuration of the receive beamformer according to the second embodiment
  • FIG. 4 illustrates operations of the delay circuit according to the second embodiment
  • FIG. 5( a ) to FIG. 5( f ) illustrate effects of the embodiments of the present invention
  • FIG. 6 illustrates operations and effects of the signal processing in the second embodiment
  • FIG. 7 is a block diagram illustrating a configuration of the peripheral information combiner in the receive beamformer according to the third embodiment
  • FIG. 8 illustrates operations of the delay circuit according to the fourth embodiment
  • FIG. 9 is a block diagram illustrating the receive beamformer according to the fourth embodiment.
  • FIG. 10( a ) to FIG. 10( c ) illustrate the steering vectors of the receive beamformer according to the fourth embodiment
  • FIG. 11 is a block diagram illustrating a configuration of the receive beamformer according to the fifth embodiment.
  • FIG. 12 is a block diagram illustrating a configuration of a temporary storage in the receive beamformer according to the sixth embodiment
  • FIG. 13 is a block diagram illustrating operations and effects of the signal processing in the sixth embodiment.
  • FIG. 14 is a block diagram illustrating a configuration of the temporary storage in the receive beamformer according to the seventh embodiment
  • FIG. 15 is a block diagram illustrating operations and effects of the signal processing in the seventh embodiment
  • FIG. 16 is a block diagram illustrating a configuration of the temporary storage in the receive beamformer according to the eighth embodiment.
  • FIG. 17 is a block diagram illustrating operations and effects of the signal processing in the eighth embodiment.
  • FIG. 18 is a block diagram illustrating a configuration of the temporary storage in the receive beamformer according to the ninth embodiment.
  • FIG. 19 is a block diagram illustrating operations and effects of the signal processing in the ninth embodiment.
  • FIG. 20( a ) to FIG. 20( g ) illustrate operations of the adjuster for setting the steering vectors according to the tenth embodiment
  • FIG. 21 is a block diagram illustrating a configuration of the receive beamformer according to the eleventh embodiment.
  • FIG. 22( a ) illustrates a mismatch between the signal spreading area by reflection at a focus (elliptic region 2205 ) and the point spread function 2204 based on the fixed steering angle, in association with the position of the ultrasound image
  • FIG. 22( b ) illustrates the intensity distribution of the ultrasound image (B-mode image) obtained at the fixed steering angle
  • FIG. 22( c ) illustrates relations between the point spread function 2204 at the steering angle set by the eleventh embodiment and the elliptic region 2205
  • FIG. 22( d ) illustrates the intensity distribution of the ultrasound image (B-mode image) obtained at the steering angle set in the eleventh embodiment
  • FIG. 23( a ) is a graph showing the signal strength or the intensity distribution of the B-mode image in the twelfth embodiment
  • FIG. 23( b ) is a graph showing the distribution of the steering angle obtained from the graph of FIG. 23( a ) and a graph showing the distribution of a predetermined steering angle;
  • FIG. 24 is a perspective view of the console of the ultrasound imaging apparatus according to the present embodiments.
  • FIG. 25 is a perspective view of the console of another specific example and the monitor of the ultrasound imaging apparatus according to the present embodiments;
  • FIG. 26( a ) illustrates the ultrasound images (B-mode images) showing six point scatterers when no wavefront distortion occurs in the ultrasound beam
  • FIG. 26( b ) illustrates the ultrasound images (B-mode images) of six point scatterers when wavefront distortion occurs in the ultrasound beam
  • FIG. 27 is a graph showing the intensity distribution of one point scatterer and surroundings thereof, with regard to the ultrasound images of FIG. 26 .
  • the ultrasound imaging apparatus of the first embodiment is provided with plural elements configured to receive ultrasound signals from a test subject, a delay unit configured to delay each of the received signals on the plural elements, in association with a position of a predetermined receive focus and generate post-delay received signals, a peripheral information operator, a peripheral information combiner, and an image processor.
  • the peripheral information operator acquires information items as to plural points within a region including the receive focus and the peripheral region of the receive focus, respectively from the post-delay received signals.
  • the peripheral information combiner combines the information items respectively acquired as to the plural points, and uses the information being combined to generate a beamforming output.
  • the image processor uses the beamforming output to generate image data. Specific explanations will be provided in the following.
  • FIG. 1( a ) is a perspective view of the apparatus
  • FIG. 1( b ) is a block diagram schematically showing the inner configuration.
  • the ultrasound imaging apparatus is provided with an ultrasound probe 101 , an apparatus main body 102 , a monitor 103 , and a console 110 .
  • the apparatus main body 102 incorporates a transmit beamformer 104 , a transmit-receive separation circuit (T/R) 107 , a receive beamformer 108 , an image processor 109 , and a controller 111 for controlling operations of those described above.
  • the ultrasound probe 101 is provided with plural elements (ultrasound transducers) 106 arranged in the form of an array.
  • the transmit beamformer 104 generates signals for a transmission beam, and transfers the transmission beam to the ultrasound probe 101 via the transmit-receive separation circuit 107 .
  • the ultrasound probe 101 transmits ultrasound waves from the plural elements, directed to the inside of the test subject 100 .
  • the ultrasound probe 101 receives echo signals reflected inside the body. The received signals go through the transmit-receive separation circuit 107 again and are subjected to the beamforming operation and the like in the receive beamformer 108.
  • the received signals to which the beamforming operation has been applied are transferred to the image processor 109, where various image processes, such as filtering and scan conversion, are executed on them, and an ultrasound image is then generated.
  • the ultrasound image is transferred to the monitor 103 , and displayed thereon.
  • FIG. 2 is a block diagram illustrating the configuration of the receive beamformer 108 .
  • the receive beamformer 108 incorporates a delay circuit 204 , a peripheral information operator 205 , a peripheral information combiner 206 , and a bypass line 207 from the output of the delay circuit 204 to the peripheral information combiner.
  • each of those components in the receive beamformer 108 may be configured as an independent circuit. Alternatively, the operations of those components may be implemented by a memory storing programs in advance and a CPU that reads and executes the programs.
  • the active channel setter (not illustrated) sets an active channel in a part of the finite aperture of the ultrasound probe 101 , so as to perform the receive beamforming process.
  • the elements 106 within a partial range are assumed as the active channel 201 , among the plural elements 106 constituting the ultrasound probe 101 which has received echoes associated with one transmit ultrasound beam, and the receive beamformer generates one line of image data (raster: beamforming output y(n)) in the ultrasound propagating direction, by using the received signals in the active channel 201 .
  • the positions of the elements 106 in the channel are displaced gradually, thereby sequentially configuring the active channel 202 , active channel 201 , and active channel 203 , and the rasters are generated respectively for the active channels 202 , 201 , and 203 .
  • Resultant arrangement of the rasters forms an ultrasound image.
  • the delay circuit 204 is a block provided as the stage preceding the peripheral information operator 205. It gives a delay time to the received signal (received data) on each of the plural elements 106 constituting the active channel 201, in accordance with the element position, so as to focus the received signals on a certain point virtually existing in space.
  • the delay time given to each of the elements 106 is prepared in advance, as a delay-time set, in association with the object positions in the space of the imaging target.
  • the delay circuit 204 selects the delay-time set in response to the focal position being set, and gives different delay times to the respective received signals on the plural elements. With this configuration, it is possible to perform the focusing process covering the entire space of the imaging target.
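A minimal sketch of how such a delay-time set could be derived from geometry, assuming a constant sound velocity and a linear array. The function name, parameter defaults, and the choice of aligning to the latest arrival are hypothetical illustrations, not the patent's stored delay-time sets.

```python
import numpy as np

def receive_delays(element_x, focus, sound_speed=1540.0, fs=50e6):
    """Per-element receive delay (in samples) for one focal point.

    element_x:   (num_elements,) lateral element positions [m]
    focus:       (x, z) focal position [m]; z is depth
    sound_speed: assumed constant sound velocity [m/s]
    fs:          sampling frequency [Hz]
    The resulting delay profile is concave: the center element, being
    closest to the focus, receives first and is delayed the most so that
    all element signals align at the latest arrival.
    """
    fx, fz = focus
    dist = np.sqrt((element_x - fx) ** 2 + fz ** 2)   # element-to-focus path
    tau = (dist.max() - dist) / sound_speed           # align to latest arrival
    return np.round(tau * fs).astype(int)
```

For a symmetric aperture focused on its axis, the outer elements get zero delay and the center element gets the maximum, giving the concave profile described above.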
  • the peripheral information operator 205 performs the operation to collect information items that are scattered around the receive focus due to the reflection from the receive focus.
  • the peripheral information operator 205 performs the adaptive signal processing, thereby obtaining adaptive weights as to two or more points, on the receive focus and in its periphery (peripheral region). With this configuration, it is possible to collect peripheral information with respect to the receive focus.
  • the periphery corresponds to the area over which signals are scattered by reflection at the receive focus. It has a nearly elliptical shape (an ellipsoid in the three-dimensional case), whose size depends on conditions such as the receive elements, the transmit frequency, the receive frequency, the ultrasound pulse waveform, the size and pitch of the element array, and the point spread function. Taking as the central axis the axis connecting the receive focus and the center point of the active channel, the long axis spreads roughly within about ±5° in the active-channel direction, and the short axis, lying in the ultrasound propagating direction orthogonal to the active-channel direction, is equal in length to or a little shorter than that in the active-channel direction.
  • the focus is positioned at an intersection of the short axis and the long axis.
  • the periphery also corresponds to the area where, because of wavefront distortion, reflected waves from the medium surrounding the focus are mixed into the received waves; these reflected waves may appear in the received signals as noise signals correlated with the information from the focus.
  • the peripheral information combiner 206 combines (compounds) the peripheral information items collected by the peripheral information operator 205 and obtains a final beamforming output y(n).
  • the peripheral information combiner 206 combines the adaptive weights of at least two points obtained by the peripheral information operator 205. The combined weight thus obtained is assigned to the signals of the elements 106 after the delay process performed by the delay circuit 204, and the weighted signals are subjected to beamforming, thereby combining the peripheral information items.
  • the finally obtained beamforming output y(n) is sequentially transferred to the image processor 109, raster by raster, in association with the active channels 202, 201, and 203, and an ultrasound image is generated on the basis of the combined beamforming output.
  • because the peripheral information operator 205 collects information from the surroundings of the receive focus, the information reflected at the focus and scattered around it can be collected even when an adaptive beamformer with high directivity is used. Furthermore, by using the collected peripheral information, the asymmetry of the correlated signals is exploited, which cancels (decorrelates) the correlated noise caused by the wavefront distortion. Therefore, even when the medium of the test subject is inhomogeneous, deterioration of the image quality is prevented despite the influence of the correlated noise caused by the wavefront distortion, achieving a highly robust ultrasound imaging apparatus.
  • the peripheral information operator 205 in the ultrasound imaging apparatus of the first embodiment performs the adaptive beamforming, thereby obtaining an adaptive weight as information.
  • the peripheral information operator 205 obtains the adaptive weights as to plural points, by using steering vectors that are directional vectors connecting a predetermined position on a surface of the array made up of the plural elements 106 as described in the first embodiment, and the plural points.
  • the peripheral information operator 205 incorporates a matrix operator configured to generate a covariance matrix by using the post-delay received signals, and a weight vector operator configured to obtain adaptive weight vectors as to the plural points, from the steering vectors.
  • the peripheral information combiner 206 incorporates a weight combiner configured to sum the adaptive weights as to plural points obtained by the peripheral information operator 205 and generate a combined weight, and an inner-product operator configured to multiply the post-delay received signals by the combined weight to generate a beamforming output.
  • a fixed apodization multiplier may further be provided between the peripheral information operator 205 and the weight combiner, the fixed apodization multiplier being configured to multiply each of the adaptive weights as to the plural points obtained in the peripheral information operator by a predetermined fixed weight.
  • the inner-product operator multiplies each of the post-delay received signals by the combined weight, and thereafter sums the post-delay received signals, so as to generate the beamforming output.
  • the ultrasound imaging apparatus of the second embodiment of the present invention will now be specifically explained. It is to be noted that configurations similar to those of the first embodiment will not be explained again.
  • the peripheral information operator 205 obtains adaptive weights as to two or more points on the receive focus and in its periphery, thereby collecting the peripheral information of the receive focus. Specifically, upon calculating the adaptive weights, plural steering vectors "a" are used so as to obtain the adaptive weight also for a point whose direction, seen from the elements in the active channel of the ultrasound probe, is displaced from the receive focus (i.e., a point for which the direction of the steering vector differs from the focus direction).
  • the peripheral information combiner 206 combines the adaptive weights of two or more points obtained by the peripheral information operator 205 , and uses thus obtained combined weight to perform beamforming process on the received signals.
  • the steering vectors are directional vectors connecting the center position of the active channel and the points as described above. Details of the steering vectors will be explained with reference to the formula (4) described below. Hereinafter, more specific explanations will be given.
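As an illustration only (the patent's own formula (4) is not reproduced here), a conventional narrowband steering vector for a linear array can be written as below; the frequency, element positions, and function name are assumptions.

```python
import numpy as np

def steering_vector(element_x, angle_rad, freq=5e6, sound_speed=1540.0):
    """Narrowband steering vector for a linear array (standard form).

    element_x: (num_elements,) element positions [m], measured from the
               active-channel center
    angle_rad: steering angle from the central axis (focus direction = 0)
    Each entry is a phase rotation matching the extra path length
    element_x * sin(angle) at the given frequency.
    """
    path = element_x * np.sin(angle_rad)
    return np.exp(-2j * np.pi * freq * path / sound_speed)
```

At zero steering angle the vector is all ones (all elements in phase toward the focus); at other angles the entries are unit-modulus phase factors.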
  • FIG. 3 is a block diagram illustrating a configuration of the receive beamformer 108 according to the second embodiment.
  • the receive beamformer 108 of this embodiment is an adaptive beamformer, in which the adaptive signal processing technique is applied to the beamformer.
  • the components other than the receive beamformer 108, in other words, the ultrasound probe 101, the transmit beamformer 104, and the transmit-receive separation circuit 107 positioned in the stage preceding the receive beamformer 108, and the image processor 109 and the monitor 103 positioned in the stage following it, are the same as those of the first embodiment. Their explanations and illustrations are therefore not repeated here.
  • the peripheral information operator 205 in the receive beamformer 108 is provided with the matrix operator 300 and the adaptive weight operator 301 .
  • the adaptive weight operator 301 incorporates plural weight vector operators 3021 , 3022 , and 3023 , and computes the weight vectors respectively as to the receive focus and the points surrounding thereof.
  • three weight vector operators 3021 to 3023 are arranged, and thus it is possible to obtain the weight vectors as to three points.
  • the number of the weight vector operators is not limited to three; any number may be applicable, as long as it is more than one.
  • the matrix operator 300 receives inputs from the delay circuit 204 (post-delay received data x(n)), and generates the spatial covariance matrix R(n).
  • the adaptive weight operator 301 uses the spatial covariance matrix to calculate the weight vectors w 1 (n), w 2 (n), and w 3 (n) as to the predetermined three points with different steering vector directions, out of the points including the receive focus and in the periphery thereof, and outputs the calculated weight vectors to the peripheral information combiner 206. Details of the steering vectors will be explained in the following.
  • the peripheral information combiner 206 is provided with the weight combiner 306 and the inner-product operator 307 .
  • the weight combiner 306 sums the plural weight vectors w 1 (n), w 2 (n), and w 3 (n) to combine the vectors, and generates a combined weight w sum (n).
  • the inner-product operator 307 uses this combined weight w sum (n) to assign weight on the post-delay received data x(n) that is inputted via the bypass line 207 , and then sums the data to obtain the beamforming output y(n). With this configuration, it is possible to obtain information in the form of one beamforming output, the information being acquired from the plural points within an identical raster and being different in the steering vector direction.
  • the inner-product operator 307 incorporates a multiplier 3071 configured to multiply the post-delay received data x(n) by the combined weight w sum (n), and an adder 3072 configured to calculate the sum of the post-delay data after the multiplication.
  • the fixed apodization multiplier 305 may be added to the peripheral information combiner 206 .
  • the fixed apodization multiplier 305 multiplies the weight vectors w 1 (n), w 2 (n), and w 3 (n) by a predetermined fixed weight, so as to assign the weight on the weight vectors.
  • the beamforming output y(n) obtained by the peripheral information combiner 206 is transferred to the image processor 109 .
  • the image processor 109 generates an ultrasound image on the basis of the beamforming result y(n).
  • FIG. 4 illustrates the operations of the delay circuit 204 , including some other parts.
  • K elements 106 constituting the active channel 201 receive K pieces of data.
  • the K pieces of received data are also referred to as the K-channel received data u(n).
  • the received data u(n) passes through the transmit-receive separation circuit 107 , and enters the delay circuit 204 .
  • the delay circuit 204 gives, to the received signals (received data 404) u 1 (n), u 2 (n), . . . , u K (n) of the K elements 106 constituting the active channel 201, delay times distributed in the form of the concave shape 405 centered on one point 401 in space, in association with the positions of the elements 106, and then obtains the post-delay received data x 1 (n), x 2 (n), . . . , x K (n).
  • n represents a certain time (snapshot time) in the time direction (the depth direction in the ultrasound propagating direction) of the ultrasound received signal.
  • by changing the delay distribution, each of the focuses 402 and 403 can likewise be obtained.
  • the received signal 404 on each element 106 is delayed, focusing on a desired point, and time-series data items 408 , 409 , and 410 of the received signals (post-delay received data) with aligned wavefront are obtained.
  • since the number of the elements constituting the active channel 201 (the number of channels) is K, the K post-delay received data items at a certain snapshot time n can be expressed as the vector x(n) on the left-hand side of the following formula (1).
  • x(n) = [x 1 (n), x 2 (n), . . . , x K (n)] T (1)
  • the post-delay received data vectors x(n) are inputted in the peripheral information operator 205 .
  • the matrix operator 300 firstly obtains the spatial covariance matrix R(n) according to the formula (2).
  • x(n) may be used as a real signal as it is, or it may be subjected to Hilbert transform or baseband modulation, so as to use the resultant complex data after the conversion.
  • R(n) in the formula (2) indicates an ensemble average of a product of the complex vector ⁇ (n) expressed by the formula (3) and its (complex) conjugate transpose vector ⁇ H (n).
  • alternatively, the averaging in the time direction may multiply each sample in the time direction by an arbitrary weight, such as a trapezoidal weight, before calculating the arithmetic mean.
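As a concrete illustration of the covariance estimate of formulas (2) and (3), the computation can be sketched in numpy. The function name, the plain arithmetic mean over N snapshots, and the toy data are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def spatial_covariance(X):
    """Estimate the spatial covariance matrix R(n) of formula (2).

    X: complex array of shape (K, N) -- post-delay data chi(n) for K
    channels over the N snapshots used for the time average.
    """
    K, N = X.shape
    # Average the outer products chi(n) chi(n)^H over the N snapshots.
    R = (X @ X.conj().T) / N
    return R

# Toy example: 4 channels, 64 snapshots of complex noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 64)) + 1j * rng.standard_normal((4, 64))
R = spatial_covariance(X)
# R is Hermitian (K x K) by construction.
print(np.allclose(R, R.conj().T))  # True
```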
  • the spatial covariance matrix R(n) outputted from the matrix operator 300 is subsequently inputted into the adaptive weight operator 301 .
  • the adaptive weight operator 301 that has received the spatial covariance matrix R(n) calculates weight vectors w(n) according to the MVDR method.
  • plural steering vectors “a” are used to obtain the peripheral information of the focus.
  • the steering vector a p is expressed by the formula (4).
  • the steering vector a p is a directional vector having K vector elements (indexed from 0 to (K−1)), the number being equal to the number of the active channels.
  • the steering vector is expressed by a function of the receive frequency f p and the angle ( ⁇ p , ⁇ p ) formed by the normal vector direction on the surface of the elements 106 and the steering vector (hereinafter, referred to as “steering angle”).
  • ⁇ p represents an aperture angle from the normal vector
  • ⁇ p represents a swivel angle from the array direction of the elements 106 .
  • the steering angle is expressed by the final form of the formula (4).
  • ⁇ p represents a wavelength of a sound wave associated with the frequency f p
  • d represents the distance between the centers of adjacent elements 106 (the element pitch).
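The steering-vector construction of formula (4) can be sketched as follows. The exact phase convention of the patent's formula is not reproduced in the text, so this sketch assumes the standard narrow-band linear-array form; all names and values are illustrative.

```python
import numpy as np

def steering_vector(K, d, lam, theta, phi=0.0):
    """Directional vector a_p of formula (4) for a K-element active channel.

    d: element pitch, lam: wavelength at f_p, theta: aperture angle from
    the element normal, phi: swivel angle from the array direction.
    Assumed standard linear-array phase convention.
    """
    k = np.arange(K)  # vector elements 0 .. K-1
    phase = 2.0 * np.pi * d * k * np.sin(theta) * np.cos(phi) / lam
    return np.exp(1j * phase)

a0 = steering_vector(K=8, d=0.3e-3, lam=0.3e-3, theta=0.0)
# theta = 0 (focus direction): all phases vanish, a is the all-ones vector.
print(np.allclose(a0, np.ones(8)))  # True
```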
  • the weight vector w(n) obtained by the MVDR method as to the direction of the steering vector a p described above is calculated by the formula (5) in this example. The weight vector operators 3021 to 3023 therefore compute the weight vectors w(n) for respectively different steering vectors a p , thereby obtaining the adaptive weight vectors w 1 (n) to w P (n), corresponding to the number P of the steering vectors a p (i.e., the number of the weight vector operators 3021 to 3023 ).
  • R(n) represents the spatial covariance matrix at a certain snap shot n in the time direction, being generated by the formula (2), and the superscript “ ⁇ 1” represents an inverse matrix.
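A minimal sketch of the MVDR weight of formula (5). The diagonal loading term is a common stabilization trick added here as an assumption; it is not part of the formula.

```python
import numpy as np

def mvdr_weight(R, a, loading=1e-6):
    """MVDR weight of formula (5): w = R^{-1} a / (a^H R^{-1} a).

    A small diagonal loading (an implementation choice) keeps the
    inversion stable when R is nearly singular.
    """
    K = R.shape[0]
    Rl = R + loading * np.trace(R).real / K * np.eye(K)
    Ria = np.linalg.solve(Rl, a)   # R^{-1} a without forming the inverse
    return Ria / (a.conj() @ Ria)

# The distortionless constraint of MVDR: w^H a == 1.
rng = np.random.default_rng(1)
X = rng.standard_normal((6, 128)) + 1j * rng.standard_normal((6, 128))
R = (X @ X.conj().T) / 128
a = np.ones(6, dtype=complex)
w = mvdr_weight(R, a)
print(np.isclose(w.conj() @ a, 1.0))  # True
```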
  • the number P of the steering vectors may be any value, as far as it is an integer of at least two. Any method for selecting the direction of the steering vector is applicable.
  • the focus is positioned on the central axis of the active channel 201 ; i.e., if the number of the active channels is an even number, it is positioned on the normal line (orthogonal to the active channel surface) passing through the midpoint between the (K/2)th and the ((K+2)/2)th elements, out of the K elements 106 constituting the active channel 201 .
  • the steering angles may be selected, for instance, at an equiangular pitch, or according to a non-linear rule, or by any irregular or random method.
  • the adaptive weight vector w(n) in the focus direction corresponds to the adaptive weight vector w(n) obtained by the conventional MVDR method.
  • the beamformer according to the present embodiment is different from the beamformer according to the ordinary MVDR method, in the point that various plural steering vectors a p are set, and adaptive weight vectors w 1 (n) to w P (n) are obtained for the respective vectors.
  • it is also possible not to use the focus direction on purpose; this further enhances the effect of the correlative noise canceling (decorrelation), and is therefore one of the preferred embodiments of the present invention.
  • the P adaptive weight vectors w 1 (n) to w P (n) outputted from the adaptive weight operator 301 are inputted into the peripheral information combiner 206 .
  • the peripheral information combiner 206 sums the P adaptive weight vectors w 1 (n) to w P (n) in the weight combiner 306, takes the arithmetic average, and calculates the combined weight vector w sum (n) as shown in the formula (6).
  • a fixed apodization multiplier 305 may be provided, so as to multiply each of the weight vectors w p (n) by a fixed apodization.
  • the fixed apodization can be applied with a distribution in which the value of the adaptive weight vector in the direction of θ = 0° is made larger and the values in the other directions are made smaller; this is implemented by the computation shown in the formula (7).
  • the combined weight vector w sum (n) is inputted into the inner-product operator 307 within the peripheral information combiner 206 .
  • the inner-product operator 307 incorporates the multiplier 3071 and the adder 3072 , performs the inner-product operation between the combined weight vector w sum (n) and the received data vectors x(n) transferred from the delay circuit 204 via the bypass line 207 , as shown in the formula (8), and then obtains the beamforming output y(n).
  • the multiplier 3071 obtains the product of the weight vector and each channel element of the post-delay received data vectors (1 to K).
  • the adder 3072 calculates a total sum of the K products obtained by the multiplier 3071 , and obtains a final output of the peripheral information combiner 206 (beamforming output y(n), scalar value).
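The weight combining of formulas (6) and (7) and the inner product of formula (8) can be sketched together. The apodization values and the toy weight matrix below are illustrative, not taken from the patent.

```python
import numpy as np

def combined_output(W, x, apod=None):
    """Formulas (6)-(8): combine P adaptive weights and beamform.

    W: (P, K) array, one weight vector per steering direction.
    x: K-channel post-delay snapshot x(n).
    apod: optional per-direction fixed apodization of formula (7);
    the default reduces to the arithmetic mean of formula (6).
    """
    P = W.shape[0]
    if apod is None:
        apod = np.ones(P) / P
    w_sum = (apod[:, None] * W).sum(axis=0)   # weight combiner 306
    return w_sum.conj() @ x                   # inner product y(n) = w_sum^H x(n)

# Toy example: only the first (focus-direction) row matches the
# aligned wavefront of all-ones data.
W = np.array([[1, 1, 1, 1],
              [1, 1j, -1, -1j],
              [1, -1j, -1, 1j]], dtype=complex)
x = np.ones(4, dtype=complex)
y = combined_output(W, x)
print(np.isclose(y, 4 / 3))  # True
```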
  • This beamforming output y(n) from the inner-product operator 307 is transferred to the image processor 109 .
  • the beamforming outputs y(n) for one raster obtained by the formula (8) are acquired respectively for the active channels, shifting on the received array from the active channel 201 to the active channels 202 , and 203 , and transferred to the image processor 109 , raster by raster.
  • the image processor 109, using a scan converter associated with the scanning system of the ultrasound probe 101, arranges all the rasters and generates a two-dimensional image.
  • backend image processing, such as various filtering processes, and computations by measurement applications are then performed.
  • the monitor 103 displays an ultrasound image and a computational result of the measurement applications.
  • plural steering vectors a p are set, the adaptive weight vectors w 1 (n) to w P (n) are obtained for the respective steering vectors, and the combined weight vector w sum (n) obtained by combining those adaptive weight vectors is used to perform beamforming of the received data vectors x(n). Then, one beamforming output y(n) is obtained.
  • the effect will be explained.
  • FIG. 5( a ) illustrates an example that data is received once from the active array 201 , focusing on the receive focus 2201 , and the receive beamforming is performed.
  • the receive beamforming is performed according to the conventional delay-and-sum method
  • signals in the region indicated by the point spread function 2202 around the receive focus 2201 may also be acquired by one-time receive beamforming.
  • the range of the signals being obtained by one-time receive beamforming is represented by the point spread function 2204 , and it is smaller than the range represented by the point spread function 2202 of the delay-and-sum method.
  • the signals of the object on the focus 2201 are reflected by the focus 2201 and distributed randomly in the periphery of the focus 2201 as indicated by the elliptic region 2205 .
  • This elliptic region 2205 corresponds to the periphery (peripheral region) of the first embodiment.
  • the adaptive weight vectors obtained by the adaptive beamformer, as to the respective plural points (here three points) having different steering angles are combined.
  • this process is equivalent to allowing the regions of the point spread function 2204 , corresponding to the respective peripheries of the plural points as shown in FIG. 5( b ), to overlap one another. Therefore, in the second embodiment, the region 2206 , obtained by the three displaced regions of the point spread function 2204 overlapping one another, becomes the region in which signals can be acquired by one-time receiving. Therefore, it is possible to acquire most of the signals of the object scattered in the elliptic region 2205 surrounding the focus 2201 , and this reduces the number of signals failing to be picked up.
  • by using the weight vectors having different steering angles, it is possible to utilize the asymmetry between the noise on the right side and the noise on the left side of the central axis, which are correlated with each other.
  • adaptive weights having different steering angles are combined, thereby canceling (performing decorrelation of) the correlative noise signals in the received signals, and therefore, it is possible to selectively eliminate highly correlative noise signals, without failing to pick up the information of the signals on the focus.
  • the effects described above improve on the disadvantages of the adaptive beamformer, such as failing to pick up signals due to its sharp beam directivity and vulnerability to wavefront distortion, thereby providing robustness against such influences.
  • the receive beamforming is performed at every receiving (raster by raster). This may reduce the noise included in the beamforming outputs 1504 , 1505 , and 1506 respectively from the rasters 1501 , 1502 , and 1503 , and attenuate the impact of the wavefront distortion. Therefore, an ultrasound image obtained by arranging those beamforming outputs of the respective rasters is a clear image with less noise, and with reduced impact of wavefront distortion caused by influences such as sound-velocity inhomogeneity in the living body, a scatterer distribution, and body motion.
  • the subarray matrix R ⁇ SUBl in the spatial average operation is expressed by a product of the subspace vectors ⁇ l (n) (Formula (10)) as shown in the formula (9).
  • the subspace vector χ̃ l (n) is a vector obtained by extracting a partial sequence of L elements from the post-delay received data (represented here by the generalized complex signal vector χ(n), though the real vector x(n) may be used instead) in association with the K active channels. Therefore, the total number of the subspace vectors is (K−L+1), with the index l (lowercase L) satisfying 0 ≤ l ≤ (K−L+1).
  • ⁇ l ⁇ ( n ) [ ⁇ l ( n ), ⁇ l+1 ( n ), . . . , ⁇ l+L ⁇ 1 ( n )] T (10)
  • displacing the samples one by one, while associating the main diagonal component of each subarray matrix with the main diagonal component of the spatial covariance matrix R(n), results in a spatial averaging process over the (K−L+1) subarray matrixes, and the subarray spatial covariance matrix R̃(n) shown in the formula (11) is obtained.
  • when the adaptive weight operator 301 computes with the subarray spatial covariance matrix R̃(n), it is substituted for R(n) in the above formula (5), and the weight vectors w p (n) are computed as shown in the formula (12). It is to be noted that the output from the matrix operator 300 in this case is (L×L) in size, and the number of the elements constituting the weight vectors w p (n) becomes L.
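The forward spatial averaging of formulas (9) to (11) admits a compact sketch; variable names are illustrative.

```python
import numpy as np

def subarray_covariance(x, L):
    """Forward spatial averaging of formulas (9)-(11).

    x: K-element post-delay snapshot chi(n).  The (K-L+1) overlapping
    L-element subvectors of formula (10) are formed, their outer
    products (formula (9)) are averaged, and the (L x L) subarray
    spatial covariance matrix R~(n) of formula (11) is returned.
    """
    K = x.shape[0]
    M = K - L + 1                        # number of subarrays
    R = np.zeros((L, L), dtype=complex)
    for l in range(M):
        sub = x[l:l + L]                 # chi~_l(n), formula (10)
        R += np.outer(sub, sub.conj())   # R~_SUBl, formula (9)
    return R / M

x = np.arange(1, 6, dtype=complex)       # K = 5 channels
R = subarray_covariance(x, L=3)
print(R.shape)                           # (3, 3)
print(R[0, 0].real)                      # mean of 1^2, 2^2, 3^2 = 14/3
```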
  • a forward and backward spatial averaging method may be employed.
  • the backward subarray matrix R ⁇ SUBl (n) may be obtained by a product of the backward subspace vector ⁇ ⁇ l (n) as shown in the formula (13).
  • the backward subspace vector is expressed by the formula (14).
  • the backward subarray spatial matrix R ⁇ (n) may be calculated as shown in the formula (15).
  • the forward and backward subarray spatial covariance matrix R FB (n) may be obtained. Similar to the forward spatial averaging, this forward and backward subarray spatial covariance matrix R FB (n) is substituted for R(n) in the formula (5) in the computations in the adaptive weight operator 301 , thereby computing the weight vectors w p (n) as shown in the formula (17). Also in this case, the output from the matrix operator 300 is (L ⁇ L) in size, and the number of elements constituting the weight vectors w p (n) is L.
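The forward and backward average of formulas (13) to (16) is sketched here via the standard exchange-matrix identity. The patent defines the backward matrix through the backward subspace vectors of formula (14); the two constructions coincide for the usual reversed-conjugate definition, which is assumed here.

```python
import numpy as np

def forward_backward(RF):
    """Forward-backward averaging (sketch of formulas (13)-(16)).

    Assumes the standard identity R_B = J R_F* J with the exchange
    (row-reversal) matrix J, so that R_FB = (R_F + R_B) / 2.
    """
    L = RF.shape[0]
    J = np.fliplr(np.eye(L))          # exchange matrix
    RB = J @ RF.conj() @ J            # backward subarray covariance
    return 0.5 * (RF + RB)

RF = np.array([[2.0, 1j], [-1j, 4.0]])
RFB = forward_backward(RF)
# R_FB is persymmetric: equal to its own reversed conjugate.
J = np.fliplr(np.eye(2))
print(np.allclose(RFB, J @ RFB.conj() @ J))  # True
```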
  • Plural (P) weight vectors w p calculated in the formula (12) or in the formula (17) according to the spatial averaging method are transferred to the peripheral information combiner 206 , similar to the case where the spatial averaging method is not employed.
  • the weight combiner 306 sums the weight vectors, and calculates w sum (Formula (7), Formula (8)).
  • the inner-product operator 307 performs the inner product calculation using the post-delay received data transferred via the bypass line 207 , and eventually, the result is transferred to the image processor 109 as the beamforming data y(n).
  • when the spatial averaging method is used, the number of the elements of the weight vectors w p is L. Therefore, in order to perform the final computation in the inner-product operator 307 , a block is additionally required to generate the vector g(n) having L components out of x(n) having K components (Formula (18)).
  • a dimensional compressor 308 may be arranged in the middle of the bypass line 207 , in the previous stage of the inner-product operator 307 , allowing the dimensional compressor 308 to generate the vector g(n) having L elements from x(n) having K elements. It is to be noted that the dimensional compressor 308 may be arranged within the peripheral information combiner 206 .
  • the beamforming output y(n) using the spatial averaging method is expressed by the formula (19).
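Formula (18) itself is not reproduced in the text; the sketch below assumes the common construction in subarray (spatially averaged) MVDR beamforming, namely averaging the (K−L+1) overlapping L-element subvectors, and then applies formula (19).

```python
import numpy as np

def compress(x, L):
    """Sketch of the dimensional compressor 308 (formula (18)).

    Assumption: g(n) is the average of the (K-L+1) overlapping
    L-element subvectors of x(n), the usual choice when the weight
    vector has been computed from a spatially averaged covariance.
    """
    K = x.shape[0]
    M = K - L + 1
    g = np.zeros(L, dtype=complex)
    for l in range(M):
        g += x[l:l + L]
    return g / M

x = np.arange(1, 6, dtype=complex)   # K = 5
g = compress(x, L=3)
print(g.real)                        # [2. 3. 4.]
# Beamforming output of formula (19): y(n) = w^H g(n) with L-element w.
w = np.ones(3, dtype=complex) / 3
print((w.conj() @ g).real)           # 3.0
```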
  • the matrix operator 300 performs the spatial averaging process using the subarray matrix, and produces an effect that restrains correlated noises being included in the received ultrasound signals. Therefore, the spatial averaging process using the subarray matrix is combined with the configuration for synthesizing plural adaptive weight vectors around the focus, thereby obtaining an ultrasound image that is much less influenced by noise.
  • the MVDR is taken as an example for explaining the adaptive beamforming method.
  • the algorithm applied in the adaptive weight operator 301 may be any other algorithm as far as it uses the spatial covariance matrix calculated in the matrix operator 300 . Any other methods may be applicable, such as the MMSE method, APES method, Eigenspace-MV method (ESMV, EIBMV) using the spatial covariance matrix and its eigenvalues and eigenvectors, ESPRIT method, and MUSIC method.
  • the third embodiment is different from the second embodiment in that the peripheral information combiner 206 is provided with plural inner-product operators, which multiply the post-delay received signals by the adaptive weights obtained by the peripheral information operator 205 for the respective plural points and generate pre-synthesis beamforming outputs for the respective plural adaptive weights, and with an output combiner.
  • the output combiner sums and combines the pre-synthesis beamforming outputs respectively as to the plural adaptive weights, and generates a beamforming output to be used for generating image data.
  • it is possible to arrange, between the peripheral information operator 205 and the plural inner-product operators, a fixed apodization multiplier configured to multiply the adaptive weights for the respective plural points obtained by the peripheral information operator 205 by fixed weights being predetermined respectively. It is alternatively possible to arrange, between the plural inner-product operators and the output combiner, the fixed apodization multiplier configured to multiply the pre-synthesis beamforming outputs for the respective adaptive weights by the fixed weights being predetermined respectively. It is further alternatively possible to arrange, between the plural multipliers and the plural adders constituting the plural inner-product operators, the fixed apodization multiplier configured to multiply the plural signals after multiplication by fixed weights being predetermined respectively.
  • the peripheral information combiner 206 combines the information items that are obtained by the peripheral information operator 205 from the received signals in one active channel, and generates a final beamforming output that is used by the image processor to generate an image.
  • the peripheral information combiner 206 sums the pre-synthesis beamforming outputs for the respective plural adaptive weights, obtained by the peripheral information operator 205 from the received signals in one active channel, and generates a final beamforming output that is used by the image processor 109 with respect to each active channel, so as to generate an image.
  • the ultrasound imaging apparatus according to the third embodiment of the present invention will be explained specifically.
  • the configuration of the peripheral information combiner 206 is different from the second embodiment, only the different part will be explained without tediously explaining the remaining parts.
  • the peripheral information combiner 206 combines the plural adaptive weight vectors w 1 (n), w 2 (n), and w 3 (n) by the weight combiner 306 , and with the use of thus obtained combined weight w sum (n), the beamforming summation process is performed on the post-delay received data x(n). Since this is a linear operation, processing of the weight combining and processing of the beamforming summation may be transposed.
  • a set of one multiplier 3071 and one adder 3072 configures one inner-product operator for each of the adaptive weight vectors. Preparing more than one such set configures the plural inner-product operators 307 .
  • the post-delay received data x(n) is inputted in the multipliers 3071 - 1 , 3071 - 2 , and 3071 - 3 , respectively via the bypass inputs 2071 , 2072 , and 2073 .
  • the output combiner 500 is arranged in the subsequent stage of the plural inner-product operators 307 .
  • the bypass input 2071 is subjected to the beamforming summation by using the weight vectors w 1 (n)
  • the bypass input 2072 is subjected to the beamforming summation by using the weight vectors w 2 (n)
  • the bypass input 2073 is subjected to the beamforming summation by using the weight vectors w 3 (n).
  • plural pre-synthesis beamforming outputs y 1 (n), y 2 (n), and y 3 (n) respectively in association with the steering vectors are calculated (Formula (20) in the following).
  • the fixed apodization multiplier 305 may be added.
  • the fixed apodization multiplier 305 is placed in the previous stage of the inner-product operator 307 .
  • the position of the fixed apodization multiplier 305 is not limited to the place as shown in FIG. 7 . Since the operation is a linear operation, it is obviously possible to add the fixed apodization multiplier 305 between the inner-product operator 307 and the output combiner 500 , or to add the fixed apodization multiplier 305 between the multipliers 3071 - 1 to 3071 - 3 and the adders 3072 - 1 to 3072 - 3 in the inner-product operator. In any of the above cases, the final combined beamforming output y sum (n) is expressed by the formula (22).
  • the second embodiment is different from the third embodiment in the point that in the second embodiment the weights are combined and thereafter the inner-product operation is performed, whereas in the third embodiment the beamforming outputs after the inner-product operation are combined.
  • since this is a linear operation, the final output is identical: y(n) in the formula (19), being the final beamforming output of the second embodiment, becomes a value equal to the final combined beamforming output y sum (n) of the third embodiment.
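The claimed equivalence is easy to check numerically; shapes and data are illustrative (W holds the P adaptive weights as rows, x is one K-channel post-delay snapshot).

```python
import numpy as np

# Second vs. third embodiment: combining the weights before the inner
# product, or combining the per-steering beamforming outputs afterwards,
# gives the same result, because the operation is linear.
rng = np.random.default_rng(2)
W = rng.standard_normal((3, 8)) + 1j * rng.standard_normal((3, 8))
x = rng.standard_normal(8) + 1j * rng.standard_normal(8)

w_sum = W.sum(axis=0)                 # second embodiment: weight combiner 306
y_second = w_sum.conj() @ x

y_p = W.conj() @ x                    # third embodiment: P inner products 307
y_third = y_p.sum()                   # output combiner 500

print(np.isclose(y_second, y_third))  # True
```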
  • the delay unit (delay circuit 204 ) generates post-delay received signals respectively for different plural receive focuses.
  • the peripheral information operator 205 and the peripheral information combiner 206 acquire information items as to the receive focus and plural points in the periphery of the receive focus, for each of the plural receive focuses, and generate a beamforming output.
  • plural delay units (delay circuits 204 ) are prepared, and each generates the post-delay received signal as to its own receive focus, the receive focuses being different among the plural delay units.
  • the peripheral information operator 205 and the peripheral information combiner 206 are arranged for each of the delay units, acquire information from the post-delay received signal generated by the corresponding delay unit for each receive focus, and generate the beamforming output.
  • the positions of the plural receive focuses of the received signals in the active channel at a certain point of time may partially overlap the positions of the plural receive focuses for the active channel at a different point of time.
  • the ultrasound imaging apparatus according to the fourth embodiment of the present invention will be specifically explained.
  • the configurations similar to the second and the third embodiments may not be tediously explained, and only the different portions will be explained.
  • the focus is positioned on the central axis (the normal line passing through the active channel center, with respect to the active channel surface) for every received signal.
  • plural points are set as the focuses at a certain depth for every received signal.
  • the focuses are set respectively on the points 601 and 603 on the axes in proximity to and displaced from the central axis 1600 , on both sides of the point 602 on the central axis, and information of the test subject 100 is collected.
  • configuring the distribution of the delay times of the received signals of the K elements 106 constituting the active channel in the form of the concaves 604 , 605 , and 606 , respectively centered on the focuses 601 , 602 , and 603 , allows the signals received at one time to focus on the plural focuses 601 , 602 , and 603 .
  • the post-delay data obtained as to each of the plural focuses 601 , 602 , 603 at a certain depth (at a certain snapshot time n) is subjected to arithmetic operation for collecting and combining the periphery information.
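One delay circuit 204 can be sketched as follows; the sound speed, sampling rate, array geometry, and the use of linear interpolation are all illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def apply_delays(u, t, element_x, focus, c=1540.0):
    """Sketch of one delay circuit 204: delay the K received signals so
    that echoes from one focus align (one concave delay profile).

    u: (K, N) samples on the time grid t; element_x: element positions
    along the array; focus: (x, z) point.
    """
    fx, fz = focus
    tau = np.hypot(element_x - fx, fz) / c   # receive path time per element
    tau = tau.max() - tau                    # delay early-arriving channels more
    # u_k(t - tau_k): each channel delayed by its own tau_k.
    return np.array([np.interp(t - tk, t, uk) for uk, tk in zip(u, tau)])

# Setting the three focuses 601-603 means running three such delay
# profiles on the same one-time received data:
K, N = 8, 256
t = np.arange(N) / 50e6                      # 50 MHz sampling (illustrative)
u = np.zeros((K, N)); u[:, 100] = 1.0        # toy echo on every channel
ex = (np.arange(K) - K / 2) * 0.3e-3         # 0.3 mm pitch (illustrative)
x1 = apply_delays(u, t, ex, focus=(-1e-3, 30e-3))
x2 = apply_delays(u, t, ex, focus=(0.0, 30e-3))
x3 = apply_delays(u, t, ex, focus=(1e-3, 30e-3))
print(x2.shape)  # (8, 256)
```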
  • FIG. 9 is a block diagram illustrating the receive beamformer 108 according to the fourth embodiment.
  • in order to set the plural focuses 601 , 602 , and 603 simultaneously at a certain snapshot time, the receive beamformer 108 is provided with the delay circuits 2041 , 2042 , and 2043 , the peripheral information operators 2051 , 2052 , and 2053 , and the peripheral information combiners 2061 , 2062 , and 2063 , the number of each being the same as the number of the focuses.
  • the configurations of the peripheral information operators 2051 , 2052 , and 2053 , and the peripheral information combiners 2061 , 2062 , and 2063 are the same as those of the peripheral information operator 205 and the peripheral information combiner 206 in the second embodiment. Therefore, the number of the beamforming outputs y(n) is equal to the number of the focuses, and in the example of FIG. 9 , there are three beamforming outputs, y 1 (n), y 2 (n), and y 3 (n).
  • the delay circuit 2042 forms a delay concave surface to obtain the focus 602 on the central axis 1600 of the active channel at a certain snapshot time n
  • the delay circuits 2041 and 2043 form the delay concave surfaces respectively for obtaining the focuses 601 and 603 on the axes displaced from the central axis 1600
  • the delay circuits generate the post-delay received data.
  • the peripheral information operator 2051 performs the operations explained in the second embodiment on the post-delay received data from the delay circuit 2041 , so as to calculate the weight vectors w1 1 (n), w1 2 (n), and w1 3 (n) respectively for the focus 601 and two points forming the steering angle 1602 with respect to the direction of the focus 601 .
  • the peripheral information combiner 2061 combines the weight vectors w1 1 (n), w1 2 (n), and w1 3 (n), and then uses thus combined weight vector w1 sum (n) to obtain the beamforming output y1(n).
  • the peripheral information operator 2052 performs operations on the post-delay received data from the delay circuit 2042 , and calculates the weight vectors w2 1 (n), w2 2 (n), and w2 3 (n) respectively for the focus 602 and the two points forming the steering angle 1604 with respect to the direction of the focus 602 .
  • the peripheral information combiner 2062 combines the weight vectors w2 1 (n), w2 2 (n), and w2 3 (n), so as to use thus combined weight vector w2 sum (n) to obtain the beamforming output y2(n).
  • the peripheral information operator 2053 performs operations on the post-delay received data from the delay circuit 2043 , and calculates the weight vectors w3 1 (n), w3 2 (n), and w3 3 (n) respectively for the focus 603 and the two points forming the steering angle 1606 with respect to the direction of the focus 603 .
  • the peripheral information combiner 2063 combines the weight vectors w3 1 (n), w3 2 (n), and w3 3 (n), and uses thus combined weight vector w3 sum (n) to obtain the beamforming output y3(n).
  • those beamforming outputs y1(n), y2(n), and y3(n) are transferred to the image processor 109 .
  • the processing above allows acquisition of ultrasound image data of the three focuses 601 , 602 , and 603 , by using just one-time transmit-receive signals to and from the active channel 201 , for instance.
  • This processing indicates that when the direction of the focus 602 is assumed as the main beam direction, the image may be generated also in the sub-beam directions (the directions of the focuses 601 and 603 ).
  • when the configuration of the fourth embodiment of the present invention is employed, i.e., acquiring signals of the object scattered around the focus, the configuration of the present invention is found to be applicable to techniques such as sub-beam processing and parallel beamforming. Therefore, effects such as reduction of signals failing to be picked up and reduction of noise are obtained.
  • this process is also applicable to the plural active channels 201 , 202 , and 203 .
  • the focus for the active channels 202 and 203 is set on the same point as the focus 602 for the active channel 201 , viewed in the slant direction from the active channels 202 and 203 , thereby newly acquiring data regarding the focus 602 . Therefore, for the plural active channels 201 , 202 , and 203 , data items acquired by viewing the identical point 602 from different directions are obtained respectively, and then the obtained data items are combined and overlapped with one another.
  • This process is referred to as synthetic aperture (combined aperture) process. That is, by using the configuration of the fourth embodiment, the technique of the present invention for acquiring signals of the object, scattered around the focus, is applicable to the synthetic aperture process.
  • an effect of the present invention, that is, reduction of the wavefront distortion, may be produced also in the off-axis beamforming technique focusing on an off-axis point, in the parallel beamforming, and in the synthetic aperture beamforming.
  • peripheral information operators 2051 to 2053 and the peripheral information combiners 2061 to 2063 of the present embodiment as shown in FIG. 9 are the same as those illustrated in FIG. 3 .
  • any configuration may be applicable as far as it is within the scope of the present invention.
  • the peripheral information combiner 206 of the third embodiment as shown in FIG. 7 may be employed. It is alternatively possible to employ the configurations of the peripheral information operator 205 and the peripheral information combiner 206 of the fifth embodiment to the eighth embodiment, which will be described in the following.
  • a storing unit (temporary storage 800 ) is arranged between the plural inner-product operators of the third embodiment and the output combiner, the storage being configured to store the pre-synthesis beamforming outputs respectively for the plural adaptive weights generated by the respective inner-product operators.
  • the configuration of the fifth embodiment is the same as that of the third embodiment, but differs in that the temporary storage 800 is provided between the plural inner-product operators 307 and the output combiner 500 in the peripheral information combiner 206 .
  • the signals to be subjected to the beamforming are flowing sequentially with time, from the time being received at the active channel until reaching the image processor 109 . Therefore, the synthesizing process in the output combiner 500 is limited to the process for synthesizing the beamforming outputs y 1 (n), y 2 (n), and y 3 (n), respectively obtained by the plural inner-product operators 307 as to the plural steering vectors of each data, in the raster currently under the beamforming process and at the current snapshot time n.
  • the temporary storage 800 is a memory configured to store the beamforming outputs y 1 (n), y 2 (n), and y 3 (n) immediately before the output combiner 500 .
  • the temporary storage 800 stores the beamforming outputs y 1 (n), y 2 (n), and y 3 (n) obtained by the plural inner-product operators 307 , every updating at the snapshot n, or every updating of the raster.
  • the output combiner 500 is allowed to read and combine the beamforming outputs y 1 (n), y 2 (n), and y 3 (n) stored in the temporary storage 800 , and it is possible to combine the beamforming outputs at various time samples within an identical raster, or combine the beamforming outputs across different rasters.
  • those beamforming outputs y 1 (n), y 2 (n), and y 3 (n) are results of the beamforming using the weight vectors w 1 (n), w 2 (n), and w 3 (n) obtained as to different steering vectors at every snapshot time n. Therefore, the beamforming outputs y 1 (n), y 2 (n), and y 3 (n) indicate beamforming outputs in the respective steering vector directions. This allows the output combiner 500 to select any combination of the beamforming outputs among different rasters and among different steering vector directions, and combine the selected beamforming outputs. The beamforming outputs to be selected among the different steering vector directions may be predetermined, or set by the operator.
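A minimal sketch of this storage-then-select scheme, with assumed names mirroring the temporary storage 800 and the output combiner 500: outputs are keyed by steering index p and snapshot time n, and the combiner sums any selection of them.

```python
# Sketch only: the class and function names are illustrative, not from the
# patent. Each write stores the output y_p(n) of one inner-product operator;
# the combiner later sums an arbitrary selection across steering directions
# and snapshot times.
class TemporaryStorage:
    def __init__(self):
        self.store = {}                 # (steering_index p, snapshot n) -> y_p(n)

    def write(self, p, n, y):
        self.store[(p, n)] = y          # updated at every snapshot n

    def read(self, selection):
        return [self.store[key] for key in selection]

def output_combiner(storage, selection):
    """Combine beamforming outputs selected as (p, n) pairs; a plain
    coherent sum is assumed for the combining rule."""
    return sum(storage.read(selection))
```

The selection list plays the role of the predetermined (or operator-set) choice of steering directions described above.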
  • the receive beamformer 108 may also be provided with an angle adjuster 502 and a memory output adjuster 503 .
  • the angle adjuster 502 takes control of setting the steering vector directions, the number of steering vectors, and the density thereof, in the weight vector operators 3021 , 3022 , and 3023 in the adaptive weight operator 301 .
  • the memory output adjuster 503 designates the timing (clock) for reading and transferring the beamforming output from the temporary storage 800 to the output combiner 500 , and assigns in the memory, the address of the beamforming output to be read out. It is further possible to provide an adjuster 501 on an upper level, configured to adjust the parameters of both the angle adjuster 502 and the memory output adjuster 503 in conjunction with each other.
  • the peripheral information combiner 206 provided with the temporary storage 800 of the present embodiment may substitute for the peripheral information combiners 2061 , 2062 , 2063 of the fourth embodiment.
  • This configuration allows synthesis of the beamforming outputs of different time samples as to each of the plural focuses in an identical raster, or synthesis of the beamforming outputs across different rasters, and the like.
  • the output combiner 500 acquires and sums the pre-synthesis beamforming outputs different in the time direction, from the storing unit (temporary storage 800 ) and the plural inner-product operators, so as to generate a beamforming output.
  • the ultrasound imaging apparatus according to the sixth embodiment of the present invention will be explained specifically. Similar to the fifth embodiment, the ultrasound imaging apparatus of the present embodiment is provided with the temporary storage 800 , but the internal configuration of the temporary storage 800 is different from the fifth embodiment.
  • the temporary storage 800 is provided with a memory 900 , and a bypass line 901 for the beamforming outputs (hereinafter, also referred to as “post-beamforming data”) y 1 (n) to y P (n).
  • the memory 900 has a storage area (address) for storing (P × N) post-beamforming data items, that is, post-beamforming data items y 1 (n) to y P (n) for every N snapshots in the time direction, in association with each of the P steering vectors.
  • P memory areas 9001 , 9002 , and 9003 are prepared in the memory 900 , each including N addresses, as schematically illustrated. Since other configurations are the same as those of the fifth embodiment, tedious explanations will not be provided.
  • the temporary storage 800 receives the post-beamforming data items y 1 (n) to y P (n) from the plural inner-product operators 307 , at a certain snapshot time n. Those post-beamforming data items y 1 (n) to y P (n) are transferred to both the memory 900 and the bypass line 901 .
  • the post-beamforming data items y 1 (n) to y P (n) are stored in a predetermined write address (WA) in the memory areas 9001 , 9002 , and 9003 , respectively. Simultaneously, the previously stored post-beamforming data items y 1 (n−1) to y P (n−1) are read out from the read address (RA).
  • the beamforming outputs of the plural steering vectors 1 to P are outputted, combining the information items at the times (n−1) . . . (n−i), i.e., being obtained by going back from y 1 (n−1) to y P (n−1) . . . to y 1 (n−i) to y P (n−i), with the current post-beamforming data items y 1 (n) to y P (n), and this allows the output combiner in the subsequent stage to combine all of those outputs together.
  • the information stored in the memory 900 is configured to be overwritten every time the raster is updated, so as to overwrite the information sequentially by the post-beamforming data in a new raster.
  • this configuration allows the post-beamforming data items y 1 (n) to y P (n) . . . y 1 (n−i) to y P (n−i) at the plural sample times from n to (n−i) to be combined, and it is possible to obtain the beamforming outputs 1703 , 1704 , and 1705 , being acquired by synthesizing the beamforming results of plural time samples, respectively in the same rasters 1501 , 1502 , and 1503 .
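The P-by-N memory with a bypass for the current snapshot can be sketched as a small ring buffer; the class name and layout are assumptions, not the patent's design. At snapshot n the current length-P vector is written, and the combiner sums it (via the bypass) with the i previous snapshots of the same raster.

```python
import numpy as np

# Sketch of the memory 900 (layout assumed): P areas of N addresses each,
# holding y_1..y_P for the last N snapshots of the current raster. The
# current outputs also bypass the memory (the bypass line 901).
class Memory900:
    def __init__(self, P, N):
        self.buf = np.zeros((P, N))

    def write(self, n, y_current):
        # y_current: length-P vector of post-beamforming data at snapshot n
        self.buf[:, n % self.buf.shape[1]] = y_current

    def combine_back(self, n, i, y_current):
        """Sum current outputs (via the bypass) with snapshots n-1 .. n-i."""
        out = np.sum(y_current)
        for k in range(1, i + 1):
            out += np.sum(self.buf[:, (n - k) % self.buf.shape[1]])
        return out
```

Overwriting modulo N mimics the behavior of the memory being rewritten as the raster is updated.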
  • This allows collecting the data items 1701 and 1702 inhomogeneously scattered in the time direction, and compensating for the image quality deterioration of the ultrasound image in the time direction (ultrasound wave propagating direction).
  • the read address (RA) may be determined in advance, or an output adjuster 503 may be provided and configured to designate the read address.
  • the output adjuster 503 may transfer to the temporary storage 800 , for instance, the information regarding which data is outputted, out of the current post-beamforming data items y 1 (n) to y P (n) and the previous data held in the memory, and a signal for controlling the output timing and the output data address.
  • the plural inner-product operators generate the pre-synthesis beamforming outputs respectively for the plural adaptive weights for each active channel
  • the output combiner 500 acquires from the storing unit (temporary storage 800 ) and plural inner-product operators, the pre-synthesis beamforming outputs generated for different active channels and sums the outputs, so as to generate the beamforming output.
  • the ultrasound imaging apparatus of the seventh embodiment is provided with the memories 900 - 1 , 900 - 2 , and 900 - 3 within the temporary storage 800 for storing the beamforming outputs (post-beamforming data) y 1 (n) to y P (n) in association with P steering vectors for the three rasters.
  • the structure of each memory 900 - 1 , 900 - 2 , and 900 - 3 is the same as the memory 900 of the sixth embodiment.
  • the post-beamforming data items y p (r, n) at the snapshot time n of the r th raster, y p (r−1, n) of the (r−1) th raster, and y p (r−2, n) of the (r−2) th raster are respectively stored in the memories 900 - 1 , 900 - 2 , and 900 - 3 .
  • N such data items are stored for each raster, N being the number of snapshots in the time direction.
  • the current post-beamforming data items y 1 (r, n), y 2 (r, n), and y 3 (r, n) at the snapshot time n are written into the write address WA as shown in the figure, and simultaneously those data items are also outputted to the output combiner 500 via the bypass line 901 .
  • the post-beamforming data at one snapshot time n corresponds to three data items y 1 (n) to y 3 (n) for each raster.
  • Other configurations are the same as those of the sixth embodiment, and tedious explanations will not be provided.
  • the post-beamforming data items y 1 (r,1), y 2 (r,1) . . . y P (r,1) to y 1 (r,N), y 2 (r,N) . . . y P (r,N) of the current scanning raster (the r th raster) at the snapshot times 1 to N can be held in the memory 900 - 1 , and further, the post-beamforming data items of the (r−1) th raster and the (r−2) th raster are also stored respectively in the memory 900 - 2 and in the memory 900 - 3 .
  • the post-beamforming data items relating to various rasters, snapshot times, and steering vectors, are respectively stored in the memories 900 - 1 to 900 - 3 . Accordingly, at least one desired post-beamforming data item is selected from those post-beamforming data items, and outputted to the output combiner 500 .
  • the data y 3 (r−2, n) at the time n, at the steering angle θ 3 is outputted from the (r−2) th raster.
  • the formula (23) expresses the combining process performed by the output combiner 500 .
  • post-beamforming data items acquired from plural rasters are combined with regard to an identical focus 1500 , and compared to the case of using the post-beamforming data obtained from one steering vector in a single raster, it is possible to achieve higher resolution of the point image, and to reduce noise, owing to the combining of plural rasters.
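The cross-raster combining can be sketched as a selection-and-sum over per-raster memories; formula (23) is not reproduced in this excerpt, so a plain coherent sum over the selected (raster, steering, snapshot) entries is assumed here, and the function name is illustrative.

```python
import numpy as np

# data[r] is a (P, N) array holding the post-beamforming outputs of raster r
# (the memories 900-1 .. 900-3), indexed by steering vector p and snapshot n.
def combine_across_rasters(data, selections):
    """Sum the post-beamforming data items selected as (r, p, n) triples,
    e.g. entries from different rasters that view the identical focus."""
    return sum(data[r][p, n] for (r, p, n) in selections)
```

Selecting entries such as y 3 (r−2, n) from a previous raster corresponds to a triple like (2, 2, n) in this indexing.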
  • the post-beamforming data items stored in the memories 900 - 1 , 900 - 2 , and 900 - 3 are overwritten every time the raster is updated, and the latest post-beamforming data items for the three rasters are constantly stored in the memories 900 - 1 , 900 - 2 , and 900 - 3 , respectively.
  • the output adjuster 503 may be provided. In this case, the output adjuster 503 transfers to the temporary storage 800 , a signal to control the data address to be outputted and its timing to the output combiner 500 .
  • the ultrasound imaging apparatus according to the eighth embodiment of the present invention will be explained.
  • In the ultrasound imaging apparatus of the eighth embodiment, (m+1) memories 900 - 1 , 900 - 2 . . . 900 -(m+1) are arranged within the temporary storage 800 .
  • Each of the memories 900 - 1 to 900 -(m+1) has the same configuration as the memory 900 - 1 of the seventh embodiment.
  • At least one combination is allowed to be selected from all the combinations of the post-beamforming data with regard to the three dimensions: the plural steering angle directions 1 to P in an identical raster, the snapshot times 1 to N, and the raster shifting directions 1 to (m+1); the selected combination is outputted to the output combiner 500 for synthesis.
  • the outputs 1106 , 1107 , and 1108 to be combined are selected from P × N × (m+1) post-beamforming data items, and outputted to the output combiner 500 so as to be combined. It is to be noted that the configuration being the same as the seventh embodiment will not be explained tediously.
  • the beamforming data of the identical focus 1500 is obtained from different rasters 1501 , 1502 , and 1503 , and those data items are combined, thereby acquiring the data of the focus 1500 repeatedly, and enhancing the precision of data.
  • the post-beamforming data items as to the points 1701 to 1704 around the focus 1500 are also selected and outputted, and then the output combiner 500 is allowed to combine those data items. As illustrated in FIG. 5( c ), it is further possible to combine the information (post-beamforming data) of the point spread function 2204 around the focus 1500 .
  • any point in the time direction may also be selected for combining the post-beamforming data at that time; this allows collection of the data items 1701 and 1702 scattered inhomogeneously in the time direction, and it is also possible to compensate for the deterioration of the image quality of the ultrasound image in the time direction (in the ultrasound propagating direction).
  • the output adjuster 503 may be provided, so as to control the data address to be outputted and its timing to the output combiner 500 .
  • the peripheral information operator 205 generates pre-synthesis beamforming outputs as to plural points being set two-dimensionally.
  • the storing unit (temporary storage) 800 stores those pre-synthesis beamforming outputs.
  • the ultrasound imaging apparatus of the ninth embodiment has a configuration that Q temporary storages 800 of the eighth embodiment as shown in FIG. 16 are arranged in parallel.
  • a probe in which the elements 106 are two dimensionally arranged is employed as the probe 101 .
  • a two-dimensional array probe, a two-dimensional array convex probe, a probe with a bundle of short-axes, or the like may be used.
  • the configuration similar to the eighth embodiment will not be explained tediously.
  • the adaptive weight operator 301 varies the steering angle being calculated, in the two-dimensional direction where the elements 106 of the probe 101 are arranged, and obtains adaptive weight vectors as to P ⁇ Q steering angles.
  • the post-beamforming data items obtained as to the adaptive weight vectors for the respective steering vectors are stored at every snapshot time. With this configuration, it is possible to collect the peripheral information (post-beamforming data) three-dimensionally.
  • the steering vectors a p expressed by the aforementioned formula (4) are extended in the two-dimensional direction, and the present invention allows the steering vector direction to be selected in the form of any combination of ( θ p , φ q ), with 1 ≤ p ≤ P and 1 ≤ q ≤ Q in the two-dimensional angles.
  • the formula (24) expresses the steering vector a (p, q) for this case.
  • FIG. 19 illustrates the operations of the signal processing for the ninth embodiment.
  • P ⁇ Q steering vectors a (p, q) associated with the angles ( ⁇ p , ⁇ q ) in the two-dimensional direction may be set on the sphere.
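Formula (24) itself is not reproduced in this excerpt. The sketch below assumes the common narrowband plane-wave form of a steering vector for a two-dimensional element array: each element at position (x_k, y_k) receives the phase of a plane wave arriving from the direction (θ_p, φ_q). The function name and the choice of wavelength parameter are illustrative.

```python
import numpy as np

def steering_vector_2d(xy, theta, phi, wavelength):
    """Plane-wave steering vector a(p, q) for a 2-D array (assumed form).
    xy: (K, 2) element positions [m]; returns K complex weights."""
    k0 = 2 * np.pi / wavelength
    ux = np.sin(theta) * np.cos(phi)      # direction cosines of (theta, phi)
    uy = np.sin(theta) * np.sin(phi)
    return np.exp(1j * k0 * (xy[:, 0] * ux + xy[:, 1] * uy))
```

Sweeping theta over P values and phi over Q values yields the P × Q steering vectors a (p, q) on the sphere described above.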
  • FIG. 20 illustrates an example in which the adjuster 501 of FIG. 11 sets the steering vectors.
  • the steering vector direction may be determined by a predetermined fixed angle, or the direction may be changed according to the control by the adjuster 501 .
  • FIG. 20( a ) to FIG. 20( f ) illustrate two examples of the steering angle, as the embodiment thereof.
  • FIG. 20( a ) to FIG. 20( c ) illustrate that the steering angle 2104 is fixed.
  • as the depth of the imaging target point becomes deeper, since the steering angle 2104 is fixed and the steering vectors are kept the same, the range allowing the signals to be collected around the imaging target point (the point spread function 2204 in FIG. 5( a )) becomes larger.
  • FIG. 20( d ) to FIG. 20( f ) illustrate an example in which the steering angle becomes smaller as the imaging target becomes deeper, so that the horizontal distance 2108 (spread of imaging target point), indicating the distance from the receive focus to the point expected by the steering vector, is kept constant.
  • the steering angle is made to vary for each depth (snapshot time n) of the imaging target point, thereby rendering the spread of the imaging target point (point spread function 2204 ) to be constant irrespective of the depth of the imaging target.
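Keeping the horizontal spread 2108 constant implies a simple geometric relation between depth and steering angle, which can be written as θ = arctan(d / depth); this one-liner is a sketch of that relation, not the patent's control logic.

```python
import math

def steering_angle_for_depth(lateral_spread, depth):
    """Steering angle [rad] that keeps a constant lateral spread (the
    horizontal distance 2108) at a given receive-focus depth; both in the
    same length unit. Assumed geometric model: theta = atan(d / depth)."""
    return math.atan(lateral_spread / depth)
```

Because atan is monotonic, the angle shrinks as the imaging target gets deeper, matching the behavior shown in FIG. 20( d ) to FIG. 20( f ).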
  • the adjuster 501 is allowed to set the number of the steering vectors within the steering angle 2109 , or the angle of adjacent steering vectors.
  • the adjuster 501 stores data in advance in such a manner as storing plural types of combinations of the steering angle and the number of the vectors, and depending on the focal position (the position of the imaging target point), the imaging portion of the test subject 100 , and the imaging sequence, the adjuster 501 selects a suitable combination of the steering angle and the number of the vectors, so as to set the selected combination in the angle adjuster 502 and in the output adjuster 503 . It is alternatively possible that the operator selects via the console 110 , a combination of the steering angle and the number of vectors, or the operator inputs in the adjuster 501 via the console 110 , any steering angle and any number of the vectors.
  • a memory configured to store in advance, a distribution of the spread angles of the steering vectors directed to plural points, in association with the position of the receive focus within the image that is generated by the image processor.
  • the adjuster uses the steering vectors having the spread angles read out from the memory, in association with the position of the receive focus, so as to obtain the adaptive weights.
  • the spread angle of the steering vector stored in the memory is configured to be small for the receive focus being close to the send focus upon transmitting ultrasound wave signals to the test subject, whereas it is configured to be large for the receive focus being distant from the send focus.
  • the spread angle of the steering vector is configured to be smaller at the edge part of the image, compared to the center part of the image.
  • the adjuster 501 applies the steering angles different for each imaging target point within one image, depending on the type of the ultrasound probe 101 , imaging conditions, and a type of the imaging sequence, and then an image is generated.
  • various sets of steering angles are applied to generate an image, in association with the probe, such as a linear probe, a convex probe, a sector probe, a two-dimensional array probe, a mechanical 3D imaging probe, and the like.
  • various sets of steering angles are applied to generate an image, for example, the steering angles being different depending on the imaging conditions such as the send focus, receive focus, send/receive frequency, frame rate, parallel beamforming count, tissue harmonics, and contrast, and the imaging sequence.
  • the focus upon sending is set on at least one fixed position when one ultrasound image is taken, and the receive beamformer 108 varies the receive focus, thereby obtaining an image for each imaging target point within the imaging area. Therefore, the focus 2210 upon sending is determined at a position within the image.
  • signals of the object spread in the elliptic region 2205 centering on the receive focus, due to the reflection on the receive focus, and the size of the region varies depending on the depth (time) of the receive focus. As illustrated in FIG. 22( a ), the size of the elliptic region 2205 is smaller as it is closer to the send focus 2210 , and it becomes larger with distance from the send focus 2210 .
  • When the beamforming is performed using the steering vectors with a fixed steering angle (spread angle) 2211 , there may occur a mismatch between the elliptic region 2205 indicating the spreading of the signals of the object, and the size of the point spread function 2204 allowing signal acquisition, determined by the spread angles of the steering vectors.
  • the aforementioned mismatch, and the point spread function 2204 extending off the image, may result in intensity unevenness showing up in a B-mode image.
  • In FIG. 22( b ) the structure around the send focus 2210 is depicted with homogeneous intensity, but in a shallow part, a deep part, and the image edge part, the image intensity may be deteriorated and unevenness may occur, due to the mismatch between the elliptic region 2205 indicating the spread of the signals of the object and the point spread function 2204 , and due to the point spread function 2204 extending off the image.
  • the steering spread angle is varied in accordance with the signal spreading region of the object, so that the point spread function 2204 upon beamforming matches the elliptic region 2205 .
  • a steering angle is used which restrains spreading of the point spread function 2204 determined by the spread angles of the steering vectors, so that the point spread function may not expand unnecessarily out of the imaging target.
  • the adjuster 501 is provided with the memory 2215 as shown in FIG. 21 , in the eleventh embodiment.
  • the memory 2215 stores in advance, data in the form of table, or the like, defining a distribution of the spread angles of the steering vectors (steering angles), with regard to the scanning direction and the depth (time) direction of the raster.
  • the distribution of the steering angles is defined so as to enhance the image intensity and reduce the unevenness, considering that the variation of the size of the elliptic region 2205 indicating the spread of the signals of the object, depends on the type of the probe 101 and the imaging sequence (a focus upon sending, and the like).
  • the distribution of the steering angles is prepared for each combination of the type of the probe 101 and the imaging sequence.
  • the distribution of the steering angles for enhancing the image intensity and reducing unevenness may be obtained by calculations (including a computer simulation), experiments, and the like, performed in advance.
  • the adjuster 501 receives from the controller 111 , information regarding the type of the probe 101 and the imaging sequence, and reads out from the memory 2215 , data defining the distribution of the spread angles of the steering vectors in association with the information. According to the data being read out, the adjuster 501 sets the spread angles (steering angles) of the steering vectors in the angle adjuster 502 and in the output adjuster 503 in the receive beamformer 108 , for each scanning direction and depth (time) direction of the raster.
  • this allows the point spread function 2204 upon beamforming to coincide with the elliptic region 2205 indicating the spread of the signals of the object. Further, in the image edge part, the point spread function 2204 is controlled not to expand unnecessarily out of the imaging target. Therefore, as shown in FIG. 22( d ), it is possible to create an image not including any intensity unevenness, and it is effective for enhancing the image quality.
  • the configuration of the receive beamformer 108 is not limited to the configuration of FIG. 21 , and the receive beamformer 108 performing the receive beamforming of the first to the ninth embodiment may be employed.
  • the send focus is fixed to one or plural points. If a dynamic focus for sending is used so as to change the send focus in the depth direction, there may occur unevenness in the elliptic region 2205 indicating the spread of signals of the object. Therefore, the tenth embodiment is applicable.
  • FIG. 22( a ) to FIG. 22( d ) illustrate an image obtained by using the convex probe, but the tenth embodiment may be applicable to any other type of probe such as a linear probe, a sector probe, a 2D array probe, a mechanical 3D probe, and any other raster scanning method.
  • the eleventh embodiment is directed to the configuration that determines the distribution of the steering angles in advance, but in the twelfth embodiment, the distribution is obtained from the distribution of signal strength or the distribution of intensity in the B-mode image.
  • the receive beamforming is performed by using thus obtained distribution of the steering angles.
  • the adjuster converts the distribution of the intensity or the signal strength of the image, into a distribution of the spread angles of steering vectors, through the use of a predetermined function, the image having been obtained with the setting of steering vectors using the distribution of the spread angles of the steering vectors being stored in the memory.
  • a difference between the distribution of the spread angles of the steering vectors being stored in the memory and the distribution of the spread angles of the steering vectors obtained through the use of the function, is used to correct either the distribution of the spread angles of the steering vectors being stored in the memory, or the distribution of signal strength or the intensity distribution.
  • the beamforming of any of the first to the ninth embodiments is performed initially, with the use of the fixed steering angle as shown in FIG. 22( a ), and the B-mode image as shown in FIG. 22( b ) is obtained.
  • the adjuster 501 obtains the distribution of intensity or the distribution of the post-beamforming signal strength 230 in the imaging target, as illustrated in FIG. 23( a ).
  • the intensity distribution or the signal strength distribution 230 being obtained is fitted to a predetermined function for converting the distribution of the intensity or the signal strength into the steering angle distribution, so as to obtain the steering angle distribution 231 as shown in FIG. 23( b ). Then, the receive beamforming of the eleventh embodiment is performed as shown in FIG. 22( c ), thereby actively improving the intensity unevenness in the image.
  • the function for converting the distribution of the intensity or signal strength into the steering angle distribution is obtained in advance by experiments, or computations (including a computer simulation), and the memory 2215 , or the like, stores the function in the form of database like a table, or in the form of mathematical formula.
  • the receive beamforming is performed as shown in FIG. 22( c ), and as to thus obtained B-mode image of FIG. 22( d ), the intensity distribution or the post-beamforming signal strength distribution 230 within the imaging target may be obtained as illustrated in FIG. 23( a ).
  • the obtained intensity distribution or the signal strength distribution is converted into the steering angle distribution 231 through the use of the aforementioned function.
  • the obtained steering angle distribution 231 is compared with the steering angle distribution 232 used for generating the B-mode image of FIG. 22( d ), and a difference therebetween is obtained and the difference is reflected on the beamforming process.
  • possible methods include a method that modifies the steering angle distribution stored in the memory 2215 in accordance with the difference and performs the receive beamforming again with the use of the modified distribution of steering angles, and a method that modifies the intensity distribution or the signal strength distribution of the B-mode image, in accordance with the obtained difference.
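The feedback loop of the twelfth embodiment can be sketched as follows; the function names, the correction `gain`, and the form of the conversion function are assumptions (the patent only specifies that such a function is obtained in advance by experiments or computations).

```python
import numpy as np

def intensity_to_angles(intensity, convert):
    """Map a measured intensity (or signal strength) distribution to a
    steering-angle distribution via a predetermined conversion function."""
    return convert(intensity)

def corrected_angles(stored_angles, intensity, convert, gain=1.0):
    """Correct the steering-angle distribution held in memory (e.g. the
    memory 2215) by the difference between it and the distribution derived
    from the current image; gain scales the feedback step (assumed knob)."""
    derived = intensity_to_angles(intensity, convert)
    return stored_angles + gain * (derived - stored_angles)
```

Iterating this update with newly beamformed images corresponds to the "feedback operation" mentioned above; with gain = 1 it simply replaces the stored distribution by the derived one.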
  • the steering angle distribution may be obtained actively, or modified with the feedback operation.
  • This process is applicable not only to the intensity unevenness when the adaptive beamformer is utilized, but also to the intensity unevenness due to energy inhomogeneity of acoustic signals that occurs also in the beamforming according to a conventional DAS (Delay And Sum: delay addition process). Therefore, this may contribute to inherent enhancement of an ultrasound B-mode image.
  • FIG. 24 illustrates one example of the console 110 of the present invention.
  • the console 110 is provided with an operating portion configured to change the number of the steering vectors and the density thereof, and the number of the beamforming results to be combined, in the angle direction, in the time direction, in the raster direction, and the like.
  • switches 1303 are provided on the console 110 , as the operating portion configured to switch the mode of the steering angle explained in each of the aforementioned embodiments, and to switch between combining the beamforming results and not combining the beamforming results.
  • the operator is allowed to change various parameters for the beamforming and synthesizing process, while viewing the actual ultrasound image, and conduct diagnosis under conditions being optimum for each test subject 100 .
  • the set values may be displayed in a part of the display area 1304 on the monitor 103 .
  • a mode-switching part in the console may perform switching, in association with the various types of the probe, imaging conditions, and imaging sequences.
  • the switching part may switch to the mode that applies a set of various steering angles, being different depending on the probe types, such as a linear probe, a convex probe, a sector probe, a two-dimensional array probe, and a mechanical 3D imaging probe, so as to generate an image.
  • the switching part may switch to the mode that applies a set of various steering angles, being different depending on the imaging conditions, such as the send focus and receive focus, send/receive frequency, frame rate, parallel beamforming count, tissue harmonics, and contrast, and the imaging sequence, so as to generate an image.
  • FIG. 25 is a perspective view illustrating another specific example of the console 110 and the monitor 103 of the ultrasound diagnostic apparatus according to the present invention.
  • a manually operated portion 1403 (e.g., a mouse) is provided on the console 110 .
  • the operator is allowed to generate the image 1402 after the adaptive processing of the present invention is applied thereto, only as to a specific ROI 1401 .
  • the image 1402 after the adaptive processing of the present invention is applied may be displayed in a different area on the monitor 103 .
  • the operator sets the arithmetic parameters respectively of the peripheral information operator 205 and the peripheral information combiner 206 , through the use of the operating portion 1403 that is manipulated by the operator.
  • the adjuster 501 as shown in FIG. 11 receives the arithmetic parameters being set via the operating portion 1403 , outputs control signals to the angle adjuster 502 and the combined output adjuster 503 , and adjusts the arithmetic parameters in the adaptive weight operator 301 and the output data in the temporary storage 800 .
  • FIG. 26 shows ultrasound images (B-mode images) obtained by simulation for illustrating effects of the embodiments of the present invention.
  • six point scatterers are assumed as the test subject 100 , and the ultrasound images thereof are obtained by a computer.
  • FIG. 26( a ) illustrates the image 2300 and the image 2303 where no wavefront distortion occurs in the ultrasound beam.
  • the image 2300 is obtained by the receive beamforming according to the conventional delay-and-sum method, and the image 2303 is obtained by the conventional adaptive beamforming.
  • FIG. 26( b ) illustrates the images 2301 , 2302 , 2304 , and 2305 where wavefront distortion occurs in the ultrasound beam.
  • the image 2301 is an image obtained by the receive beamforming according to the conventional delay-and-sum method
  • the image 2304 is an image obtained by the conventional adaptive beamforming.
  • the image 2302 is an image obtained according to the delay-and-sum method and steering vectors being combined similar to the method of the second or the third embodiment.
  • the image 2305 is an image obtained by performing the adaptive beamforming and combining the steering vectors according to the method of the second or the third embodiment.
  • FIG. 27 is a graph showing the intensity distribution of one point scatterer and the surrounding thereof in the images 2301 , 2304 , and 2305 of FIG. 26 .
  • the horizontal axis of the graph in FIG. 27 indicates the azimuth direction of the receive array (scanning direction of the raster), and the vertical axis indicates the intensity of the image.
  • the zero position on the horizontal axis corresponds to the position of the point scatterer.
  • the dotted line 2401 indicates the intensity distribution of the image 2301 according to the delay-and-sum method, and the dashed-dotted line 2402 indicates the intensity distribution of the image 2304 according to the conventional adaptive beamformer.
  • the solid line 2403 indicates the intensity distribution of the image 2305 of the adaptive beamformer to which the steering-vector synthesis of the second or the third embodiment is applied.
  • the intensity peak position in the intensity distribution of the image 2301 according to the delay-and-sum method is displaced from the real position of the object.
  • the intensity peak position in the intensity distribution of the image 2304 according to the conventional adaptive beamformer becomes closer to the real position of the object, but the magnitude of the intensity is lowered.
  • in the intensity distribution of the image 2305 (the solid line 2403), the signal strength is maintained at a level equivalent to that of the delay-and-sum method (the dotted line 2401), and the peak intensity is located at the real position of the object (zero on the horizontal axis); this confirms the effect produced by compensating for the wavefront distortion.
  • as described above, the adaptive beamformer is provided with the ability to avoid missing information due to overly sharp beam directivity, and with the further ability to cancel and reduce unnecessary correlative noise from the medium around the focus.
  • accordingly, the present invention provides an ultrasound imaging apparatus that is robust against wavefront distortion caused by sound-velocity inhomogeneity within a living body, the distribution of scatterers, and the influence of body motion.
  • the present invention further allows a point image made up of plural perspective angles to be formed at high resolution, without performing the focus calculation (delay calculation) one by one for each target point in the test subject 100. Therefore, according to the present invention, it is possible to provide the ultrasound imaging apparatus with an adaptive beamformer that requires a relatively small processing load to solve the problems above.
  • the first embodiment relates to a configuration of the apparatus provided with the peripheral information operator and the peripheral information combiner.
  • the ultrasound imaging apparatus is provided with plural elements configured to receive ultrasound signals from a test subject, a delay unit configured to delay each of the signals received by the plural elements in association with a predetermined position of the receive focus, and generate post-delay received signals
  • the peripheral information operator configured to acquire from the post-delay received signals, information items as to plural points on the receive focus and in the region surrounding the receive focus
  • the peripheral information combiner configured to combine the information items respectively acquired as to the plural points and generate a beamforming output by using the information items being combined
  • an image processor configured to generate image data by using the beamforming output.
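The delay unit in item (1) above produces post-delay received signals by aligning each element's echo to a receive focus before any peripheral-information processing. As a rough illustration only — the geometry, sampling rate, and all names here are assumptions of this sketch, not the patent's implementation — conventional delay-and-sum receive focusing looks like this:

```python
import math

# Assumed constants, not taken from the patent.
SOUND_SPEED = 1540.0   # m/s, typical soft-tissue value
FS = 50e6              # sampling rate in Hz

def element_delays(element_xs, focus):
    """Per-element delays (in samples) toward a receive focus (x, z)."""
    fx, fz = focus
    dists = [math.hypot(fx - ex, fz) for ex in element_xs]
    ref = min(dists)  # align to the closest element
    return [round((d - ref) / SOUND_SPEED * FS) for d in dists]

def delay_and_sum(channel_data, delays):
    """Shift each channel by its delay, then sum the aligned samples."""
    n = min(len(ch) - d for ch, d in zip(channel_data, delays))
    return [sum(ch[d + t] for ch, d in zip(channel_data, delays))
            for t in range(n)]
```

The peripheral information operator of item (1) would then work on the post-delay (aligned, pre-sum) channels rather than only on the summed output.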
  • the fourth embodiment relates to a configuration for performing synthetic aperture in the apparatus as described in the aforementioned item (1).
  • the fourth embodiment relates to the ultrasound imaging apparatus where the delay unit generates the post-delay received signals respectively as to plural different receive focuses, and the peripheral information operator and the peripheral information combiner acquire the information items as to the plural points on the receive focus and in the region surrounding the receive focus, with respect to each of the plural receive focuses, and generate the beamforming output.
  • the fourth embodiment also relates to a configuration in which there are plural delay units in the apparatus as described in the aforementioned item (2). In other words, there is more than one delay unit, and each of the delay units generates the post-delay received signals for a receive focus that differs for each of the plural delay units.
  • the peripheral information operator and the peripheral information combiner are provided for each of the delay units, acquire the information items from the post-delay received signals generated by the delay unit for the each of the receive focuses, and generate the beamforming output.
  • the fourth embodiment also relates to a configuration that the synthetic aperture is performed in the apparatus as described in the aforementioned item (2) or in the item (3).
  • an active channel setter configured to set active channels sequentially to the plural elements, at the positions being different in time series, and transfer to the delay unit, the received signals of the elements included in the active channel.
  • the positions of the plural receive focuses as to the received signals in the active channel at a certain point of time partially overlap the positions of the plural receive focuses as to the active channel at a different point of time.
  • the second embodiment relates to a configuration that the adaptive beamforming is applied, in any of the apparatus in the aforementioned items (1) to (4).
  • the peripheral information operator performs the adaptive beamforming, thereby obtaining the adaptive weight as the information item.
  • the second embodiment relates to a configuration that the peripheral information operator in the apparatus as described in the aforementioned item (5) uses the steering vectors being directional vectors connecting a predetermined element among the plural elements and the plural points, so as to obtain the adaptive weights as to the plural points.
  • the second embodiment relates to a configuration that in the apparatus as described in the aforementioned item (6), the covariance matrix is used to obtain the adaptive weight vectors.
  • the peripheral information operator includes the matrix operator configured to use the post-delay received signals to generate the covariance matrix, and the weight vector operator configured to obtain the adaptive weight vectors as to the plural points, from the covariance matrix and the steering vectors.
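Items (6) and (7) describe deriving adaptive weights from a covariance matrix and steering vectors, which corresponds to a Capon-type minimum-variance computation, w = R⁻¹a / (aᵀR⁻¹a). A toy real-valued two-channel sketch follows; the diagonal loading term and all function names are assumptions of this illustration, not the patent's method:

```python
def covariance_2ch(x0, x1, loading=1e-3):
    """Sample covariance of two post-delay channels; diagonal loading
    (an assumption here) keeps the matrix invertible."""
    n = len(x0)
    r00 = sum(a * a for a in x0) / n + loading
    r11 = sum(b * b for b in x1) / n + loading
    r01 = sum(a * b for a, b in zip(x0, x1)) / n
    return [[r00, r01], [r01, r11]]

def mvdr_weights(R, a):
    """Minimum-variance weights w = R^-1 a / (a^T R^-1 a), 2x2 closed form."""
    det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
    Rinv = [[R[1][1] / det, -R[0][1] / det],
            [-R[1][0] / det, R[0][0] / det]]
    Ra = [Rinv[0][0] * a[0] + Rinv[0][1] * a[1],
          Rinv[1][0] * a[0] + Rinv[1][1] * a[1]]
    norm = a[0] * Ra[0] + a[1] * Ra[1]
    return [Ra[0] / norm, Ra[1] / norm]
```

The normalization enforces the distortionless constraint w·a = 1, so the steering direction passes with unit gain while correlated interference is suppressed.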
  • the second embodiment relates to a configuration that in the apparatus as described in any of the aforementioned items from (5) to (7), the adaptive weights are combined.
  • the peripheral information combiner includes the weight combiner configured to sum the adaptive weights as to the plural points obtained by the peripheral information operator and generate the combined weight, and the inner-product operator configured to perform inner-product operation between the combined weight and the post-delay received signals, and generate the beamforming output.
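The order of operations in item (8) — sum the per-point adaptive weights first, then take a single inner product with the post-delay snapshot — can be sketched as follows; the names and the two-point example are illustrative assumptions:

```python
def combine_weights(point_weights):
    """Sum the adaptive weight vectors obtained for the receive focus
    and its surrounding points into one combined weight."""
    n = len(point_weights[0])
    return [sum(w[i] for w in point_weights) for i in range(n)]

def beamform_output(combined_w, post_delay):
    """One inner product between the combined weight and the
    post-delay received-signal snapshot."""
    return sum(w * x for w, x in zip(combined_w, post_delay))
```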
  • the second embodiment relates to a configuration that in the apparatus as described in the aforementioned item (8), the adaptive weight is multiplied by the fixed weight. That is, between the peripheral information operator and the weight combiner, there is arranged a fixed apodization multiplier configured to multiply the adaptive weights as to the plural points obtained by the peripheral information operator, respectively by the predetermined fixed weights.
  • the second embodiment also relates to a configuration that in the apparatus as described in the aforementioned item (8), assigning weights on the post-delay received signals and summing the signals are performed. In other words, the inner-product operator multiplies each of the post-delay received signals by the combined weight, and thereafter, sums the post-delay received signals to generate the beamforming output.
  • the third embodiment relates to a configuration that in the apparatus as described in any of the aforementioned items from (5) to (7), the received signals as to the plural points are subjected to beamforming using the respective adaptive weights, and then those signals are added.
  • the peripheral information combiner includes the plural inner-product operators configured to perform inner-product operation between the adaptive weights as to the plural points obtained by the peripheral information operator and the post-delay received signals respectively, and generate the pre-synthesis beamforming outputs for the respective plural adaptive weights, and the output combiner configured to add and combine the pre-synthesis beamforming outputs as to the plural adaptive weights, and generate the beamforming output that is used for generating the image data.
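Item (11) reverses the order: beamform once per point weight, then add the pre-synthesis outputs. For purely linear operations this yields the same value as combining the weights first, since Σₖ(wₖ·x) = (Σₖwₖ)·x. A minimal sketch with illustrative names:

```python
def beamform_then_sum(point_weights, post_delay):
    """Item-(11) order: inner product per point weight, then add the
    pre-synthesis outputs (equal, by linearity, to combining the
    weights first and beamforming once)."""
    pre = [sum(w * x for w, x in zip(wv, post_delay))
           for wv in point_weights]
    return sum(pre)
```

Keeping the per-point outputs separate, as here, is what makes the later embodiments' storage and cross-raster synthesis possible.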
  • the third embodiment also relates to a configuration that in the apparatus as described in the aforementioned item (11), the plural adaptive weights are multiplied respectively by the fixed weights. That is, between the peripheral information operator and the plural inner-product operators, there is arranged the fixed apodization multiplier configured to multiply the adaptive weights as to the plural points obtained by the peripheral information operator, by predetermined fixed weights, respectively.
  • the third embodiment also relates to a configuration that in the apparatus as described in the aforementioned item (11), the post-delay received signals after the beamforming are multiplied by the fixed weights, respectively.
  • the third embodiment also relates to a configuration that in the apparatus as described in the aforementioned item (11), assigning weights on the post-delay received signals and adding the signals are performed.
  • the plural inner-product operators are made up of a set of multipliers and plural adders, and the plural multipliers multiply the post-delay received signals by the adaptive weights as to the plural points respectively, and thereafter, in the plural adders, the post-delay received signals are added, thereby generating the pre-synthesis beamforming outputs.
  • the third embodiment also relates to a configuration that in the apparatus as described in the aforementioned item (13-A), multiplication by the fixed weights is performed. That is, between the plural multipliers and the plural adders, there is arranged the fixed apodization multiplier configured to multiply the post-multiplication received signals through the plural multipliers, by the fixed weights being predetermined as to the plural points obtained by the peripheral information operator.
  • the third embodiment also relates to a configuration that in the apparatus as described in any of the aforementioned items (1) to (13), the beamforming outputs are combined within an identical raster.
  • an active channel setter configured to set active channels sequentially to the plural elements, at the positions being different in time series, and transfer to the delay unit, the received signals of the elements included in the active channels.
  • the peripheral information combiner combines the information items that the peripheral information operator obtains from the received signals in one of the active channels, and generates a final beamforming output that is used by the image processor.
  • the third embodiment also relates to a configuration that in the apparatus as described in the aforementioned item (11), the beamforming outputs are combined within an identical raster.
  • the active channel setter configured to set active channels sequentially to the plural elements, at the positions being different in time series, and transfer to the delay unit, the received signals of the elements included in the active channels.
  • the peripheral information combiner sums the pre-synthesis beamforming outputs respectively as to the plural adaptive weights that the peripheral information operator obtains from the received signals in one of the active channels, and generates a final beamforming output that is used by the image processor, with respect to each of the active channels.
  • the fifth embodiment relates to a configuration that in the apparatus as described in the aforementioned item (11), synthesizing is performed by the use of a temporary storage. In other words, between the beamformer and the adder, there is provided the storing unit configured to store the pre-synthesis beamforming outputs generated by the beamformer, as to each of the plural adaptive weights.
  • the sixth embodiment relates to a configuration that in the apparatus as described in the aforementioned item (16), synthesizing is performed using plural samples in the time direction. That is, the adder acquires from the storing unit and the beamformer, the pre-synthesis beamforming outputs different in the time direction, and sums those pre-synthesis beamforming outputs, so as to generate the beamforming output.
  • the seventh embodiment and the eighth embodiment relate to a configuration that in the apparatus as described in the aforementioned item (16), synthesizing is performed between plural rasters.
  • the active channel setter configured to set active channels sequentially to the plural elements, at the positions being different in time series, and transfer to the delay unit, the received signals of the elements included in the active channels.
  • the beamformer generates the pre-synthesis beamforming outputs respectively as to the plural adaptive weights, for each of the active channels, the adder acquires from the storing unit and the beamformer, the pre-synthesis beamforming outputs generated for the different active channels and sums the beamforming outputs, so as to generate the beamforming output.
  • the ninth embodiment relates to a configuration that in the apparatus as described in the aforementioned item (16), the plural elements correspond to a two-dimensional array arranged two-dimensionally, and the peripheral information operator generates the pre-synthesis beamforming outputs as to the plural points set in the two-dimensional direction, and the storing unit stores the pre-synthesis beamforming outputs.
  • the tenth embodiment relates to a configuration that in the apparatus as described in the aforementioned item (6), there is provided the adjuster configured to adjust at least either one of the number of the steering vectors and the directions of the steering vectors.
  • the eleventh embodiment relates to a configuration that in the apparatus as described in the aforementioned item (20), setting of the steering angle distribution within an image is performed.
  • there is provided a memory configured to store in advance a distribution of spread angles of the steering vectors directed to the plural points, in association with the positions of the receive focus within the image that is generated by the image processor; the adjuster obtains the adaptive weights by using the steering vectors with the spread angles read from the memory in association with the position of the receive focus.
  • the eleventh embodiment relates to a relation between the steering angle distribution and the send focus in the apparatus as described in the aforementioned item (21).
  • the spread angle of the steering vector stored in the memory is set to be small at a receive focus close to the send focus used upon sending the ultrasound signal to the test subject, and the spread angle is set to be large at a receive focus distant from the send focus.
  • the eleventh embodiment relates to a relation between the steering angle distribution and the image edge in the apparatus as described in the aforementioned item (21) or item (22).
  • the spread angle of the steering vector stored in the memory is set to be smaller at the edge portion, compared to the center of the image.
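Items (22) and (23) together imply a spread-angle map that grows with distance from the send focus and shrinks toward the image edge. One hypothetical such function — every parameter value here is an assumption for illustration, not taken from the patent:

```python
def spread_angle(depth_mm, lateral_frac, focus_mm=40.0,
                 base_deg=2.0, max_deg=10.0):
    """Spread angle of the steering vectors: small at the send focus,
    growing with distance from it, and halved at the image edge
    (lateral_frac: 0.0 at image center, 1.0 at the edge)."""
    grow = min(1.0, abs(depth_mm - focus_mm) / focus_mm)
    edge = 1.0 - 0.5 * lateral_frac
    return (base_deg + (max_deg - base_deg) * grow) * edge
```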
  • the twelfth embodiment relates to a configuration that in the apparatus as described in any of the aforementioned items (21), (22), and (23), the steering angle distribution within the image is computed from the B-mode image.
  • the adjuster employs a predetermined function to convert the distribution of the B-mode image intensity or signal strength of the test subject, imaged in advance, into a spread angle distribution of the steering vectors, and uses the spread angle distribution thus obtained.
  • the twelfth embodiment relates to a configuration that in the apparatus as described in any of the aforementioned items (21), (22), and (23), the steering angle distribution within the image is corrected.
  • the adjuster uses the predetermined function to convert the distribution of the B-mode image intensity or signal strength, obtained by setting the steering vectors according to the spread angle distribution stored in the memory, into a new distribution of spread angles of the steering vectors.
  • in this manner, the spread angle distribution of the steering vectors, the image intensity, or the signal strength stored in the memory is corrected.
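The "predetermined function" of items (24) and (25) is not specified in the text above. One plausible sketch maps brighter (higher-intensity) B-mode regions to narrower spread angles; the mapping direction and every parameter here are assumptions of this illustration:

```python
def intensity_to_spread(b_mode_db, lo_db=-60.0, hi_db=0.0,
                        min_deg=1.0, max_deg=8.0):
    """Convert a B-mode intensity (in dB, clamped to [lo_db, hi_db])
    into a steering-vector spread angle: brighter -> narrower."""
    t = (min(max(b_mode_db, lo_db), hi_db) - lo_db) / (hi_db - lo_db)
    return max_deg - (max_deg - min_deg) * t
```

Applying this pixel by pixel to a previously acquired B-mode image would yield the spread angle distribution that item (25) writes back into the memory.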

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Acoustics & Sound (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Gynecology & Obstetrics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Human Computer Interaction (AREA)
US14/378,507 2012-02-15 2013-01-23 Ultrasonic imaging device Abandoned US20150025385A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012-030991 2012-02-15
JP2012030991 2012-02-15
PCT/JP2013/051305 WO2013121842A1 (ja) 2012-02-15 2013-01-23 Ultrasonic imaging device

Publications (1)

Publication Number Publication Date
US20150025385A1 true US20150025385A1 (en) 2015-01-22

Family

ID=48983967

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/378,507 Abandoned US20150025385A1 (en) 2012-02-15 2013-01-23 Ultrasonic imaging device

Country Status (5)

Country Link
US (1) US20150025385A1 (ja)
EP (1) EP2815701A4 (ja)
JP (1) JP5913557B2 (ja)
CN (1) CN104114099B (ja)
WO (1) WO2013121842A1 (ja)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150073277A1 (en) * 2013-09-10 2015-03-12 Seiko Epson Corporation Ultrasonic measurement apparatus, ultrasonic image apparatus, and ultrasonic measurement method
US20180085091A1 (en) * 2016-09-26 2018-03-29 Seiko Epson Corporation Ultrasonic measurement device, and method of controlling ultrasonic measurement device
EP3257446A4 (en) * 2015-02-12 2018-10-17 Hitachi, Ltd. Ultrasonic imaging device, method for adjusting inter-transmission weight, and ultrasonic imaging method
US10209352B2 (en) * 2014-06-12 2019-02-19 Koninklijke Philips N.V. Ultrasound transducer assembly
US20200200886A1 (en) * 2017-05-11 2020-06-25 Koninklijke Philips N.V. Methods and systems for controlling the generation of a compound ultrasound image
US11076827B2 (en) 2015-02-18 2021-08-03 Hitachi, Ltd. Ultrasound image capturing device and method of processing ultrasound signal

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015071028A (ja) * 2013-09-05 2015-04-16 セイコーエプソン株式会社 超音波測定装置、超音波画像装置及び超音波測定方法
JP6352050B2 (ja) * 2014-05-19 2018-07-04 キヤノンメディカルシステムズ株式会社 超音波診断装置
JP6398614B2 (ja) * 2014-10-30 2018-10-03 セイコーエプソン株式会社 超音波測定装置、超音波診断装置及び超音波測定方法
US20170311928A1 (en) * 2014-11-07 2017-11-02 Samsung Electronics Co., Ltd. Ultrasonic imaging apparatus and method of controlling the same
JP2017164408A (ja) * 2016-03-18 2017-09-21 セイコーエプソン株式会社 画像生成装置および画像生成方法
JP6747108B2 (ja) * 2016-07-05 2020-08-26 コニカミノルタ株式会社 超音波信号処理装置、超音波信号処理方法、及び、超音波診断装置
CN109843180A (zh) * 2016-10-09 2019-06-04 柯惠有限合伙公司 用于驱动超声成像换能器的系统和方法
CN108324324A (zh) * 2018-03-12 2018-07-27 西安交通大学 一种超声低频经颅容积超分辨率三维造影成像方法及系统
JP7008549B2 (ja) * 2018-03-16 2022-01-25 富士フイルムヘルスケア株式会社 超音波診断装置
CN109223035B (zh) * 2018-08-21 2021-09-28 青岛海信医疗设备股份有限公司 超声信号处理方法、装置、设备及存储介质

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6482160B1 (en) * 1999-11-24 2002-11-19 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence High resolution 3D ultrasound imaging system deploying a multidimensional array of sensors and method for multidimensional beamforming sensor signals
US20050033165A1 (en) * 2003-07-25 2005-02-10 Siemens Medical Solutions Usa , Inc. Adaptive grating lobe suppression in ultrasound imaging
US7744532B2 (en) * 2004-03-31 2010-06-29 Siemens Medical Solutions Usa, Inc. Coherence factor adaptive ultrasound imaging methods and systems
US20120271144A1 (en) * 2011-04-20 2012-10-25 Samsung Electronics, Ltd. Method and apparatus for generating diagnosis image, diagnosis system, and medical image system for performing the method
US20140058262A1 (en) * 2012-08-23 2014-02-27 Canon Kabushiki Kaisha Object information acquiring apparatus, information processing apparatus and object information acquiring method
US20140064023A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and storage medium
US20140063002A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and computer-readable medium storing program
US20140064022A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information obtaining apparatus, display method, and storage medium
US20140064021A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and storage medium
US20140121516A1 (en) * 2012-10-30 2014-05-01 Samsung Electronics Co., Ltd. Method of determining beamforming coefficient, beamforming method and ultrasonic imaging apparatus
US20140198621A1 (en) * 2013-01-11 2014-07-17 Industry Academic Cooperation Foundation, Hallym University Beamforming module, ultrasonic imaging apparatus using the same, beamforming method using the beamforming module, and method of controlling the ultrasonic imaging apparatus using the beamforming module
US20140240482A1 (en) * 2011-09-15 2014-08-28 Hitachi Medical Corporation Ultrasound imaging apparatus
US20150016215A1 (en) * 2013-07-11 2015-01-15 Samsung Electronics Co., Ltd. Image processing module, ultrasound imaging apparatus, image processing method, and control method of ultrasound imaging apparatus
US20150016226A1 (en) * 2013-07-11 2015-01-15 Samsung Electronics Co., Ltd. Beamformer, beamforming method, ultrasonic imaging apparatus, and control method of ultrasonic imaging apparatus
US20150351720A1 (en) * 2013-01-11 2015-12-10 Hitachi Aloka Medical, Ltd. Ultrasonic imaging device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3991616A (en) 1975-09-08 1976-11-16 Hans Noll Automatic pipetter
JP3474278B2 (ja) 1994-03-16 2003-12-08 フクダ電子株式会社 超音波診断装置
JP3559774B2 (ja) 2000-06-10 2004-09-02 株式会社 メディソン 多段構造の遅延素子を用いる超音波受信ビーム成形装置
KR100406099B1 (ko) * 2001-09-05 2003-11-14 주식회사 메디슨 다단계 구조의 펄스 압축기를 이용한 초음파 영상 형성 장치 및 방법
JP3910860B2 (ja) * 2002-02-05 2007-04-25 株式会社日立メディコ 超音波撮像装置
KR100490565B1 (ko) * 2002-07-23 2005-05-19 주식회사 메디슨 아날로그 멀티플렉서를 이용한 디지털 수신 집속 장치
JP5016326B2 (ja) * 2007-03-06 2012-09-05 株式会社日立メディコ 超音波診断装置
JP2009089940A (ja) * 2007-10-10 2009-04-30 Toshiba Corp 超音波診断装置
JP5460144B2 (ja) 2008-08-11 2014-04-02 キヤノン株式会社 超音波受信ビーム成形装置
JP2010082371A (ja) * 2008-10-02 2010-04-15 Canon Inc 超音波受信ビーム成形装置
JP5683860B2 (ja) * 2009-07-28 2015-03-11 株式会社東芝 超音波診断装置、超音波画像処理装置、超音波診断装置制御プログラム及び超音波画像処理プログラム

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6482160B1 (en) * 1999-11-24 2002-11-19 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence High resolution 3D ultrasound imaging system deploying a multidimensional array of sensors and method for multidimensional beamforming sensor signals
US20030065262A1 (en) * 1999-11-24 2003-04-03 Stergios Stergiopoulos High resolution 3D ultrasound imaging system deploying a multi-dimensional array of sensors and method for multi-dimensional beamforming sensor signals
US6719696B2 (en) * 1999-11-24 2004-04-13 Her Majesty The Queen In Right Of Canada As Represented By The Minister Of National Defence High resolution 3D ultrasound imaging system deploying a multi-dimensional array of sensors and method for multi-dimensional beamforming sensor signals
US20050033165A1 (en) * 2003-07-25 2005-02-10 Siemens Medical Solutions Usa , Inc. Adaptive grating lobe suppression in ultrasound imaging
US7207942B2 (en) * 2003-07-25 2007-04-24 Siemens Medical Solutions Usa, Inc. Adaptive grating lobe suppression in ultrasound imaging
US20070173722A1 (en) * 2003-07-25 2007-07-26 Siemens Medical Solutions Usa, Inc. Adaptive grating lobe suppression in ultrasound imaging
US7887486B2 (en) * 2003-07-25 2011-02-15 Siemens Medical Solutions Usa, Inc. Adaptive grating lobe suppression in ultrasound imaging
US7744532B2 (en) * 2004-03-31 2010-06-29 Siemens Medical Solutions Usa, Inc. Coherence factor adaptive ultrasound imaging methods and systems
US20120271144A1 (en) * 2011-04-20 2012-10-25 Samsung Electronics, Ltd. Method and apparatus for generating diagnosis image, diagnosis system, and medical image system for performing the method
US9261586B2 (en) * 2011-04-20 2016-02-16 Samsung Electronics Co., Ltd. Method and apparatus for generating diagnosis image, diagnosis system, and medical image system for performing the method
US20140240482A1 (en) * 2011-09-15 2014-08-28 Hitachi Medical Corporation Ultrasound imaging apparatus
US20140058262A1 (en) * 2012-08-23 2014-02-27 Canon Kabushiki Kaisha Object information acquiring apparatus, information processing apparatus and object information acquiring method
US20140064022A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information obtaining apparatus, display method, and storage medium
US20140064021A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and storage medium
US20140063002A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and computer-readable medium storing program
US20140064023A1 (en) * 2012-08-28 2014-03-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and storage medium
US9367945B2 (en) * 2012-08-28 2016-06-14 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and computer-readable medium storing program
US9435881B2 (en) * 2012-08-28 2016-09-06 Canon Kabushiki Kaisha Object information acquisition apparatus, display method, and storage medium
US20140121516A1 (en) * 2012-10-30 2014-05-01 Samsung Electronics Co., Ltd. Method of determining beamforming coefficient, beamforming method and ultrasonic imaging apparatus
US9629613B2 (en) * 2012-10-30 2017-04-25 Samsung Electronics Co., Ltd. Method of determining beamforming coefficient, beamforming method and ultrasonic imaging apparatus
US20140198621A1 (en) * 2013-01-11 2014-07-17 Industry Academic Cooperation Foundation, Hallym University Beamforming module, ultrasonic imaging apparatus using the same, beamforming method using the beamforming module, and method of controlling the ultrasonic imaging apparatus using the beamforming module
US20150351720A1 (en) * 2013-01-11 2015-12-10 Hitachi Aloka Medical, Ltd. Ultrasonic imaging device
US20150016215A1 (en) * 2013-07-11 2015-01-15 Samsung Electronics Co., Ltd. Image processing module, ultrasound imaging apparatus, image processing method, and control method of ultrasound imaging apparatus
US20150016226A1 (en) * 2013-07-11 2015-01-15 Samsung Electronics Co., Ltd. Beamformer, beamforming method, ultrasonic imaging apparatus, and control method of ultrasonic imaging apparatus

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150073277A1 (en) * 2013-09-10 2015-03-12 Seiko Epson Corporation Ultrasonic measurement apparatus, ultrasonic image apparatus, and ultrasonic measurement method
US9867594B2 (en) * 2013-09-10 2018-01-16 Seiko Epson Corporation Ultrasonic measurement apparatus, ultrasonic image apparatus, and ultrasonic measurement method
US10209352B2 (en) * 2014-06-12 2019-02-19 Koninklijke Philips N.V. Ultrasound transducer assembly
US10451717B2 (en) 2014-06-12 2019-10-22 Koninklijke Philips N.V. Ultrasound transducer assembly
EP3257446A4 (en) * 2015-02-12 2018-10-17 Hitachi, Ltd. Ultrasonic imaging device, method for adjusting inter-transmission weight, and ultrasonic imaging method
US11253226B2 (en) * 2015-02-12 2022-02-22 Fujifilm Healthcare Corporation Ultrasonic imaging device, method for adjusting inter-transmission weight, and ultrasonic imaging method
US11076827B2 (en) 2015-02-18 2021-08-03 Hitachi, Ltd. Ultrasound image capturing device and method of processing ultrasound signal
US20180085091A1 (en) * 2016-09-26 2018-03-29 Seiko Epson Corporation Ultrasonic measurement device, and method of controlling ultrasonic measurement device
US20200200886A1 (en) * 2017-05-11 2020-06-25 Koninklijke Philips N.V. Methods and systems for controlling the generation of a compound ultrasound image
US11719797B2 (en) * 2017-05-11 2023-08-08 Koninklijke Philips N.V. Methods and systems for controlling the generation of a compound ultrasound image

Also Published As

Publication number Publication date
EP2815701A4 (en) 2015-04-22
EP2815701A1 (en) 2014-12-24
WO2013121842A1 (ja) 2013-08-22
JPWO2013121842A1 (ja) 2015-05-11
CN104114099B (zh) 2016-02-17
CN104114099A (zh) 2014-10-22
JP5913557B2 (ja) 2016-04-27

Similar Documents

Publication Publication Date Title
US20150025385A1 (en) Ultrasonic imaging device
JP6408297B2 (ja) ビームフォーミング方法、計測イメージング装置、及び、通信装置
US9754185B2 (en) Ultrasound imaging apparatus
EP1686393A2 (en) Coherence factor adaptive ultrasound imaging
JP6189867B2 (ja) 超音波撮像装置
CN108209971B (zh) 超声波信号处理装置和方法以及超声波诊断装置
US20050033170A1 (en) Corrections for wavefront aberrations in ultrasound imaging
US20170238908A1 (en) Ultrasound diagnostic device
US20150327840A1 (en) Ultrasonic diagnostic device and correction method
JP2018057560A (ja) 超音波信号処理装置、超音波信号処理方法、及び、超音波診断装置
US20070083109A1 (en) Adaptive line synthesis for ultrasound
WO2013180269A1 (ja) 超音波撮像装置
EP2700976B1 (en) Ultrasound imaging apparatus and method for ultrasound imaging
CN107569254B (zh) 超声波信号处理装置、超声波信号处理方法以及超声波诊断装置
EP3384313B1 (en) An imaging method, an apparatus implementing said method, a computer program and a computer-readable storage medium
Wang et al. A mixed transmitting–receiving beamformer with a robust generalized coherence factor: Enhanced resolution and contrast
JP7387249B2 (ja) 超音波診断装置、医用画像処理装置及び医用画像処理プログラム
Lou et al. Improved contrast for high frame rate imaging using coherent compounding combined with spatial matched filtering
US11413012B2 (en) Ultrasound signal processing device and ultrasound signal processing method
Guo et al. Wavenumber Beamforming with Sub-Nyquist Sampling for Focus-Beam Ultrasound Imaging
Soozande et al. Virtually Extended Array Imaging Improves Lateral Resolution in High Frame Rate Volumetric Imaging
KR20150124479A (ko) 공간 일관성 기초 초음파 신호 처리 모듈 및 그에 의한 초음파 신호 처리 방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: HITACHI, LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IKEDA, TEIICHIRO;MASUZAWA, HIROSHI;TABARU, MARIE;AND OTHERS;SIGNING DATES FROM 20140804 TO 20140825;REEL/FRAME:033636/0234

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION