CN112513673A - Ultrasound imaging system with improved dynamic range control - Google Patents

Ultrasound imaging system with improved dynamic range control

Info

Publication number
CN112513673A
Authority
CN
China
Prior art keywords
image
dynamic range
imaging system
ultrasound imaging
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201980048987.5A
Other languages
Chinese (zh)
Other versions
CN112513673B (en)
Inventor
F·G·G·M·维尼翁
D·W·克拉克
D·P·亚当斯
D·普拉特
A·佩里亚帕特纳纳根德拉
K·拉达克里希南
R·A·西夫莱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV
Publication of CN112513673A
Application granted
Publication of CN112513673B
Legal status: Active (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/92 Dynamic range modification of images or parts thereof based on global image properties
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053 Display arrangements
    • G01S7/52057 Cathode ray tube displays
    • G01S7/52071 Multicolour displays; using colour coding; Optimising colour or information content in displays, e.g. parametric imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52077 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging with means for elimination of unwanted signals, e.g. noise or interference
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52079 Constructional features
    • G01S7/52084 Constructional features related to particular user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5207 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30048 Heart; Cardiac

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

An ultrasound imaging system has an improved dynamic range control that enables a user to reduce image fog with little effect on bright tissue in the image and without increasing speckle variation. A dynamic range processor processes the input image to generate an approximation image that is a spatially low pass filtered version of the input image. The dynamic range of the approximation image is compressed, and image details unaffected by the compression are re-added to produce a dynamic range compressed image for display.

Description

Ultrasound imaging system with improved dynamic range control
Technical Field
The present invention relates to ultrasound imaging systems, and in particular to ultrasound imaging systems having dynamic range controls for adjusting fog removal and tissue filling without affecting the dynamic range of image speckle.
Background
Ultrasound imaging systems typically have two types of user controls for adjusting the appearance of an image: an image gain control and an image dynamic range control. A user viewing an ultrasound image of a region of interest typically adjusts these controls to improve the image appearance. For example, a user may be viewing an image of the heart in which a heart chamber contains a blood pool. If the gain is set too high, the blood pool is often obscured by fog or haze in the heart chamber. The user can reduce or eliminate the fog by turning the gain down so that the heart chamber takes on the desired solid black shading. While reducing the gain does not generally affect image contrast, it also darkens the white shades of tissue, which can reduce the clarity and resolution of the tissue in the image.
The dynamic range control can also be adjusted to reduce unwanted fog. By turning down or reducing the dynamic range of the displayed image, low levels of haze are attenuated while the bright white levels of tissue remain substantially unaffected, and the image contrast increases. The contrast-to-noise ratio does not change, however, because speckle variation (the range of different gray shades of the speckle in the image) increases along with the contrast.
Disclosure of Invention
Aspects of the present invention provide a user control that reduces unwanted image fog without affecting the shades of bright white tissue, and does so without detrimentally increasing speckle variation.
In accordance with the principles of the present invention, a user gain control is provided that reduces image fog with little effect on the appearance of bright tissue in the image and improves contrast without increasing speckle variation. In a preferred embodiment, this is done by a dynamic range processor that processes the input image values to produce an approximation image containing the low spatial frequency tissue structure of the image. The difference between the input image and the approximation image contains the image detail, which preferably includes the speckle characteristic of the image. Dynamic range compression is then applied to the approximation image but not to the image detail, which is then recombined with the dynamic range adjusted approximation image to produce an image for display. While dynamic range compression can be performed with values preselected or optimized by the imaging system, it is preferable to provide user controls that enable a user to adjust the dynamic range processing to produce an image having a user-preferred appearance.
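The flow just described can be illustrated with a short sketch. The following Python/NumPy fragment is only an illustration of this summary, not the patented implementation: the box filter standing in for the spatial low pass, the kernel size, the m and M limits, and the fixed detail weight are assumptions chosen for readability (the detailed description below instead weights the detail by a speckle dynamic range term).

```python
# Illustrative sketch of the summarized flow; the filter choice and all
# numeric defaults are assumptions, not values taken from the patent.
import numpy as np
from scipy.ndimage import uniform_filter

def dynamic_range_process(input_image, m=30.0, M=220.0, kernel=9, detail_weight=1.0):
    """input_image: 2-D array of detected, log-compressed B mode values (0..255)."""
    I = input_image.astype(float)
    A = uniform_filter(I, size=kernel)            # approximation image: low spatial frequencies
    D = I - A                                     # image detail, including the speckle pattern
    A_c = np.clip((A - m) / (M - m), 0.0, 1.0)    # compress only the approximation image
    out = 255.0 * A_c + detail_weight * D         # recombine the uncompressed detail
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```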
It should be understood that one or more processors (whether the same processor or different processors) may be used to perform the processing of, for example, the beamformer, the dynamic range processor, and the image processor.
Drawings
In the drawings:
figure 1 illustrates, in block diagram form, an ultrasound system configured in accordance with the principles of the present invention.
FIG. 2 illustrates a nine element kernel that may be used to generate an approximation image in accordance with the present invention.
Fig. 3 illustrates a dynamic range curve for an imaging system constructed in accordance with the present invention.
Fig. 4 illustrates a second dynamic range curve for an imaging system constructed in accordance with the present invention that stabilizes mid-range gray values during dynamic range adjustment.
Fig. 5 is a cardiac ultrasound image with unprocessed image speckle and no dynamic range adjustment.
Fig. 6 is the same cardiac ultrasound image as fig. 5 after dynamic range adjustment by the ultrasound imaging system of the present invention.
Detailed Description
Referring initially to FIG. 1, an ultrasound imaging system constructed in accordance with the principles of the present invention is shown in block diagram form. A transducer array 12 is provided in an ultrasound probe 10 for transmitting ultrasound waves and receiving echo information. The transducer array 12 may be a one-dimensional array or a two-dimensional array of transducer elements capable of scanning in two or three dimensions, for instance in both elevation (in 3D) and azimuth. The transducer array 12 is coupled to an optional microbeamformer 14 in the probe, which controls the transmission and reception of signals by the array elements. The microbeamformer is capable of at least partially beamforming the signals received by groups or "tiles" of transducer elements, as described in U.S. patents US 5997479 (Savord et al.), US 6013032 (Savord) and US 6623432 (Powers et al.). The microbeamformer is coupled by the probe cable to a transmit/receive (T/R) switch 16, which switches between transmission and reception and protects the main beamformer 20 from high energy transmit signals. The transmission of ultrasound beams from the transducer array 12 under control of the microbeamformer 14 is directed by a beamformer controller 18 coupled to the T/R switch and the main beamformer 20, which receives input from the user's operation of a user interface or control panel 38. Among the transmit characteristics controlled by the beamformer controller are the number, spacing, amplitude, phase, frequency, polarity, and diversity of the transmit waveforms. The beams formed in the direction of pulse transmission may be steered straight ahead from the transducer array or steered at different angles on either side of the unsteered beam to obtain a wider sector field of view. For some applications, unfocused plane waves may be used for transmission. Most one-dimensional array probes with relatively short array lengths (e.g., 128-element arrays) do not use a microbeamformer but are driven by and respond directly to the main beamformer.
Echoes received by a set of adjacent transducer elements are beamformed by being appropriately delayed and then combined. The partially beamformed signals produced from each tile by the microbeamformer 14 are coupled to the main beamformer 20, where the partially beamformed signals from the individual tiles of transducer elements are combined into fully beamformed coherent echo signals; when no microbeamformer is used, the echo signals from the elements of a one-dimensional array are delayed and combined in the main beamformer. For example, the main beamformer 20 may have 128 channels, each of which receives partially beamformed signals from a tile of 12 transducer elements or signals from individual elements. In this way the signals received by over 1500 transducer elements of a two-dimensional array transducer can contribute efficiently to a single beamformed signal, and the signals received from across the image plane are combined.
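The delay-and-sum operation described above can be illustrated with a short, receive-only sketch. The geometry, sound speed, sampling rate, and integer-sample alignment below are assumptions for illustration; the sketch does not reflect the microbeamformer/main beamformer partitioning of the system of FIG. 1.

```python
import numpy as np

def delay_and_sum(rf, element_x, focus, c=1540.0, fs=40e6):
    """Delay-and-sum beamforming of one receive event.
    rf:        (n_elements, n_samples) array of received RF traces
    element_x: (n_elements,) lateral element positions in metres
    focus:     (x, z) receive focal point in metres
    """
    n_elem, n_samp = rf.shape
    fx, fz = focus
    dist = np.sqrt((element_x - fx) ** 2 + fz ** 2)   # element-to-focus path lengths
    delays = (dist - dist.min()) / c                  # relative receive delays in seconds
    shifts = np.round(delays * fs).astype(int)        # delays expressed in samples
    summed = np.zeros(n_samp)
    for i in range(n_elem):
        s = shifts[i]
        aligned = np.zeros(n_samp)
        aligned[:n_samp - s] = rf[i, s:]              # advance each trace by its delay
        summed += aligned                             # coherent summation across elements
    return summed

# Example with synthetic data: 64-element array, 0.3 mm pitch, focus at 40 mm depth.
# rf = np.random.randn(64, 2048)
# x = (np.arange(64) - 31.5) * 0.3e-3
# line = delay_and_sum(rf, x, (0.0, 0.04))
```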
The signal processor 26 performs signal processing on the coherent echo signals, including filtering by digital filters and, optionally, noise reduction by spatial or frequency compounding. The filtered echo signals are coupled to a quadrature bandpass filter (QBP) 28. The QBP performs three functions: limiting the frequency band of the RF echo signal data, producing in-phase and quadrature pairs (I and Q) of echo signal data, and decimating the digital sample rate. The QBP comprises two separate filters, one producing in-phase samples and the other producing quadrature samples, each filter being formed by a plurality of multiply-accumulators (MACs) implementing an FIR filter. The signal processor can also shift the frequency band to a lower or baseband frequency range, as can the QBP.
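A rough sketch of the QBP's three functions (band limiting, I/Q generation, and decimation) follows. The mixing-plus-FIR structure, centre frequency, tap count, and decimation factor are illustrative assumptions rather than the filter design used by the system described above.

```python
import numpy as np

def quadrature_bandpass(rf, fs=40e6, f0=3.0e6, num_taps=63, decim=4):
    """Mix an RF trace to baseband with quadrature references, low-pass filter it
    into I and Q samples with two FIR filters, and decimate the sample rate.
    The centre frequency, tap count, and decimation factor are illustrative."""
    n = np.arange(rf.size)
    i_mixed = rf * np.cos(2 * np.pi * f0 * n / fs)    # in-phase mixing product
    q_mixed = -rf * np.sin(2 * np.pi * f0 * n / fs)   # quadrature mixing product
    # Windowed-sinc low-pass FIR taps (shared by both parallel filters here).
    cutoff = f0 / fs
    taps = np.sinc(2 * cutoff * (np.arange(num_taps) - (num_taps - 1) / 2))
    taps *= np.hamming(num_taps)
    taps /= taps.sum()
    i_filt = np.convolve(i_mixed, taps, mode="same")  # band-limited I samples
    q_filt = np.convolve(q_mixed, taps, mode="same")  # band-limited Q samples
    return i_filt[::decim], q_filt[::decim]           # decimated I/Q pair
```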
The beamformed and processed coherent echo signals are coupled to an image processor, here a B mode processor 30, which produces B mode images of structure in the body, such as tissue. The B mode processor performs amplitude (envelope) detection of the quadrature demodulated I and Q signal components by computing the quantity (I^2+Q^2)^1/2. The quadrature echo signal components are also coupled to a Doppler processor 34. The Doppler processor 34 stores an ensemble of echo signals from discrete points in the image field, which is then used to estimate the Doppler shift at a point in the image with a fast Fourier transform (FFT) processor. The rate at which the ensembles are acquired determines the range of velocities of motion that the system can accurately measure and depict in an image. The Doppler shift is proportional to motion at the point in the image field, e.g., blood flow and tissue motion. For a color Doppler image, the estimated Doppler flow values at each point in a blood vessel are wall filtered and converted to color values using a look-up table. The wall filter has an adjustable cutoff frequency above or below which motion, such as the low frequency motion of the wall of a blood vessel, is rejected when imaging flowing blood. The B mode image and the Doppler image can be displayed separately or together in anatomical registration, in which a color Doppler overlay shows the blood flow in tissue and vessels within the tissue structure of the B mode image.
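The two detection steps described above, envelope detection of the I/Q pairs and FFT-based Doppler estimation over an ensemble, can be sketched as follows. The pulse repetition frequency and the mean-subtraction stand-in for the wall filter are assumptions for illustration.

```python
import numpy as np

def envelope(i, q):
    """Amplitude (envelope) detection of quadrature components: (I^2 + Q^2)^1/2."""
    return np.sqrt(i ** 2 + q ** 2)

def doppler_shift(iq_ensemble, prf=4000.0):
    """Estimate the Doppler shift at one image point from an ensemble of complex
    (I + jQ) samples acquired at the pulse repetition frequency prf.
    Subtracting the ensemble mean is a crude stand-in for the wall filter."""
    spectrum = np.fft.fftshift(np.fft.fft(iq_ensemble - iq_ensemble.mean()))
    freqs = np.fft.fftshift(np.fft.fftfreq(iq_ensemble.size, d=1.0 / prf))
    return freqs[np.argmax(np.abs(spectrum))]   # Doppler shift in Hz; sign gives direction
```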
The image data produced by the B-mode processor 30 and the doppler processor 34 are coupled to an image data memory 33, in which image data memory 33 the image data is stored in memory locations addressable according to the spatial location at which the image values were acquired. In accordance with the principles of the present invention, the sequential B-mode images are coupled to a dynamic range processor 40, which dynamic range processor 40 processes each image for dynamic range control as described below. The dynamic range adjusted image is coupled back to the image memory from where it is coupled to the display processor 36 for further enhancement, buffering, and temporary storage for display on the image display 22.
In the embodiment shown in FIG. 1, the dynamic range processor 40 is an image processor that performs two functions: it computes an approximation image, which is preferably a spatially low pass filtered or speckle-reduced version of the input image received from the image memory, and it performs dynamic range compression on the approximation image, after which image detail not present in the approximation image is recombined with the approximation image. A simple spatial low pass filter suitable for use in an embodiment of the present invention is conceptually illustrated in FIG. 2. The drawing shows a nine-element kernel used to spatially process nine contiguous image elements of an input image I. An image value of the approximation image A is computed at the central position I_A of the kernel by weighting and summing the eight surrounding image values and the central image value of the input image I. Mathematically, this process can be expressed as
A = (1/9)*(I1 + I2 + ... + I9) [1]
that is, each of the nine image elements of the kernel is weighted by 1/9 and the weighted kernel values are then summed to compute an approximation image value. The approximation image will therefore contain mainly the low spatial frequency structural information of the image. Although this simple example operates on only a small partial image area of nine image elements, the preferred kernel size will generally be significantly larger, preferably large enough to encompass input image elements spanning a wide (and preferably the full) range of speckle values. The speckle range can thus be captured in an image "detail" value D, which is computed as the difference between the input image I and the approximation image A at each image element position, or
D=I-A [2]
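For a single image element, expressions [1] and [2] can be computed directly from the nine-element kernel of FIG. 2, as in the sketch below. The uniform 1/9 weighting follows the text above; as noted there, practical kernels are considerably larger, and border handling is ignored here.

```python
import numpy as np

def approximation_and_detail(I, r, c):
    """Expressions [1] and [2] at one interior image element (r, c):
    A is the 1/9-weighted sum of the element and its eight neighbours,
    and D = I - A is the image detail at that element."""
    window = I[r - 1:r + 2, c - 1:c + 2].astype(float)  # the nine-element kernel
    A = window.sum() / 9.0                              # expression [1]
    D = float(I[r, c]) - A                              # expression [2]
    return A, D
```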
In conventional dynamic range compression, image values below a low luminance level m are mapped to full black and image values above a high luminance level M are mapped to full white, or
(I-m)/(M-m) [3]
In an embodiment of the invention, dynamic range compression is applied to the approximation image A but not to the image detail D. An algorithm that performs dynamic range compression in this way is
Output = 255*(A-m)/(M-m) + (255/SDR)*D [4]
where SDR is the speckle dynamic range. The algorithm takes the form of a mapping expression that maps an input image of values of arbitrary bit length to the 0-to-255 range of a standard 8-bit display word length. The operation of the algorithm is to map all low level values of the approximation image A below the level m to black and all high level values of the approximation image above the level M to white. The speckle detail D of the original image, determined as shown in expression [2] above, is added back to the dynamic range adjusted approximation image weighted by the speckle dynamic range SDR, thus preserving the original speckle variation. In a constructed implementation, the SDR would typically be determined by an optimization process that initially sets the SDR to the dynamic range of the input image. As the SDR is increased, speckle variation is reduced, which smooths the image. However, such image smoothing generally comes at the expense of reduced image resolution, since the detail D, while ideally containing only speckle, will in practice also include some structural image detail.
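A short numeric illustration of expression [4] follows. The values of m, M, and SDR and the sample (A, D) pairs are arbitrary examples rather than measurements, and the clipping to the 8-bit display range reflects the black/white mapping described above.

```python
import numpy as np

def map_element(A, D, m=30.0, M=200.0, sdr=60.0):
    """Expression [4] for one image element: compress the approximation value A
    between m and M, then re-add the detail D weighted by 255/SDR."""
    compressed = np.clip((A - m) / (M - m), 0.0, 1.0)
    return float(np.clip(255.0 * compressed + (255.0 / sdr) * D, 0.0, 255.0))

# Arbitrary example values:
#   A=20,  D=+3  -> ~12.8  (fog below m maps to near black; only the small detail term remains)
#   A=115, D=-8  -> ~93.5  (mid-range value compressed, speckle deviation carried through)
#   A=210, D=+5  -> 255.0  (value above M saturates at white)
for A, D in [(20.0, 3.0), (115.0, -8.0), (210.0, 5.0)]:
    print(f"A={A}, D={D} -> {map_element(A, D):.1f}")
```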
The imaging system can perform the above dynamic range processing automatically using preselected values for the variables, preferably values optimized over the processing of a large number of images. Preferably, however, the dynamic range processor 40 is controlled by user input from the user interface 38, either in combination with or as an alternative to the conventional gain and dynamic range controls. For many users the ideal set of controls is one that darkens the chambers (reduces fog) and fills in the tissue (brightens the tissue structure). The dynamic range processor described above can achieve this by providing the user with two controls: one control adjusts the m value used by the dynamic range processor, and the other adjusts the M value. As the user adjusts the "m" control to increase the value of m used by the processor, low level fog in the image becomes increasingly darker. Unlike a conventional gain adjustment, as the user adjusts the "M" control to decrease the value of M, the highly echogenic tissue becomes increasingly white with little effect on the fog level. Alternatively, both variables can be changed by a single control that increases m while decreasing M; the fog is reduced while the tissue is brightened. The effect of these individual or combined controls is illustrated graphically in FIG. 3. Without dynamic range compression, the compression curve 50-52-54 would be a continuous line from zero at the lower left corner to 255 (full white in 8-bit nomenclature) at the upper right corner. When the user control increases m, as shown by curve segment 50, all image values from zero to m are set to zero (full black), reducing the appearance of low level fog in the image. When the user control decreases M, as shown by curve segment 54, all image values from M to 255 are set to full white (255). Between these limits, dynamic range compression is applied as shown by curve segment 52.
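One way to realize these controls and the curve of FIG. 3 in software is sketched below. The 8-bit lookup table and the end points of the single linked control are assumptions for illustration; the patent does not specify numeric ranges.

```python
import numpy as np

def compression_curve(m, M):
    """256-entry mapping of FIG. 3: inputs below m go to black (segment 50),
    inputs above M go to white (segment 54), linear compression between (segment 52).
    With m = 0 and M = 255 this reduces to the continuous identity line."""
    x = np.arange(256, dtype=float)
    return np.clip(255.0 * (x - m) / (M - m), 0.0, 255.0).astype(np.uint8)

def linked_control(position, m_max=60.0, M_min=180.0):
    """A single control (0..1) that raises m while lowering M, darkening fog while
    brightening tissue. The end points m_max and M_min are arbitrary assumptions."""
    return position * m_max, 255.0 - position * (255.0 - M_min)

lut = compression_curve(*linked_control(0.5))  # apply to an 8-bit image as lut[image]
```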
FIGS. 5 and 6 show ultrasound images illustrating these combined effects. FIG. 5 shows a standard cardiac ultrasound image without compensation for fog or brightness. FIG. 6 shows the same cardiac image with compensation for both. It can be seen that the fog in the heart chambers of FIG. 6 has been significantly reduced without substantially reducing the brightness of the myocardial tissue surrounding the chambers.
The combined effect of embodiments of the present invention is summarized in the following table in comparison with the effect of standard gain and dynamic range controls:
[Table comparing the effects of the standard gain control, the standard dynamic range control, and the dynamic range controls of the present invention; not reproduced in this text extraction.]
In many ultrasound exams the mid-gray shades of a B mode image are clinically important. Myocardial tissue in cardiac images, for instance, typically appears in shades of medium gray. It is therefore often important to preserve the characteristic appearance of the medium gray shades in an image. FIG. 4 illustrates a compression curve which does this by anchoring a mid-gray value at a point G on the compression curve. As m is increased to reduce low level fog, the mid-gray shades remain anchored near the mid-gray level of 127 on the 8-bit scale. Thus, as m is increased to reduce heart chamber fog, the appearance of the myocardial tissue in the image remains unchanged. A variation of this control which links the control of G with M has been found to be particularly effective. As M is decreased to increase the brightness of the image, G is decreased proportionally. This preserves the appearance of the myocardial tissue relative to the appearance of the tissue producing the strongest echoes in the image.
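A piecewise-linear version of the FIG. 4 curve, with the mid-gray anchor G and the optional G-to-M linkage, is sketched below. The 127 mid-gray output level follows the description; the anchor input value g_in and the exact proportional form of the linkage are assumptions about one plausible realization.

```python
import numpy as np

def anchored_curve(m, M, g_in=115.0, g_out=127.0, link_g_to_M=True):
    """Compression curve of FIG. 4: piecewise linear through (m, 0), (g_in, G), (M, 255),
    so mid-gray inputs stay anchored near level 127 as m is raised. With the linkage
    enabled, G is reduced in proportion to M, preserving mid-gray tissue relative to
    the brightest echoes. The value of g_in and the linkage form are assumptions."""
    assert 0.0 <= m < g_in < M <= 255.0
    G = g_out * (M / 255.0) if link_g_to_M else g_out
    x = np.arange(256, dtype=float)
    xp = [0.0, m, g_in, M, 255.0]
    fp = [0.0, 0.0, G, 255.0, 255.0]
    return np.interp(x, xp, fp).astype(np.uint8)
```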
Other variations of the dynamic range control of the present invention described above will readily occur to those skilled in the art. For example, a user interface control that adjusts M for brightness can also be used to control the value of the speckle dynamic range SDR applied to the detail D. When the control is adjusted to decrease M and thereby increase the overall tissue brightness in the image, the SDR is increased at the same time, which has the effect of reducing speckle variation and smoothing the image as the tissue brightness increases.
It should be noted that ultrasound systems (and in particular the component structure of the ultrasound system of figure 1) suitable for use with embodiments of the present invention may be implemented in hardware, software, or a combination thereof. The various embodiments and/or components of the ultrasound system, or components and controls therein, may also be implemented as part of one or more computers or microprocessors. The computer or processor may include, for example, a computing device for accessing the internet, an input device, a display unit, and an interface. The computer or processor may comprise a microprocessor. The microprocessor may be connected to a communication bus to, for example, access a PACS system or a data network to import training images. The computer or processor may also include memory. The memory devices (e.g., image memory 32) may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor may also include a storage device, which may be a hard disk drive or a removable storage drive (e.g., a floppy disk drive, an optical disk drive, a solid state thumb drive, etc.). The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
The terms "computer" or "module" or "processor" or "workstation" as used herein may include any processor-based or microprocessor-based system, including systems using microcontrollers, Reduced Instruction Set Computers (RISC), ASICs, logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and thus are not intended to limit the definition and/or meaning of these terms in any way.
The computer or processor executes a set of instructions stored in one or more storage elements in order to process input data. The storage elements may also store data or other information as desired. The storage elements may be in the form of information sources or physical storage elements within the processing machine.
The set of instructions of the ultrasound system, including those instructions that control the acquisition, processing, and display of ultrasound images as described above, may include various commands that instruct the computer or processor as a processing machine to perform specific operations such as the methods and processes of the various embodiments of the present invention. The set of instructions may be in the form of a software program. The software may be in various forms (e.g., system software or application software) and may be embodied as tangible and non-transitory computer readable media. Additionally, the software may be in the form of a collection of separate programs or modules (e.g., a transmission control module), a program module or portion of a program module within a larger program. The software may also include modular programming in the form of object-oriented programming. The processing of input data by a processing machine may be in response to operator commands, or in response to results of previous processing, or in response to a request made by another processing machine.
Furthermore, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted under 35 U.S.C. 112, sixth paragraph, unless and until such claim limitations expressly use the phrase "means for" followed by a statement of function devoid of further structure.

Claims (15)

1. An ultrasound imaging system with image dynamic range control, comprising:
a transducer array probe;
a beamformer coupled to the probe;
an image processor coupled to the beamformer and configured to produce an ultrasound image;
a dynamic range processor coupled to the image processor and adapted to generate an approximation image in response to an input image, to apply dynamic range compression to the approximation image and to add image details to the dynamic range compressed approximation image; and
a display coupled to the dynamic range processor and adapted to display a dynamic range compressed approximation image with added image detail.
2. The ultrasound imaging system of claim 1, wherein the approximation image further comprises a spatial low-pass version of the input image.
3. The ultrasound imaging system of claim 1, wherein the dynamic range processor is further adapted to one or more of: convert a low range of approximation image values to a black value or convert a high range of approximation image values to a white value.
4. The ultrasound imaging system of claim 1, wherein the image details further comprise a difference between the input image and the approximate image.
5. The ultrasound imaging system of claim 3, further comprising a user control coupled to the dynamic range processor.
6. The ultrasound imaging system of claim 5, wherein the user control is further adapted to define a low range of input values to be converted to black values.
7. The ultrasound imaging system of claim 6, wherein the user control is further adapted to simultaneously define a high range of input values to be converted to white values.
8. The ultrasound imaging system of claim 5, wherein the user control is further adapted to define a high range of input values to be converted to white values.
9. The ultrasound imaging system of claim 2, wherein the dynamic range processor is further adapted to generate an approximate image value from a plurality of input image values.
10. The ultrasound imaging system of claim 9, wherein the dynamic range processor is further adapted to spatially low pass filter the input image values with spatially adjacent image input values.
11. The ultrasound imaging system of claim 1, wherein the image details added to the dynamic range compressed approximation image further include speckle dynamic range information.
12. The ultrasound imaging system of claim 11, wherein the image details added to the dynamic range compressed approximation image further comprise image details weighted by a speckle dynamic range term.
13. The ultrasound imaging system of claim 12, further comprising: a user interface control coupled to the dynamic range processor and adapted to enable a user to change the speckle dynamic range term.
14. The ultrasound imaging system of claim 1, further comprising: a user control adapted to weight the image detail to be added to the dynamic range compressed approximation image.
15. The ultrasound imaging system of claim 14, wherein the user control is further adapted to smooth the image produced by the display.
CN201980048987.5A 2018-07-24 2019-07-19 Ultrasound imaging system with improved dynamic range control Active CN112513673B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862702682P 2018-07-24 2018-07-24
US62/702,682 2018-07-24
PCT/EP2019/069467 WO2020020761A1 (en) 2018-07-24 2019-07-19 Ultrasound imaging system with improved dynamic range control

Publications (2)

Publication Number Publication Date
CN112513673A true CN112513673A (en) 2021-03-16
CN112513673B CN112513673B (en) 2024-01-12

Family

ID=67390078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980048987.5A Active CN112513673B (en) 2018-07-24 2019-07-19 Ultrasound imaging system with improved dynamic range control

Country Status (5)

Country Link
US (1) US11908110B2 (en)
EP (1) EP3827281B1 (en)
JP (1) JP7256258B2 (en)
CN (1) CN112513673B (en)
WO (1) WO2020020761A1 (en)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4771470A (en) * 1985-11-14 1988-09-13 University Of Florida Noise reduction method and apparatus for medical ultrasound
US5799111A (en) * 1991-06-14 1998-08-25 D.V.P. Technologies, Ltd. Apparatus and methods for smoothing images
US5835618A (en) 1996-09-27 1998-11-10 Siemens Corporate Research, Inc. Uniform and non-uniform dynamic range remapping for optimum image display
US6013032A (en) 1998-03-13 2000-01-11 Hewlett-Packard Company Beamforming methods and apparatus for three-dimensional ultrasound imaging using two-dimensional transducer array
US5997479A (en) 1998-05-28 1999-12-07 Hewlett-Packard Company Phased array acoustic systems with intra-group processors
US6468216B1 (en) 2000-08-24 2002-10-22 Kininklijke Philips Electronics N.V. Ultrasonic diagnostic imaging of the coronary arteries
US6743174B2 (en) * 2002-04-01 2004-06-01 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with automatically controlled contrast and brightness
US7720268B2 (en) * 2005-07-15 2010-05-18 Siemens Corporation System and method for ultrasound specific segmentation using speckle distributions
US20070083114A1 (en) * 2005-08-26 2007-04-12 The University Of Connecticut Systems and methods for image resolution enhancement
WO2007067200A2 (en) * 2005-10-26 2007-06-14 Aloka Co., Ltd. Method and apparatus for elasticity imaging
US20110054317A1 (en) * 2009-08-31 2011-03-03 Feng Lin Tracking and optimizing gain and contrast in real-time for ultrasound imaging
US9324137B2 (en) 2012-10-24 2016-04-26 Marvell World Trade Ltd. Low-frequency compression of high dynamic range images
CN103971340A (en) 2014-05-15 2014-08-06 中国科学院光电技术研究所 High bit-wide digital image dynamic range compression and detail enhancement method
US9460499B2 (en) * 2014-05-30 2016-10-04 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Systems and methods for selective enhancement of a region of interest in an image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5793883A (en) * 1995-09-29 1998-08-11 Siemens Medical Systems, Inc. Method for enhancing ultrasound image
WO2008010375A1 (en) * 2006-07-20 2008-01-24 Hitachi Medical Corporation Ultrasonographic device
WO2014069374A1 (en) * 2012-11-01 2014-05-08 日立アロカメディカル株式会社 Medical image diagnostic device and medical image generation method
CN104490418A (en) * 2014-09-25 2015-04-08 深圳市恩普电子技术有限公司 Automatic ultrasonic-image optimization method based on signal statistic analysis
CN107862724A (en) * 2017-12-01 2018-03-30 中国医学科学院生物医学工程研究所 A kind of improved microvascular blood flow imaging method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210389438A1 (en) * 2017-12-29 2021-12-16 Koninklijke Philips N.V. System and method for adaptively configuring dynamic range for ultrasound image display
US11561296B2 (en) * 2017-12-29 2023-01-24 Koninklijke Philips N.V. System and method for adaptively configuring dynamic range for ultrasound image display

Also Published As

Publication number Publication date
JP2021531860A (en) 2021-11-25
EP3827281B1 (en) 2024-01-03
JP7256258B2 (en) 2023-04-11
EP3827281A1 (en) 2021-06-02
WO2020020761A1 (en) 2020-01-30
US20210248724A1 (en) 2021-08-12
US11908110B2 (en) 2024-02-20
CN112513673B (en) 2024-01-12

Similar Documents

Publication Publication Date Title
US8425422B2 (en) Adaptive volume rendering for ultrasound color flow diagnostic imaging
EP1686393A2 (en) Coherence factor adaptive ultrasound imaging
JP6356216B2 (en) Ultrasound diagnostic imaging system.
JP6207562B2 (en) Shadow suppression in ultrasound imaging
JPH1128210A (en) Three-dimensional imaging system and method for ultrasonic scattering matter
JP2000300562A (en) Method and device for arranging region of interest within image
JP2002526229A (en) Ultrasound diagnostic imaging system with reduced spatial synthesis seam artifact
JPH1176236A (en) Three-dimensional imaging system and method for ultrasonic scattering medium in object
JP2001507794A (en) Ultrasonic scan conversion method with spatial dithering
JPH1128209A (en) Ultrasonic imaging method and system
US20140066768A1 (en) Frequency Distribution in Harmonic Ultrasound Imaging
JP2002534185A (en) Automatic spectral optimization method for Doppler ultrasound
US11408987B2 (en) Ultrasonic imaging with multi-scale processing for grating lobe suppression
CN112912762A (en) Adaptive ultrasound flow imaging
US20230355214A1 (en) Ultrasound imaging system with tissue specific presets for diagnostic exams
US20070083109A1 (en) Adaptive line synthesis for ultrasound
CN112513673B (en) Ultrasound imaging system with improved dynamic range control
WO2021018968A1 (en) Ultrasonic imaging of acoustic attenuation coefficients with elevation compounding
CN112119327A (en) Synthetic transmission focused ultrasound system with sonic velocity aberration mapping
US11607194B2 (en) Ultrasound imaging system with depth-dependent transmit focus
JP4276532B2 (en) Ultrasonic diagnostic equipment
US11953591B2 (en) Ultrasound imaging system with pixel extrapolation image enhancement
JP7366137B2 (en) Ultrasonic imaging of acoustic attenuation coefficient with reliability estimation
US20210055398A1 (en) Ultrasound system with improved noise performance by persistence processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant