CN116831625A - Ultrasonic imaging method and equipment - Google Patents

Ultrasonic imaging method and equipment

Info

Publication number: CN116831625A
Application number: CN202210306929.1A
Authority: CN (China)
Prior art keywords: ultrasonic, pixel, image, beam forming, target tissue
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 刘敬, 郭冲冲
Current Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to CN202210306929.1A
Publication of CN116831625A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5207 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5246 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
    • A61B8/5253 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining overlapping images, e.g. spatial compounding


Abstract

Embodiments of the invention provide an ultrasonic imaging method and apparatus. The method includes: processing the same channel echo data of a target tissue with at least two different ultrasonic beam forming methods, and acquiring an initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method; for each pixel point in each initial ultrasonic image, determining a fusion coefficient of the pixel point according to the contrast and the pixel quality of the pixel point, so as to obtain a fusion coefficient map corresponding to each initial ultrasonic image; and fusing all the obtained initial ultrasonic images of the target tissue according to the fusion coefficient map corresponding to each initial ultrasonic image, so as to obtain a fused ultrasonic image of the target tissue. The method provided by the embodiments of the invention can integrate the advantages of the individual beam forming methods and improve the quality of the ultrasonic image.

Description

Ultrasonic imaging method and equipment
Technical Field
The embodiment of the invention relates to the technical field of medical ultrasonic imaging, in particular to an ultrasonic imaging method and equipment.
Background
Ultrasonic diagnosis is a technique in which ultrasonic waves are transmitted to a tissue of a subject, echoes of the ultrasonic waves returned from the tissue are received, an ultrasonic image of the tissue is generated, and clinical diagnosis is performed based on the obtained ultrasonic image. Owing to its advantages of non-invasiveness, low cost and strong real-time performance, it has been widely used in clinical practice. Because clinical diagnosis is based on ultrasound images of the tissue, the quality of the ultrasound images directly affects the accuracy of the diagnosis.
Beam forming is the most critical step in an ultrasound imaging system, and its quality directly determines the quality of the resulting ultrasound image. The beam forming methods commonly used in the industry at present include the delay-and-sum beam forming method, the minimum variance beam forming method, the coherence factor beam forming method, and the like. The ultrasound image obtained by the delay-and-sum beam forming method can display the overall structure of the tissue well, but tissue boundaries and small lesions are not displayed clearly enough; the minimum variance beam forming method can improve the spatial resolution of the ultrasound image, but the ultrasound image of uniform tissue is not compact; the coherence factor beam forming method can make the structure of strongly reflecting tissue clearer, but it may also cause weakly reflecting tissue not to be displayed in the ultrasound image.
In summary, existing ultrasonic beam forming methods each have certain limitations, and it is difficult to obtain an optimal ultrasound image with a single beam forming method. Improving the quality of ultrasound images is a goal continually pursued by those skilled in the art.
Disclosure of Invention
The embodiment of the invention provides an ultrasonic imaging method and equipment, which are used for improving the quality of ultrasonic images.
In a first aspect, an embodiment of the present invention provides an ultrasound imaging method, comprising:
acquiring initial ultrasonic images of the target tissue corresponding to each ultrasonic beam synthesis method after processing the same channel echo data of the target tissue by adopting at least two different ultrasonic beam synthesis methods;
for each pixel point in each initial ultrasonic image, determining a fusion coefficient of the pixel point according to the contrast and the pixel quality of the pixel point, so as to obtain a fusion coefficient diagram corresponding to each initial ultrasonic image, wherein the contrast is positively correlated with the pixel value variation of the pixel point relative to the adjacent pixel point, the pixel quality is negatively correlated with the difference value between the pixel value of the pixel point and the pixel median value, and the fusion coefficient is positively correlated with the contrast and the pixel quality respectively;
and fusing all the obtained initial ultrasonic images of the target tissue according to the obtained fusion coefficient map corresponding to each initial ultrasonic image to obtain a fused ultrasonic image of the target tissue.
In one embodiment, fusing all the obtained initial ultrasound images of the target tissue according to the obtained fusion coefficient map corresponding to each initial ultrasound image includes:
for each pixel point, calculating the sum of the products of the pixel values at the position of that pixel point in all the initial ultrasound images and the corresponding fusion coefficients, to obtain the pixel value at the position of that pixel point in the fused ultrasound image.
In one embodiment, fusing all the obtained initial ultrasound images of the target tissue according to the obtained fusion coefficient map corresponding to each initial ultrasound image includes:
carrying out multi-scale decomposition on each initial ultrasonic image and the corresponding fusion coefficient graph thereof to obtain ultrasonic sub-images and corresponding fusion coefficient sub-images under a plurality of scales;
fusing all the ultrasonic sub-images under the same scale according to the fusion coefficient subgraphs corresponding to the ultrasonic sub-images to obtain fused ultrasonic sub-images under the scale;
reconstructing the fused ultrasound sub-images under a plurality of scales to obtain the fused ultrasound images.
In one embodiment, the multi-scale decomposition is performed on each initial ultrasound image and the corresponding fusion coefficient graph thereof, including:
carrying out Laplacian pyramid decomposition on each initial ultrasonic image;
and carrying out Gaussian pyramid decomposition on the fusion coefficient graph corresponding to each initial ultrasonic image.
In one embodiment, the pixel quality of a pixel point is determined according to the following expression:

q(x, y) = exp( -(g(x, y) - μ)^2 / (2σ^2) )

where (x, y) denotes the coordinates of the pixel point, q(x, y) denotes the pixel quality of the pixel point, g(x, y) denotes the normalized pixel value of the pixel point, μ denotes the normalized pixel median, and σ is a constant between 0 and 1.
In one embodiment, the pixel quality is determined by means of a look-up table.
In one embodiment, before processing the same channel echo data of the target tissue using at least two different ultrasound beam forming methods, the method further comprises:
displaying a plurality of ultrasonic beam forming methods on a display interface in a text or icon mode for a user to select, and determining at least two different ultrasonic beam forming methods according to the selection operation of the user;
or, alternatively,
and determining at least two different ultrasonic beam forming methods according to the type of the target tissue and the mapping relation between the type of the pre-established tissue and the ultrasonic beam forming methods.
In one embodiment, the method further comprises:
and displaying the initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method on a display interface.
In a second aspect, an embodiment of the present invention provides an ultrasound imaging apparatus including:
an ultrasonic probe;
The transmitting circuit is used for outputting a corresponding transmitting sequence to the ultrasonic probe according to a set mode so as to control the ultrasonic probe to transmit corresponding ultrasonic waves;
the receiving circuit is used for receiving the ultrasonic echo signal output by the ultrasonic probe and outputting ultrasonic echo data;
the display is used for outputting visual information;
a processor for performing the ultrasound imaging method of any of the first aspects.
In a third aspect, embodiments of the present invention provide a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, are adapted to carry out the ultrasound imaging method according to any of the first aspects.
According to the ultrasonic imaging method and the ultrasonic imaging device, the same channel echo data of the target tissue are processed by adopting at least two different ultrasonic beam forming methods, the obtained initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method is fused by taking the pixel as a unit and adopting the fusion coefficient positively related to the contrast and the pixel quality, so that the detailed information and the overall structure information of the target tissue are considered, the fused ultrasonic image can integrate the advantages of each beam forming method, the overall structure information and the detailed information of the target tissue are better displayed, and the quality of the ultrasonic image is improved.
Drawings
FIG. 1 is a block diagram of an ultrasound imaging apparatus according to an embodiment of the present application;
FIG. 2 is a flow chart of an ultrasound imaging method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an ultrasound imaging process according to an embodiment of the present application;
FIG. 4 is a flow chart of an ultrasound imaging method according to yet another embodiment of the present application;
FIG. 5 is a schematic diagram of a multi-scale ultrasound imaging process according to one embodiment of the present application;
fig. 6 is a schematic diagram of an interface for selecting ultrasonic beam forming methods according to an embodiment of the present application.
Detailed Description
The application will be described in further detail below with reference to the drawings by means of specific embodiments, wherein like elements in different embodiments are given like associated numbers. In the following embodiments, numerous specific details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of these features may be omitted in different situations, or may be replaced by other elements, materials, or methods. In some instances, operations related to the present application are not shown or described in the specification, in order to avoid the core portions of the present application being obscured by excessive description; for those skilled in the art, a detailed description of such related operations is not necessary, as they can be fully understood from the description in the specification and from the general technical knowledge in the art.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
The numbering of the components itself, e.g. "first", "second", etc., is used herein merely to distinguish between the described objects and does not have any sequential or technical meaning. The term "coupled" as used herein includes both direct and indirect coupling (coupling), unless otherwise indicated.
Ultrasonic imaging equipment is widely used due to the advantages of non-invasiveness, low cost, strong real-time performance and the like. As shown in fig. 1, an ultrasonic imaging apparatus provided in an embodiment of the present application may include: the ultrasound probe 20, the transmit/receive circuitry 30 (i.e., the transmit circuitry 310 and the receive circuitry 320), the processor 40, the memory 50, and the human interaction device 60. The processor 40 may include, among other things, a transmit/receive sequence control module 410, a beam forming module 420, and an image processing module 440.
The ultrasonic probe 20 includes a transducer (not shown in the figure) composed of a plurality of array elements arranged in an array; the array elements may be arranged in a row to form a linear array, arranged in a two-dimensional matrix to form an area array, or arranged to form a convex array. The array elements are used to transmit an ultrasonic beam according to an excitation electrical signal, or to convert a received ultrasonic beam into an electrical signal. Each array element can therefore be used to convert between electrical pulse signals and ultrasonic beams, so as to transmit ultrasonic waves toward the target tissue of the examination object (human or animal) and to receive echoes of the ultrasonic waves reflected back by the target tissue. When performing ultrasonic detection, the transmitting circuit 310 and the receiving circuit 320 can be used to control which array elements are used to transmit the ultrasonic beam and which array elements are used to receive the echoes of the ultrasonic beam, or to control the time slots in which the array elements are used to transmit the ultrasonic beam or to receive the echoes of the ultrasonic beam. The array elements participating in ultrasonic transmission can be excited by electrical signals at the same time so as to transmit ultrasonic waves simultaneously, or can be excited by several electrical signals with certain time intervals so as to transmit ultrasonic waves continuously with certain time intervals.
In this embodiment, the user may select a suitable position and angle by moving the ultrasonic probe 20, so as to transmit ultrasonic waves to the target tissue 10, receive the echoes of the ultrasonic waves returned by the target tissue 10, and obtain and output the electrical signals of the echoes. The electrical signals of the echoes are channel analog electrical signals formed with each receiving array element as a channel, and carry amplitude information, frequency information and time information.
The transmitting circuit 310 is configured to generate a transmit sequence under the control of the transmit/receive sequence control module 410. The transmit sequence is used to control some or all of the plurality of array elements to transmit ultrasonic waves to the biological tissue, and the transmit sequence parameters include the positions of the transmitting array elements, the number of array elements, and ultrasonic beam transmission parameters (such as amplitude, frequency, number of transmissions, transmission interval, transmission angle, waveform, focusing position, etc.). In some cases, the transmitting circuit 310 is further configured to delay the phases of the transmitted beams, so that different transmitting array elements transmit ultrasonic waves at different times and each transmitted ultrasonic beam can be focused at a predetermined region of interest. The transmit sequence parameters may differ between different modes of operation, such as B-image mode, C-image mode and D-image mode (Doppler mode); after the echo signals are received by the receiving circuit 320 and processed by subsequent modules and corresponding algorithms, a B-image reflecting the anatomy of the target tissue, a C-image reflecting the anatomy and blood flow information of the target tissue, or a D-image reflecting a Doppler spectrum can be generated.
The receiving circuit 320 is used for receiving the electric signal of the ultrasonic echo from the ultrasonic probe 20 and processing the electric signal of the ultrasonic echo. The receive circuitry 320 may include one or more amplifiers, analog-to-digital converters (ADCs), and the like. The amplifier is used for amplifying the received electric signal of the ultrasonic echo after proper gain compensation, and the analog-to-digital converter is used for sampling the analog echo signal according to a preset time interval so as to convert the analog echo signal into a digitized signal, and the digitized echo signal still maintains amplitude information, frequency information and phase information. The data output by the receiving circuit 320 may be output to the beam forming module 420 for processing or may be output to the memory 50 for storage.
The processor 40 may be a central processing unit (CPU), one or more microprocessors, a graphics processing unit (GPU), or any other electronic component capable of processing input data according to specific logic instructions. The processor 40 may control peripheral electronic components, or read data from and/or save data to the memory 50, according to input instructions or predetermined instructions, and may also process the input data by executing programs stored in the memory 50, for example performing one or more processing operations on the acquired ultrasound data according to one or more modes of operation. Such operations include, but are not limited to, adjusting or defining the form of the ultrasonic waves emitted by the ultrasound probe 20, generating various image frames for display by the display of the subsequent human-machine interaction device 60, adjusting or defining the content and form displayed on the display, and adjusting one or more image display settings (e.g., ultrasound images, interface components, located regions of interest) shown on the display.
The beam forming module 420 may be in signal connection with the receiving circuit 320, and is configured to perform beam forming processing on the channel echo data of the target tissue output by the receiving circuit 320, so as to obtain beamformed ultrasonic image data, where the channel echo data is an un-demodulated radio frequency signal, and the ultrasonic image data output by the beam forming module 420 is also radio frequency data (RF data). The beam forming module 420 may output the radio frequency data to an IQ demodulation module (not shown) for demodulation to obtain baseband data. The IQ demodulation module may output the baseband data to the image processing module 440 of the processor 40 for image processing, and may also output the baseband data to the memory 50 for buffering or saving, so that the image processing module 440 reads the data from the memory 50 for subsequent image processing. The beam forming module 420 may also output the rf data to the memory 50 for buffering or saving, or directly output the rf data to the image processing module 440 of the processor 40 for image processing.
In an alternative embodiment, an IQ demodulation module (not shown in the figure) may be further located between the beam forming module 420 and the receiving circuit 320, that is, the IQ demodulation module may be signal-connected to the receiving circuit 320, and remove a signal carrier through IQ demodulation, extract tissue structure information included in the signal, and perform filtering to remove noise, where the obtained signal is called a baseband signal; the beam forming module 420 may be connected to the IQ demodulation module by a signal, and configured to perform beam forming processing on the channel echo data of the target tissue output by the IQ demodulation module, so as to obtain beamformed ultrasonic image data, where the channel echo data is a demodulated baseband signal, and the ultrasonic image data output by the beam forming module 420 is also baseband data. The beam forming module 420 may also output the baseband data to the memory 50 for buffering or saving, or directly output the baseband data to the image processing module 440 of the processor 40 for image processing.
In summary, the channel echo data processed by the beam forming module 420 may be either a radio frequency signal or a demodulated baseband signal.
The beam forming module 420 may perform the above-described functions in hardware, firmware, or software. For example, the beam forming module 420 may include a central processing unit (CPU), one or more micro-processing chips, or any other electronic component capable of processing input data according to specific logic instructions. When the beam forming module 420 is implemented in software, it may execute instructions stored on a tangible and non-transitory computer-readable medium (e.g., the memory 50) to perform the beam forming calculations using any suitable beam forming method.
The image processing module 440 is configured to process the data output by the beam forming module 420 or the data output by an IQ demodulation module (not shown in the figure) to generate an ultrasound image of the target tissue. The image processing module 440 may output the obtained ultrasound image of the target tissue to a display of the human-machine interaction device 60 for display.
The memory 50 may be a tangible and non-transitory computer-readable medium, such as a flash memory card, solid-state memory, a hard disk, or the like, for storing data or programs. For example, the memory 50 may be used to store acquired ultrasound data or image frames generated by the processor 40 that are not displayed immediately, or the memory 50 may store a graphical user interface, one or more default image display settings, and programming instructions for the processor, the beam forming module 420, or the IQ demodulation module.
The human-machine interaction device 60 is used for human-machine interaction, that is, for receiving user input and outputting visual information. User input may be received via a keyboard, operation buttons, a mouse, a trackball and the like, or via a touch screen integrated with the display; visual information is output via the display.
It should be noted that the structure shown in fig. 1 is only illustrative, and more or fewer components/modules than those shown in fig. 1 may be included in actual use, or have a different configuration from that shown in fig. 1. The various components/modules shown in fig. 1 may be implemented in hardware and/or software. The ultrasound imaging apparatus shown in fig. 1 may be used to perform the ultrasound imaging method provided by any of the embodiments of the present invention.
Because the distances from the ultrasonic receiving points in the target tissue to different receiving array elements are different, the channel echo data of the same ultrasonic receiving point output by the different receiving array elements have delay differences, and the differences need to be eliminated through beam synthesis. At present, ultrasonic beam forming methods capable of being used for ultrasonic imaging equipment are various, and a time delay superposition beam forming method, a minimum variance beam forming method, a coherence factor beam forming method and the like are commonly used.
The delay-and-sum (DAS) beam forming method comprises three steps: delay, apodization and summation. Owing to its simple principle, ease of implementation and high image robustness, the DAS beam forming method has been widely applied in ultrasound imaging equipment. The method aligns the channel echo data according to corresponding delay coefficients, and then performs weighted summation according to corresponding apodization coefficients. The delay coefficients and the apodization coefficients are the two important parameters of the DAS beam forming method. The delay coefficient may be determined according to the geometric position of the imaging point in the transmitted sound field and its geometric distance from the array element position, and the apodization coefficients are typically given by a fixed window function such as a rectangular window, a Hanning window, a Gaussian window or a semicircular window. Applying different window functions yields different image effects. Specifically, for a rectangular window the main lobe of a target point is narrow but the side lobes are high, which in the image corresponds to higher spatial resolution but possible clutter interference; for a Gaussian window the main lobe of the target point is wide and the side lobes are low, which corresponds to poorer spatial resolution but less clutter and a lower noise level. In practical applications, an ultrasound research engineer needs to adjust the delay coefficients and apodization coefficients according to the actual situation in order to trade off these characteristics of the ultrasound image.
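To make the three steps concrete, the following Python sketch shows a minimal delay-and-sum beamformer for a single image line. It is only an illustration of the general DAS principle, not the implementation of this disclosure; the element geometry, sampling rate, assumed sound speed of 1540 m/s and the Hann window are illustrative assumptions.

```python
import numpy as np

def das_beamform_line(channel_data, element_x, image_z, fs, c=1540.0, window="hann"):
    """Minimal delay-and-sum sketch for one image line at lateral position x = 0.

    channel_data : (num_elements, num_samples) array of channel echo data
    element_x    : (num_elements,) lateral positions of the receive elements [m]
    image_z      : (num_points,) depths of the imaging points on this line [m]
    fs           : sampling rate of the channel data [Hz]
    c            : assumed speed of sound [m/s]
    """
    num_elements, num_samples = channel_data.shape
    # Apodization: a fixed window function, as in conventional DAS.
    apod = np.hanning(num_elements) if window == "hann" else np.ones(num_elements)

    line = np.zeros(len(image_z))
    for k, z in enumerate(image_z):
        # Delay: the transmit path is assumed to reach depth z at t = z / c,
        # the receive path is the geometric distance from the point to each element.
        t_rx = np.sqrt(z ** 2 + element_x ** 2) / c
        idx = np.round((z / c + t_rx) * fs).astype(int)
        valid = idx < num_samples
        # Align (delay), weight (apodize) and sum across the channels.
        line[k] = np.sum(apod[valid] * channel_data[np.flatnonzero(valid), idx[valid]])
    return line
```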
As can be seen, the ultrasound image obtained by the DAS beam forming method can display the overall structure of the tissue well, but tissue boundaries and small lesions are not displayed clearly enough. This is on the one hand because the assumptions made in calculating the delay coefficients do not fully match the actual situation, and on the other hand because fixed apodization coefficients are used and cannot adapt to the characteristics of the ultrasound echo signal. In order to improve the quality of the ultrasound image, beam forming methods that adapt to the characteristics of the ultrasound echo signal have been developed, such as the minimum variance (MV) beam forming method and the coherence factor (CF) beam forming method.
The MV beam forming method delays the channel echo data, calculates the covariance matrix of the delayed channel echo data, calculates a set of apodization coefficients with minimization of the noise and interference in the output as the optimization target, and performs weighted summation of the delayed channel echo data according to the obtained apodization coefficients to obtain the beam forming output. Since the MV beam forming method adaptively calculates the apodization coefficients in real time from the received channel echo data, the apodization coefficients are not fixed. The method can improve the spatial resolution and contrast resolution of the ultrasound image, but the speckle noise variance is large and the image background noise is high, so that the ultrasound image of uniform tissue is not compact; it therefore has certain drawbacks.
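As a hedged sketch of the adaptive step, the classic minimum-variance (Capon) weights w = R^-1 a / (a^H R^-1 a) can be computed for each imaging point from the delayed channel data and then used as the apodization coefficients. The diagonal loading term (and, in practice, subaperture averaging of the covariance matrix) are common robustness additions assumed here rather than details taken from this disclosure.

```python
import numpy as np

def mv_weights(aligned_snapshot, loading=1e-2):
    """Minimum-variance (Capon) apodization weights for one imaging point.

    aligned_snapshot : (num_elements,) delayed (time-aligned) channel samples
    loading          : diagonal loading factor for numerical robustness (assumed)
    """
    x = np.asarray(aligned_snapshot)[:, None]        # column vector
    R = x @ x.conj().T                               # sample covariance (rank 1 here; in
                                                     # practice averaged over subapertures)
    R = R + loading * (np.trace(R).real / len(x) + 1e-12) * np.eye(len(x))
    a = np.ones((len(x), 1))                         # steering vector after delay alignment
    Rinv_a = np.linalg.solve(R, a)
    return (Rinv_a / (a.conj().T @ Rinv_a)).ravel()  # w = R^-1 a / (a^H R^-1 a)

def mv_output(aligned_snapshot):
    w = mv_weights(aligned_snapshot)
    return np.vdot(w, aligned_snapshot)              # weighted summation of the aligned data
```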
The CF beam forming method delays the channel echo data to obtain delayed channel echo data, and assigns different weights to the delayed channel echo data according to certain prior knowledge to obtain a weighted summation output. Assuming that the effective signals in the channel echo data have a certain coherence while the noise signals are incoherent, a coherence coefficient is calculated from the coherent and incoherent components of the delayed channel echo data to obtain a coherence factor. The obtained coherence factor is then combined with the weighted summation output to obtain the beam forming output. The CF beam forming method has an obvious effect on strongly reflected signals and can make the structure of strongly reflecting tissue clearer and enhance the contrast of the image; for weak signals, however, the calculated coherence factor is small, so the weak signals are further compressed and parts of the image cannot be displayed.
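One common form of the coherence factor is CF = |sum_i s_i|^2 / (N * sum_i |s_i|^2), which is close to 1 for coherent (strongly reflected) signals and small for incoherent noise; the sketch below applies it as a multiplicative weight on the weighted-summation output. The exact formula and the ordering of the steps are assumptions here and may differ between the CF variants mentioned later in this description.

```python
import numpy as np

def coherence_factor(aligned_snapshot):
    """Coherence factor of the delayed channel data for one imaging point."""
    s = np.asarray(aligned_snapshot)
    num = np.abs(np.sum(s)) ** 2                     # coherent (in-phase) energy
    den = len(s) * np.sum(np.abs(s) ** 2) + 1e-12    # total energy; small constant avoids /0
    return num / den

def cf_beamform_point(aligned_snapshot, apod=None):
    """Weighted summation followed by coherence-factor weighting (a hedged sketch)."""
    s = np.asarray(aligned_snapshot)
    w = np.ones_like(s) if apod is None else apod
    return coherence_factor(s) * np.sum(w * s)
```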
It should be noted that there are many other ultrasonic beam forming methods suitable for use in an ultrasound imaging apparatus, which are not described in detail here. In summary, although the ultrasound image obtained by the DAS beam forming method can display the overall structure of the tissue well, tissue boundaries and small lesions are not displayed clearly enough; although the MV beam forming method can improve the spatial resolution of the ultrasound image, the ultrasound image of uniform tissue is not compact; and although the CF beam forming method can make the structure of strongly reflecting tissue clearer, it may also result in weakly reflecting tissue not being displayed in the ultrasound image.
Different ultrasonic beam forming methods therefore have different advantages and can highlight certain aspects of the information of the target tissue, but they also have certain limitations, and it is difficult to obtain an optimal ultrasound image with only one ultrasonic beam forming method. In order to improve the quality of ultrasound images, the applicant proposes an ultrasonic imaging method that fuses the ultrasound images obtained by different ultrasonic beam forming methods. Specific embodiments are described in detail below to explain how the advantages of each ultrasonic beam forming method are integrated in the fusion process so as to obtain a higher-quality ultrasound image.
Referring to fig. 2, an ultrasound imaging method according to an embodiment of the present invention may include:
s101, acquiring an initial ultrasonic image of the target tissue corresponding to each ultrasonic beam synthesis method after processing the same channel echo data of the target tissue by adopting at least two different ultrasonic beam synthesis methods.
Herein, when referring to "different beam forming methods" or "multiple beam forming methods", it means that the beam forming methods differ in at least one of principle, steps and parameters. This includes both the case where the beam forming algorithms (principles) are different and the case where the algorithm (principle) is the same but the steps differ (e.g., steps are added or removed, or the order of steps is changed) or the parameters used are different; the beam forming methods in these cases are all regarded as "different" or "different kinds of" beam forming methods.
As used herein, the term "channel echo data" may refer to data corresponding to a channel (corresponding to one or more array elements) of an ultrasound imaging device prior to beam forming, for example, either a radio frequency signal prior to demodulation, a baseband signal after demodulation, and so on.
In this embodiment, the channel echo data of the target tissue may be obtained in real time, that is, the ultrasonic probe transmits ultrasonic waves to the target tissue and receives the ultrasonic echoes returned from the target tissue, so that the channel echo data of the target tissue are obtained in real time; alternatively, channel echo data of the target tissue stored in advance may be acquired from a storage device. The channel echo data may be either a radio-frequency signal before demodulation or a baseband signal after demodulation. The same channel echo data processed by the different ultrasonic beam forming methods may be data obtained from one ultrasonic scan of the target tissue, which reduces the data acquisition time and the requirement on data storage space.
At least two different ultrasonic beam forming methods refers to ultrasonic beam forming methods that differ in at least one of principle, steps and parameters, and may include, for example, at least two of the delay-and-sum beam forming method, the minimum variance adaptive beam forming method, the coherence factor beam forming method, non-coherent beam forming methods, frequency-domain beam forming methods, and so on. The case where the channel echo data of the target tissue are processed by a first ultrasonic beam forming method, a second ultrasonic beam forming method and a third ultrasonic beam forming method, respectively, is described as an example. In an alternative embodiment, the first ultrasonic beam forming method may be a DAS beam forming method, the second may be an MV beam forming method, and the third may be a CF beam forming method, i.e., ultrasonic beam forming methods with different principles are adopted. In yet another alternative embodiment, the first ultrasonic beam forming method may be a rectangular-window DAS beam forming method, the second may be a Gaussian-window DAS beam forming method, and the third may be a semicircular-window DAS beam forming method, i.e., ultrasonic beam forming methods with the same principle but different parameters are adopted. In another alternative embodiment, the first ultrasonic beam forming method may be a DAS beam forming method, the second may be a CF beam forming method that calculates the coherence factor, performs the weighted summation and then superimposes the coherence factor, and the third may be a CF beam forming method that performs the weighted summation, then calculates the coherence factor and superimposes it, i.e., ultrasonic beam forming methods with different principles and different steps are adopted. It should be noted that the above combinations are only examples and are not limiting; in practice, at least two different ultrasonic beam forming methods may be selected according to the specific situation.
It can be understood that different ultrasonic beam forming methods can highlight information of the target tissue from different angles, for example, a DAS beam forming method can highlight an overall structure of the target tissue, an MV beam forming method can highlight a micro focus in the target tissue by improving spatial resolution, and a CF beam forming method can enable a portion with strong reflection in the target tissue to be displayed more clearly. Therefore, the same channel echo data are respectively processed by adopting at least two different ultrasonic beam forming methods, so that the information of the target tissue can be obtained from a plurality of different angles, the information contained in the channel echo data can be fully mined, and the quality of ultrasonic imaging is improved.
It should be noted that, the ultrasonic beam forming method used for processing the channel echo data in this embodiment includes both the existing ultrasonic beam forming method in the field and the ultrasonic beam forming method that may occur in the future in the field.
The same channel echo data of the target tissue is processed by adopting different ultrasonic beam forming methods, image data of the target tissue corresponding to each different ultrasonic beam forming method can be obtained, and an initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method can be generated according to the image data. Different ultrasound beam forming methods can highlight the information of the target tissue from different angles, so that the obtained initial ultrasound image will have various advantages.
Take the case where the DAS beam forming method, the MV beam forming method and the CF beam forming method are used to process the same channel echo data of the target tissue as an example: the DAS beam forming method is used to process the channel echo data to obtain first image data of the target tissue, the MV beam forming method is used to process the channel echo data to obtain second image data of the target tissue, and the CF beam forming method is used to process the channel echo data to obtain third image data of the target tissue. The resulting image data may be processed by the image processing module in the processor of the ultrasound imaging device to generate the corresponding initial ultrasound images. For example, a first ultrasound image of the target tissue can be generated from the first image data; this first ultrasound image, obtained by the DAS beam forming method, can display the overall structure of the target tissue well. A second ultrasound image of the target tissue can be generated from the second image data; this second ultrasound image, obtained by the MV beam forming method, has higher spatial resolution and can highlight small lesions in the target tissue. A third ultrasound image of the target tissue can be generated from the third image data; this third ultrasound image, obtained by the CF beam forming method, can display the strongly reflecting parts of the target tissue more clearly.
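A minimal sketch of step S101 is shown below, assuming hypothetical per-method beamforming callables (for example, frame-level wrappers around DAS/MV/CF routines such as the sketches above): the same channel echo data array is handed to every method, and each beamformed result is envelope-detected and log-compressed into an initial image. The envelope detection and log compression shown here are generic post-processing assumptions, not steps prescribed by this disclosure.

```python
import numpy as np

def make_initial_images(channel_data, beamformers):
    """Apply several beam forming methods to the *same* channel echo data.

    channel_data : the raw channel echo data of the target tissue
    beamformers  : dict mapping a method name to a callable returning beamformed
                   image data for the whole frame (placeholders in this sketch)
    """
    images = {}
    for name, beamform in beamformers.items():
        rf_image = beamform(channel_data)                 # beam forming
        envelope = np.abs(rf_image)                       # envelope detection (sketch)
        images[name] = 20.0 * np.log10(envelope + 1e-12)  # log compression to a B-mode-like image
    return images

# Hypothetical usage, with das_image / mv_image / cf_image as frame-level wrappers:
# initial = make_initial_images(channel_data, {"DAS": das_image, "MV": mv_image, "CF": cf_image})
```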
S102, for each pixel point in each initial ultrasonic image, determining the fusion coefficient of the pixel point according to the contrast and the pixel quality of the pixel point, so as to obtain a fusion coefficient map corresponding to each initial ultrasonic image. Here, the contrast may be positively correlated with the amount of change in the pixel value of the pixel point relative to the adjacent pixel point, the pixel quality may be negatively correlated with the difference between the pixel value of the pixel point and the pixel median, and the fusion coefficients may be positively correlated with the contrast and the pixel quality, respectively.
It should be noted that the term "fusion coefficient map" as used herein refers to a set of fusion coefficients corresponding to respective pixels in an initial ultrasound image, and does not mean that the fusion coefficient map is necessarily an "image" formed by the fusion coefficients.
N (N is more than or equal to 2) different ultrasonic beam forming methods are adopted to process the same channel echo data acquired from the target tissue, and N initial ultrasonic images with the same size can be obtained. The initial ultrasound image may be either a gray scale image or a color image; either a two-dimensional image or a three-dimensional image.
In this embodiment, the fusion coefficient of each pixel point in each initial ultrasound image is determined, pixel by pixel, according to the contrast and the pixel quality of that pixel point. The contrast may be determined according to the pixel value variation between the pixel point and its adjacent pixel points; for example, in a two-dimensional image the contrast may be determined according to the pixel differences between the pixel point and its 4 adjacent pixel points above, below, to the left and to the right, while in a three-dimensional image it may be determined according to the pixel differences between the pixel point and its 6 adjacent pixel points. The adjacent pixel points may also be the other pixel points in a 3x3 neighborhood centered on the pixel point. The number of adjacent pixel points used in determining the contrast may be selected as needed. The contrast is positively correlated with the amount of change in pixel value, that is, the greater the pixel difference between a pixel point and its adjacent pixel points, the greater the contrast; and the fusion coefficient is positively correlated with the contrast, that is, the greater the contrast, the greater the fusion coefficient. Taking the DAS beam forming method and the MV beam forming method as an example, for pixel points at a tissue boundary, the contrast in the initial ultrasound image obtained by the MV beam forming method will be greater than that in the initial ultrasound image obtained by the DAS beam forming method, so by setting a fusion coefficient positively correlated with the contrast, the detail information of the target tissue is better presented.
Taking a gray-scale image as an example, the human eye is most comfortable with intermediate gray values, and an image that is too bright or too dark affects the human eye's ability to resolve the tissue. The closer a pixel is to the intermediate gray value, the higher its pixel quality; the farther from the intermediate gray value, the lower the pixel quality. That is, the pixel quality is negatively correlated with the difference between the pixel value of the pixel point and the pixel median. The pixel median may be preset; for example, for a gray-scale image the pixel median may be set to the intermediate gray value 128, and for a normalized gray-scale image it may be set to 0.5. Alternatively, the median of all pixel values in the initial ultrasound image may be calculated from the initial ultrasound image and used as the pixel median. Still taking the DAS beam forming method and the MV beam forming method as an example, for pixel points in the image background, the pixel quality in the initial ultrasound image obtained by the DAS beam forming method is higher than that in the initial ultrasound image obtained by the MV beam forming method, so by setting a fusion coefficient positively correlated with the pixel quality, the overall structure information of the target tissue can be better displayed.
The fusion coefficients are in one-to-one correspondence with the pixel points. After the fusion coefficient of each pixel point is determined, the fusion coefficients are arranged according to the positional relationship of the pixel points to obtain N fusion coefficient maps that are the same size as, and in one-to-one correspondence with, the N initial ultrasound images. If the initial ultrasound images are three-dimensional images, the resulting fusion coefficient maps are also three-dimensional.
S103, fusing all the obtained initial ultrasonic images of the target tissue according to the obtained fusion coefficient graphs corresponding to the initial ultrasonic images to obtain a fused ultrasonic image of the target tissue.
Each pixel point in the fused ultrasonic image is the result of the combined action of different ultrasonic beam forming methods, the fusion coefficient positively correlated with the contrast and the pixel quality can give consideration to the detail information and the whole structure information, and the fused ultrasonic image obtained by fusion can integrate the advantages of the beam forming methods.
In an alternative embodiment, the initial ultrasound image may be weighted and fused using the fusion coefficients in the fusion coefficient map as weights. The pixel values at the same position in the N initial ultrasound images may be weighted and summed according to the corresponding fusion coefficient, that is, the pixel value of each pixel point in the fused ultrasound image is the sum of the products of the pixel value at the position of the pixel point in all the initial ultrasound images and the corresponding fusion coefficient.
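A short sketch of this weighted-sum fusion, assuming the N initial images have already been brought to the same size and normalized gray-value range and that the fusion coefficient maps have been computed:

```python
import numpy as np

def fuse_weighted_sum(initial_images, fusion_maps):
    """Pixel-wise fusion: each output pixel is the sum over the N initial images of
    the pixel value at that position times its fusion coefficient."""
    stack_g = np.stack(initial_images, axis=0)   # (N, H, W) normalized gray values
    stack_w = np.stack(fusion_maps, axis=0)      # (N, H, W) fusion coefficients
    return np.sum(stack_w * stack_g, axis=0)     # (H, W) fused image
```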
According to the ultrasonic imaging method provided by the embodiment, after the same channel echo data of the target tissue are processed by adopting at least two different ultrasonic beam forming methods, the obtained initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method is fused by taking the pixel as a unit and adopting the fusion coefficient positively related to the contrast and the pixel quality, so that the detail information and the integral structure information of the target tissue are considered, the fused ultrasonic image can integrate the advantages of each beam forming method, the integral structure information and the detail information of the target tissue are better displayed, and the quality of the ultrasonic image is improved.
The following further describes how the initial ultrasound images are fused to obtain the fused ultrasound image of the target tissue, taking as an example the case where the generated initial ultrasound images are N (N >= 2) two-dimensional gray-scale images.
Let the normalized gray value of pixel point (x, y) in the i-th initial ultrasound image be g_i(x, y). The contrast c_i(x, y) of pixel point (x, y) can be determined according to the following formula:

c_i(x, y) = |g_i(x, y-1) + g_i(x-1, y) + g_i(x+1, y) + g_i(x, y+1) - 4g_i(x, y)| / 4    (1)

That is, the contrast c_i(x, y) of pixel point (x, y) is the difference between the normalized gray value of the pixel and the average of its 4 adjacent pixels above, below, to the left and to the right. When the normalized gray value of the pixel point changes greatly, for example when the pixel point is located at the boundary of the target tissue, the value of c_i(x, y) is large.
The pixel quality q_i(x, y) of pixel point (x, y) may be determined according to the following equation:

q_i(x, y) = exp( -(g_i(x, y) - μ)^2 / (2σ^2) )    (2)

where μ denotes the pixel median, with a value range of 0 to 1. The closer the normalized gray value of the pixel is to μ, the higher the pixel quality q_i(x, y) and the closer it is to 1; the farther the normalized gray value of the pixel is from μ, the lower the pixel quality q_i(x, y) and the closer it is to 0. Since the human eye is most comfortable with pixels of intermediate gray value, and an image that is too bright or too dark affects the human eye's ability to resolve the tissue, μ can be set to 0.5 for normalized gray values. σ denotes the variance, with a value range of 0 to 1. The smaller σ is, the faster the pixel quality q_i(x, y) drops as the normalized gray value g_i(x, y) moves away from μ; the larger σ is, the slower the drop. Preferably, σ can take a value between 0.1 and 0.2.
Alternatively, to increase processing speed, pixel quality may be determined by way of a look-up table. Specifically, a mapping table between the normalized gray-scale value and the pixel quality may be established in advance.
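A sketch of the pixel-quality computation and of its look-up-table variant is given below. The Gaussian form mirrors equation (2) above, σ = 0.15 is one value from the suggested 0.1 to 0.2 range, and the 256-entry quantization of the table is an assumption for illustration.

```python
import numpy as np

def pixel_quality(g, mu=0.5, sigma=0.15):
    """Pixel quality of normalized gray values g, per the Gaussian form of Eq. (2)."""
    return np.exp(-((g - mu) ** 2) / (2.0 * sigma ** 2))

# Look-up-table variant: precompute the quality of 256 quantized gray levels once,
# then index into the table instead of evaluating the exponential for every pixel.
QUALITY_LUT = pixel_quality(np.linspace(0.0, 1.0, 256))

def pixel_quality_lut(g):
    idx = np.clip(np.round(np.asarray(g) * 255).astype(int), 0, 255)
    return QUALITY_LUT[idx]
```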
After the contrast c_i(x, y) and the pixel quality q_i(x, y) of pixel point (x, y) have been determined, the quality coefficient cq_i(x, y) of pixel point (x, y) can be determined by combining c_i(x, y) and q_i(x, y) with respective weights; the quality coefficient cq_i(x, y) is used to reflect the quality of pixel point (x, y). Here w_c denotes the weight of the contrast, with a value range of 0 to 1, and w_q denotes the weight of the pixel quality, with a value range of 0 to 1. The values of w_c and w_q can be determined according to the fusion requirements. For example, when more of the detail information of the target tissue is of interest, w_c can be reduced and w_q increased; when more of the overall structure information of the target tissue is of interest, w_c can be increased and w_q reduced.
Then, the fusion coefficient w_i(x, y) is calculated from the quality coefficients of the initial ultrasound images according to the following expression:

w_i(x, y) = cq_i(x, y) / ( cq_1(x, y) + cq_2(x, y) + ... + cq_N(x, y) )    (4)

When the quality coefficients cq_i(x, y) (i = 1, 2, ..., N) of pixel point (x, y) are all poor, for example all smaller than a threshold ε, the fusion coefficients calculated according to equation (4) are no longer meaningful. Average fusion can be performed in this case, i.e., w_i(x, y) = 1/N, which on the one hand saves the computation required for calculating the fusion coefficients and on the other hand improves the fusion quality.
Finally, the pixel points in the N initial ultrasound images can be fused according to the fusion coefficients and the following expression, to obtain the normalized gray value g(x, y) of pixel point (x, y) in the fused ultrasound image of the target tissue:

g(x, y) = w_1(x, y)·g_1(x, y) + w_2(x, y)·g_2(x, y) + ... + w_N(x, y)·g_N(x, y)    (5)

The above process of determining the fusion coefficients based on the contrast and pixel quality of the pixel points and fusing the initial ultrasound images is illustrated in fig. 3. For each pixel point in each initial ultrasound image, the contrast and the pixel quality are first calculated; the quality coefficient is then calculated from the contrast and the pixel quality to obtain a quality coefficient map; the fusion coefficients are then calculated from the quality coefficient maps to obtain the fusion coefficient maps; finally, the initial ultrasound images are fused according to the fusion coefficient maps to obtain the fused ultrasound image of the target tissue.
To eliminate seams that may occur during the fusion of the initial ultrasound images on a pixel-by-pixel basis based on the contrast and pixel quality of the pixel points, the fusion may be performed in a multi-scale fusion manner. Referring to fig. 4, in the ultrasound imaging method provided in this embodiment, after obtaining a fusion coefficient map corresponding to each initial ultrasound image, all the obtained initial ultrasound images of the target tissue are fused according to the obtained fusion coefficient map corresponding to each initial ultrasound image, which specifically may include:
S1031, carrying out multi-scale decomposition on each initial ultrasonic image and the corresponding fusion coefficient graph thereof to obtain ultrasonic sub-images and corresponding fusion coefficient sub-images under a plurality of scales.
In this embodiment, each initial ultrasound image and its corresponding fusion coefficient map are decomposed at the same scales, that is, the number of ultrasound sub-images is equal to the number of fusion coefficient subgraphs, their sizes are the same, and they are in one-to-one correspondence. For example, the i-th (i = 1, 2, ..., N) initial ultrasound image g_i can be decomposed into L scales, with the ultrasound sub-image at the l-th scale denoted g_i^l; the fusion coefficient map w_i corresponding to the i-th initial ultrasound image g_i is then also decomposed into L scales, with the fusion coefficient subgraph at the l-th scale denoted w_i^l. At each scale there are thus N ultrasound sub-images and the corresponding N fusion coefficient subgraphs. The specific method of multi-scale decomposition is not limited in this embodiment; it may be, for example, Laplacian pyramid decomposition, Gaussian pyramid decomposition, wavelet transform decomposition, empirical mode decomposition, or the like.
S1032, fusing all the ultrasonic sub-images under the same scale according to the fusion coefficient subgraphs corresponding to the ultrasonic sub-images to obtain the fused ultrasonic sub-images under the scale.
In this embodiment, the N ultrasound sub-images at the l-th scale may be fused according to their corresponding N fusion coefficient subgraphs as follows, to obtain the fused ultrasound sub-image g^l at the l-th scale:

g^l(x, y) = w_1^l(x, y)·g_1^l(x, y) + w_2^l(x, y)·g_2^l(x, y) + ... + w_N^l(x, y)·g_N^l(x, y)

Finally, L fused ultrasound sub-images g^l (l = 1, 2, ..., L) are obtained.
S1033, reconstructing the fused ultrasonic sub-images under a plurality of scales to obtain the fused ultrasonic image.
After the fused ultrasound sub-images g^l at the L scales are obtained, reconstruction is performed by progressively upsampling from the lower-scale levels of the image pyramid and summing with the higher-scale levels, and the fused ultrasound image of the target tissue is finally obtained.
According to the ultrasonic imaging method provided by the embodiment, the initial ultrasonic image and the corresponding fusion coefficient image are further subjected to multi-scale decomposition, the ultrasonic sub-images and the corresponding fusion coefficient sub-images are fused under the same scale, the fusion ultrasonic sub-images under each scale are finally reconstructed, and finally the fusion ultrasonic image of the target tissue is determined, so that seams possibly generated in the process of determining the fusion coefficient based on the contrast of pixel points and the pixel quality in pixel units can be effectively eliminated.
In order to preserve the detail information of the image as much as possible in the process of eliminating the seam, in an alternative embodiment, the Laplacian pyramid decomposition is performed on each initial ultrasonic image, and the detail of the image at different scales is represented by the ultrasonic sub-images at different scales so as to preserve the detail information of the image as much as possible; and carrying out Gaussian pyramid decomposition on the fusion coefficient graph corresponding to each initial ultrasonic image, and representing low-pass images under different scales by fusion coefficient subgraphs under different scales so as to smooth the images and eliminate joints. The specific implementation process is as shown in fig. 5, and the initial ultrasonic image is subjected to Laplacian pyramid decomposition to obtain a Laplacian ultrasonic sub-image pyramid; obtaining a fusion coefficient diagram according to the initial ultrasonic image, and carrying out Gaussian pyramid decomposition on the fusion coefficient diagram to obtain a Gaussian fusion coefficient subgraph pyramid; fusing the Laplace ultrasonic sub-image pyramid and the Gaussian fusion coefficient sub-image pyramid to obtain a fused ultrasonic sub-image pyramid; and reconstructing according to the fusion ultrasonic sub-image pyramid to obtain a fusion ultrasonic image.
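A sketch of this multi-scale variant using OpenCV pyramid operations is given below: Laplacian pyramids for the initial ultrasound images, Gaussian pyramids for the fusion coefficient maps, per-scale weighted fusion, and reconstruction by upsampling and summation. The number of levels and the use of cv2.pyrDown / cv2.pyrUp are implementation assumptions.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    """Laplacian pyramid of an initial ultrasound image (image detail at each scale)."""
    pyr, cur = [], img.astype(np.float32)
    for _ in range(levels - 1):
        down = cv2.pyrDown(cur)
        up = cv2.pyrUp(down, dstsize=(cur.shape[1], cur.shape[0]))
        pyr.append(cur - up)          # detail lost by the down/up round trip
        cur = down
    pyr.append(cur)                   # coarsest (low-pass) level
    return pyr

def gaussian_pyramid(img, levels):
    """Gaussian pyramid of a fusion coefficient map (low-pass version at each scale)."""
    pyr, cur = [img.astype(np.float32)], img.astype(np.float32)
    for _ in range(levels - 1):
        cur = cv2.pyrDown(cur)
        pyr.append(cur)
    return pyr

def multiscale_fuse(initial_images, fusion_maps, levels=4):
    lap = [laplacian_pyramid(g, levels) for g in initial_images]
    gau = [gaussian_pyramid(w, levels) for w in fusion_maps]
    # Fuse the N sub-images at each scale with their fusion coefficient subgraphs.
    fused_pyr = [sum(w[l] * g[l] for g, w in zip(lap, gau)) for l in range(levels)]
    # Reconstruct: upsample from the coarsest level and add the detail of each finer level.
    out = fused_pyr[-1]
    for l in range(levels - 2, -1, -1):
        out = cv2.pyrUp(out, dstsize=(fused_pyr[l].shape[1], fused_pyr[l].shape[0])) + fused_pyr[l]
    return out
```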
An ultrasound imaging device may support many ultrasound beam forming methods, and using all of them to process the channel echo data would clearly place a great demand on the device's processing capability. How, then, should the ultrasound beam forming methods used to process the same channel echo data be determined? In an alternative embodiment, before the channel echo data is processed with at least two different ultrasound beam forming methods, a plurality of ultrasound beam forming methods are displayed on the display interface as text or icons for the user to select, and the at least two different ultrasound beam forming methods are determined according to the user's selection operation. Referring to fig. 6, all ultrasound beam forming methods supported by the ultrasound imaging device may be displayed for the user to select from, so that the user can choose beam forming methods autonomously according to clinical experience.
Different tissues often have different imaging characteristics: for example, some tissues produce strongly reflected ultrasound signals while others produce weakly reflected ones. The requirements of ultrasound examination also differ between tissues: for some tissues only the overall structure needs to be examined, while for others local details must be examined. Determining the ultrasound beam forming methods by combining the imaging characteristics of the tissue with the examination requirements makes the chosen methods better matched to the tissue and helps further improve the quality of ultrasound imaging. For example, the CF beam forming method and the MV beam forming method may be used for tissue that produces strongly reflected ultrasound signals and whose local details need to be viewed.
A mapping relationship between tissue types and ultrasound beam forming methods may be established in advance according to the imaging characteristics of the tissues and the examination requirements, associating each tissue type with the beam forming methods to be used. The tissue type may be represented by the examination site, such as the heart, liver or uterus, or in another manner, such as by the ultrasound examination item. Table 1 is a schematic example of such a mapping: as shown in Table 1, when the type of the target tissue is the heart, three ultrasound beam forming methods A, B and C are used to process the channel echo data of the target tissue respectively; when the type of the target tissue is the liver, four ultrasound beam forming methods A, C, D and F are used to process the channel echo data of the target tissue respectively.
TABLE 1

Tissue type    Ultrasound beam forming methods
Heart          A, B, C
Liver          A, C, D, F
Uterus         E, F
In an alternative embodiment, in order to improve the quality of ultrasound imaging, before the channel echo data is processed with at least two different ultrasound beam forming methods, the at least two different ultrasound beam forming methods may be determined according to the type of the target tissue and the pre-established mapping relationship between tissue types and ultrasound beam forming methods. The type of the target tissue may be determined from the acquired channel echo data of the target tissue, or from the user's input during the ultrasound examination; for example, if the user inputs an instruction to perform a breast ultrasound examination, the type of the target tissue is the breast. Optionally, after the at least two different ultrasound beam forming methods are determined from the mapping relationship, the determined methods may be displayed on the display interface as text or icons for the user to view.
In another alternative embodiment, before the channel echo data is processed with at least two different ultrasound beam forming methods, at least two different ultrasound beam forming methods may first be determined according to the type of the target tissue and the pre-established mapping relationship between tissue types and ultrasound beam forming methods; the determined methods are then displayed on the display interface as text or icons for the user to select, and the ultrasound beam forming methods finally used to process the channel echo data are determined according to the user's selection operation. Taking the mapping relationship shown in Table 1 as an example, when the type of the target tissue is the liver, the four ultrasound beam forming methods A, C, D and F are displayed for the user to select; if the user selects A and F, the two methods A and F are finally used to process the channel echo data of the target tissue respectively. Alternatively, as shown in fig. 6, when the type of the target tissue is the liver, all ultrasound beam forming methods supported by the ultrasound imaging device may be displayed on the display interface, with the four methods A, C, D and F shown as default options.
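A minimal sketch of this selection logic, using the placeholder method labels A–F from Table 1 and invented names (not an actual device API):

```python
# Pre-established mapping from tissue type to default beam forming methods (mirrors Table 1).
TISSUE_TO_METHODS = {
    "heart":  ["A", "B", "C"],
    "liver":  ["A", "C", "D", "F"],
    "uterus": ["E", "F"],
}

SUPPORTED_METHODS = ["A", "B", "C", "D", "E", "F"]

def resolve_methods(tissue_type, user_selection=None):
    """Return the beam forming methods to run: defaults looked up from the mapping,
    optionally narrowed by the user's selection on the display interface."""
    defaults = TISSUE_TO_METHODS.get(tissue_type.lower(), SUPPORTED_METHODS)
    if user_selection:
        chosen = [m for m in user_selection if m in SUPPORTED_METHODS]
        if len(chosen) >= 2:   # the method needs at least two different beam forming methods
            return chosen
    return defaults

print(resolve_methods("liver"))                             # ['A', 'C', 'D', 'F']
print(resolve_methods("liver", user_selection=["A", "F"]))  # ['A', 'F']
```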
On the basis of any one of the above embodiments, in order to help the user grasp the condition of the target tissue more comprehensively and accurately, the ultrasound imaging method provided in this embodiment may further include: displaying the initial ultrasound image of the target tissue corresponding to each ultrasound beam forming method on the display interface. Alternatively, the fused ultrasound image of the target tissue and the initial ultrasound images corresponding to the individual beam forming methods may be displayed on the display interface simultaneously or in sequence, so that the user can view or compare the fused ultrasound image and the initial ultrasound images as needed.
The initial ultrasound images of the target tissue corresponding to the different ultrasound beam forming methods are generated from the same channel echo data, so they can be displayed for comparison without image registration. The ultrasound images may be tiled, scrolled, or overlaid. This embodiment does not limit the type and number of beam forming methods used, or the number and display manner of the generated ultrasound images. For example, a first ultrasound image obtained with the DAS beam forming method, a second ultrasound image obtained with the MV beam forming method and a third ultrasound image obtained with the CF beam forming method may be tiled on the display interface; the user can then grasp the overall structure of the target tissue from the first ultrasound image, view tiny lesions in the target tissue in the second ultrasound image, and observe strongly reflecting parts of the target tissue in the third ultrasound image. Displaying the initial ultrasound images obtained with different beam forming methods on the display interface allows the user to observe the target tissue from different angles and obtain more information about it, which helps the user grasp the condition of the target tissue more comprehensively and accurately.
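For illustration, a tiled comparison display could be sketched with matplotlib as below; the image variables and titles are placeholders, and an actual ultrasound device would render this on its own display interface rather than with matplotlib:

```python
import matplotlib.pyplot as plt

def show_tiled(images, titles):
    """Tile the initial ultrasound images side by side; no registration is needed
    because they are all generated from the same channel echo data."""
    fig, axes = plt.subplots(1, len(images), figsize=(4 * len(images), 4))
    for ax, img, title in zip(axes, images, titles):
        ax.imshow(img, cmap="gray")
        ax.set_title(title)
        ax.axis("off")
    fig.tight_layout()
    plt.show()

# e.g. show_tiled([img_das, img_mv, img_cf], ["DAS", "MV", "CF"])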
Reference is made to various exemplary embodiments herein. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope herein. For example, the various operational steps and components used to perform the operational steps may be implemented in different ways (e.g., one or more steps may be deleted, modified, or combined into other steps) depending on the particular application or taking into account any number of cost functions associated with the operation of the system.
Additionally, as will be appreciated by one of skill in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium preloaded with computer readable program code. Any tangible, non-transitory computer readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, blu-Ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means which implement the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been shown in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components, which are particularly adapted to specific environments and operative requirements, may be used without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, those skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the present disclosure is to be considered as illustrative and not restrictive in character, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "couple" and any other variants thereof are used herein to refer to physical connections, electrical connections, magnetic connections, optical connections, communication connections, functional connections, and/or any other connection.
The foregoing description of the invention has been presented for purposes of illustration and description, and is not intended to be limiting. Several simple deductions, modifications or substitutions may also be made by a person skilled in the art to which the invention pertains, based on the idea of the invention.

Claims (10)

1. An ultrasound imaging method, comprising:
processing the same channel echo data of the target tissue by adopting at least two different ultrasonic beam forming methods, and acquiring an initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method;
for each pixel point in each initial ultrasonic image, determining a fusion coefficient of the pixel point according to the contrast and the pixel quality of the pixel point, so as to obtain a fusion coefficient diagram corresponding to each initial ultrasonic image, wherein the contrast is positively correlated with the pixel value variation of the pixel point relative to the adjacent pixel point, the pixel quality is negatively correlated with the difference value between the pixel value and the pixel median value of the pixel point, and the fusion coefficient is positively correlated with the contrast and the pixel quality respectively;
and fusing all the obtained initial ultrasonic images of the target tissue according to the obtained fusion coefficient map corresponding to each initial ultrasonic image to obtain a fused ultrasonic image of the target tissue.
2. The method of claim 1, wherein the fusing all of the obtained initial ultrasound images of the target tissue according to the obtained fusion coefficient map corresponding to each of the initial ultrasound images comprises:
and for each pixel point position, calculating the sum of the products of the pixel value at that position in each initial ultrasonic image and the corresponding fusion coefficient, to obtain the pixel value at that position in the fused ultrasonic image.
3. The method of claim 1, wherein the fusing all of the obtained initial ultrasound images of the target tissue according to the obtained fusion coefficient map corresponding to each of the initial ultrasound images comprises:
carrying out multi-scale decomposition on each initial ultrasonic image and the corresponding fusion coefficient graph thereof to obtain ultrasonic sub-images and corresponding fusion coefficient sub-images under a plurality of scales;
fusing all the ultrasonic sub-images under the same scale according to the fusion coefficient sub-images corresponding to the ultrasonic sub-images to obtain fused ultrasonic sub-images under the scale;
reconstructing the fused ultrasonic sub-images under a plurality of scales to obtain the fused ultrasonic image.
4. A method according to claim 3, wherein said multi-scale decomposing each initial ultrasound image and its corresponding fusion coefficient map comprises:
Carrying out Laplacian pyramid decomposition on each initial ultrasonic image;
and carrying out Gaussian pyramid decomposition on the fusion coefficient graph corresponding to each initial ultrasonic image.
5. The method of claim 1, wherein the pixel quality of the pixel point is determined according to the following expression:
q(x, y) = exp(-(g(x, y) - μ)^2 / (2σ^2))
where (x, y) denotes the coordinates of the pixel point, q(x, y) denotes the pixel quality of the pixel point, g(x, y) denotes the normalized pixel value of the pixel point, μ denotes the normalized pixel median value, and σ is a constant between 0 and 1.
6. The method of claim 5, wherein the pixel quality is determined by means of a look-up table.
7. The method of any of claims 1-6, wherein prior to processing the same channel echo data of the target tissue using at least two different ultrasound beam forming methods, the method further comprises:
displaying a plurality of ultrasonic beam forming methods on a display interface in a text or icon mode for a user to select, and determining at least two different ultrasonic beam forming methods according to the selection operation of the user;
or,
determining the at least two different ultrasonic beam forming methods according to the type of the target tissue and a pre-established mapping relationship between tissue types and ultrasonic beam forming methods.
8. The method of any one of claims 1-6, wherein the method further comprises:
and displaying the initial ultrasonic image of the target tissue corresponding to each ultrasonic beam forming method on a display interface.
9. An ultrasonic imaging apparatus, comprising:
an ultrasonic probe;
the transmitting circuit is used for outputting a corresponding transmitting sequence to the ultrasonic probe according to a set mode so as to control the ultrasonic probe to transmit corresponding ultrasonic waves;
the receiving circuit is used for receiving the ultrasonic echo signals output by the ultrasonic probe and outputting ultrasonic echo data;
the display is used for outputting visual information;
a processor for performing the ultrasound imaging method as claimed in any one of claims 1 to 8.
10. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor are for implementing the ultrasound imaging method of any of claims 1-8.