CN112401934A - Ultrasound imaging method and system - Google Patents


Info

Publication number
CN112401934A
Authority
CN
China
Prior art keywords
ultrasonic
image
target
pixel
ultrasound
Prior art date
Legal status
Pending
Application number
CN201910785078.1A
Other languages
Chinese (zh)
Inventor
董腾驹
徐志安
李雷
刘杰
袁海锋
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority application: CN201910785078.1A
Publication: CN112401934A


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48: Diagnostic techniques
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5238: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5246: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
    • A61B 8/5253: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining overlapping images, e.g. spatial compounding

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The application provides an ultrasound imaging method and system. The ultrasound imaging method comprises the following steps: transmitting a first ultrasonic wave of a first frequency to a scanning target; obtaining a first ultrasound image based on an ultrasonic echo of the first ultrasonic wave; transmitting a second ultrasonic wave of a second frequency to the scanning target at least once; obtaining a second ultrasound image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasound image at least comprises compensation information for compensating the information missing from the sound shadow region in the first ultrasound image, the compensation information comprising pixel value information of the pixel points of the sound shadow region; and compounding the first and second ultrasound images to obtain a target ultrasound image. With this scheme, the sound shadow in the resulting target ultrasound image can be removed or weakened without affecting the image quality of the non-shadow portions.

Description

Ultrasound imaging method and system
Technical Field
The present application relates to the field of ultrasound imaging technology, and more particularly, to an ultrasound imaging method and system.
Background
In ultrasonic scanning, a dark area often appears behind strongly echogenic tissue. For example, in an obstetric examination, the strong reflection of the fetal skull in the cranial section or of the vertebral column in the fetal spine section causes the ultrasound image information behind that tissue to be lost, so a dark area appears. This phenomenon is known in engineering as a sound shadow. The sound shadow causes diagnostic information in the ultrasound image to be lost, which greatly affects the diagnostic confidence and efficiency of the physician and causes considerable inconvenience to the clinician.
Therefore, in view of the above problems, the present application provides a new ultrasound imaging method and system.
Disclosure of Invention
One aspect of the present application provides an ultrasound imaging method, including:
transmitting a first ultrasonic wave of a first frequency to a scanning target;
obtaining a first ultrasonic image based on an ultrasonic echo of the first ultrasonic wave;
transmitting a second ultrasonic wave of a second frequency at least once to the scanning target;
obtaining a second ultrasonic image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasonic image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, and the compensation information comprises pixel value information of pixel points of the sound shadow region;
the first and second ultrasound images are compounded to obtain a target ultrasound image.
Illustratively, the transmitting a second ultrasonic wave of a second frequency at least once to the scan target further includes:
identifying sound shadow areas in the scan target;
and emitting second ultrasonic waves of a second frequency to the sound shadow area at least once.
Illustratively, the identifying the sound shadow region in the scan target includes:
identifying a sound shadow region in the scan target based on intensity information of echo signals of a scan line of the first ultrasonic wave.
Illustratively, identifying a sound shadow region in the scan target based on the intensity information of the echo signal of each scan line of the first ultrasonic wave comprises:
detecting intensity information of the echo signal of each scan line of the first ultrasonic wave; when the signal intensity at a critical point in the echo signal is detected to be saturated, and the signal intensities of the echo signal beyond the critical point along the scan-line direction are all smaller than a threshold signal intensity, the region beyond the critical point along the scan-line direction is determined to be the sound shadow region.
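The per-scan-line test described above can be sketched as follows. This is an illustrative rendering only: the saturation level, shadow threshold, and function name are assumptions, since the patent does not give concrete values.

```python
import numpy as np

def find_shadow_start(line_intensity, saturation_level=0.95, shadow_threshold=0.1):
    """Return the sample index where a sound shadow begins along one scan line,
    or None if no shadow is detected.

    A shadow is flagged when a sample reaches the saturation level (a strong
    reflector) and every sample after it stays below the shadow threshold.
    The two threshold values are illustrative assumptions.
    """
    line = np.asarray(line_intensity, dtype=float)
    # candidate critical points: samples at or above the saturation level
    saturated = np.flatnonzero(line >= saturation_level)
    for idx in saturated:
        tail = line[idx + 1:]
        if tail.size and np.all(tail < shadow_threshold):
            return idx + 1  # shadow region starts just past the critical point
    return None
```

For instance, a line whose intensities drop to near zero after a saturated sample would report the shadow as starting immediately after that sample, while a line that recovers normal echo levels reports no shadow.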
Illustratively, the first frequency is greater than the second frequency.
Illustratively, the emission energy of the second ultrasonic wave is higher than the emission energy of the first ultrasonic wave.
Illustratively, the transmission waveform length of the second ultrasonic wave is longer than the transmission waveform length of the first ultrasonic wave.
Illustratively, the compounding the first and second ultrasound images to obtain a target ultrasound image comprises:
identifying pixel points in the first ultrasonic image and the second ultrasonic image, which are located in the sound shadow area;
compounding the information of the pixel points of at least the sound shadow region in the second ultrasound image with the first ultrasound image to obtain the target ultrasound image.
Illustratively, identifying pixel points in the first and second ultrasound images that are located in the sound shadow region includes:
identifying the pixel points located in the sound shadow region based on the difference between the pixel values of corresponding pixel points in the second ultrasound image and the first ultrasound image, and determining the pixel points whose pixel-value difference is within a preset threshold as the pixel points corresponding to the sound shadow region.
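A minimal sketch of this per-pixel comparison is given below. The direction of the comparison is an assumption on our part: in a shadow, the first (high-frequency) image loses signal while the second (low-frequency) image retains it, so the difference is large there; the patent itself only states that the difference is tested against a preset threshold. The threshold value is likewise illustrative.

```python
import numpy as np

def shadow_pixel_mask(img1, img2, diff_threshold=30):
    """Return a boolean mask of pixels presumed to lie in the sound shadow.

    img1: first ultrasound image (high frequency), img2: second ultrasound
    image (low frequency), both as integer arrays on the same grid. A pixel
    is flagged when the second image is brighter than the first by more than
    diff_threshold, i.e. where the first image lost information.
    """
    diff = img2.astype(np.int32) - img1.astype(np.int32)
    return diff > diff_threshold
```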
Illustratively, the compounding the first and second ultrasound images to obtain a target ultrasound image comprises:
determining a first target weight value of each pixel point in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
compounding the updated second ultrasound image and the first ultrasound image to obtain a target ultrasound image.
Illustratively, the compounding the first and second ultrasound images to obtain a target ultrasound image comprises:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
determining a second target weight value of each pixel point of the first ultrasonic image to generate a second target weight map;
updating the pixel value of each pixel point of the first ultrasonic image based on the second target weight map;
compounding the updated first ultrasound image and the updated second ultrasound image to obtain a target ultrasound image.
Illustratively, determining a first target weight value of each pixel point in the second ultrasound image to generate a first target weight map includes:
determining a first weight value of each pixel point based on a difference value of pixel values of corresponding pixel points in the first ultrasonic image and the second ultrasonic image to form a first weight map, wherein the first weight value increases with the increase of the difference value when the difference value is smaller than a first threshold value, and the first weight value decreases with the increase of the difference value when the difference value is greater than or equal to the first threshold value;
determining a second weight value of each pixel point according to the pixel value of each pixel point in the first ultrasonic image to form a second weight map, wherein when the pixel value of the pixel point is greater than a second threshold value, the second weight value is reduced along with the increase of the pixel value;
and multiplying the weight values of the corresponding pixel points of the first weight map and the second weight map to obtain the first target weight value so as to form the first target weight map.
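The two-stage weight construction above can be sketched as follows. The tent-shaped curve for the first weight, the linear decay for the second, and the thresholds t1 and t2 are all illustrative assumptions: the patent specifies only the monotonic trends around the first and second thresholds.

```python
import numpy as np

def first_target_weight_map(img1, img2, t1=64.0, t2=200.0, max_val=255.0):
    """Build the first target weight map from the two ultrasound images.

    w1 rises with the pixel-value difference below threshold t1 and falls
    above it (a tent shape); w2 equals 1 for first-image pixels at or below
    t2 and decays linearly for brighter pixels (strong reflectors). The
    product of the two is the first target weight map.
    """
    diff = np.abs(img2.astype(float) - img1.astype(float))
    # w1: increases with the difference up to t1, then decreases
    w1 = np.where(diff < t1,
                  diff / t1,
                  np.clip(1.0 - (diff - t1) / (max_val - t1), 0.0, 1.0))
    # w2: shrinks as first-image pixels brighter than t2 get brighter
    v = img1.astype(float)
    w2 = np.where(v > t2,
                  np.clip(1.0 - (v - t2) / (max_val - t2), 0.0, 1.0),
                  1.0)
    return w1 * w2
```

Note how the design serves the stated goal: the weight is largest exactly where the second image differs moderately from the first (shadow fill-in) and is suppressed both where the images agree and where the first image is already bright, so non-shadow portions are left essentially untouched.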
Illustratively, updating the pixel value of each pixel point in the second ultrasound image based on the first target weight map comprises:
and multiplying the pixel value of each pixel point of the second ultrasonic image by the corresponding first target weight value in the first target weight map so as to update the pixel value of each pixel point of the second ultrasonic image.
Illustratively, the pixel value of each pixel point of the target ultrasound image is the sum of the pixel value of the corresponding pixel point of the updated first ultrasound image and the pixel value of the corresponding pixel point of the updated second ultrasound image.
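The compounding step (scale each image by its target weight map, then sum per pixel) might be sketched as below. The final clipping to an 8-bit display range is an assumption, not something the patent states.

```python
import numpy as np

def compound(img1, img2, w1_map, w2_map):
    """Compound the two ultrasound images into the target image.

    img1 is weighted by the second target weight map (w2_map, values >= 1
    per the description), img2 by the first target weight map (w1_map), and
    the weighted images are summed per pixel. Clipping to [0, 255] for an
    8-bit display is an illustrative assumption.
    """
    out = img1.astype(float) * w2_map + img2.astype(float) * w1_map
    return np.clip(out, 0, 255).astype(np.uint8)
```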
Illustratively, each said second target weight value in the second target weight map is greater than or equal to 1.
Illustratively, the method further comprises:
performing gray mapping processing on the target ultrasound image.
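Because the second target weight values are greater than or equal to 1, the compounded pixel values can exceed the display range, which is presumably why a gray mapping step follows. A minimal sketch is a linear rescale to the display range; a real system would more likely use a nonlinear curve, so the linear form here is an illustrative assumption.

```python
import numpy as np

def gray_map(img, out_max=255):
    """Linearly rescale pixel values to [0, out_max] for display.

    This simple min-max normalization stands in for the gray mapping step;
    the actual mapping curve is not specified by the patent.
    """
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    if hi == lo:
        return np.zeros_like(img, dtype=np.uint8)
    return ((img - lo) / (hi - lo) * out_max).astype(np.uint8)
```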
Illustratively, the target ultrasound image includes a one-dimensional ultrasound image or a two-dimensional ultrasound image.
Yet another aspect of the present application provides an ultrasound imaging system comprising:
a probe;
the transmitting circuit is used for exciting the probe to transmit first ultrasonic waves with a first frequency to a scanning target and transmit second ultrasonic waves with a second frequency to the scanning target at least once;
a processor to:
obtaining a first ultrasonic image based on an ultrasonic echo of the first ultrasonic wave;
obtaining a second ultrasonic image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasonic image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, and the compensation information comprises pixel value information of pixel points of the sound shadow region;
the first and second ultrasound images are compounded to obtain a target ultrasound image.
Illustratively, the processor is further configured to identify a sound shadow region in the scan target;
the transmitting circuit is also used for exciting the probe to transmit second ultrasonic waves of a second frequency to the sound shadow area at least once.
Illustratively, the processor is specifically configured to:
identifying a sound shadow region in the scan target based on intensity information of echo signals of a scan line of the first ultrasonic wave.
Illustratively, the processor is specifically configured to:
detecting intensity information of the echo signal of each scan line of the first ultrasonic wave; when the signal intensity at a critical point in the echo signal is detected to be saturated, and the signal intensities of the echo signal beyond the critical point along the scan-line direction are all smaller than a threshold signal intensity, the region beyond the critical point along the scan-line direction is determined to be the sound shadow region.
Illustratively, the first frequency is greater than the second frequency.
Illustratively, the emission energy of the second ultrasonic wave is higher than the emission energy of the first ultrasonic wave.
Illustratively, the transmission waveform length of the second ultrasonic wave is longer than the transmission waveform length of the first ultrasonic wave.
Illustratively, the processor is specifically configured to:
identifying pixel points in the first ultrasonic image and the second ultrasonic image, which are located in the sound shadow area;
compounding the information of the pixel points of at least the sound shadow region in the second ultrasound image with the first ultrasound image to obtain the target ultrasound image.
Illustratively, the processor is further configured to:
identifying the pixel points located in the sound shadow region based on the difference between the pixel values of corresponding pixel points in the second ultrasound image and the first ultrasound image, and determining the pixel points whose pixel-value difference is within a preset threshold as the pixel points corresponding to the sound shadow region.
Illustratively, the processor is specifically configured to:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
compounding the updated second ultrasound image and the first ultrasound image to obtain a target ultrasound image.
Illustratively, the processor is specifically configured to:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
determining a second target weight value of each pixel point of the first ultrasonic image to generate a second target weight map;
updating the pixel value of each pixel point of the first ultrasonic image based on the second target weight map;
compounding the updated first ultrasound image and the updated second ultrasound image to obtain a target ultrasound image.
Illustratively, the processor is further configured to:
determining a first weight value of each pixel point based on a difference value of pixel values of corresponding pixel points in the first ultrasonic image and the second ultrasonic image to form a first weight map, wherein the first weight value increases with the increase of the difference value when the difference value is smaller than a first threshold value, and the first weight value decreases with the increase of the difference value when the difference value is greater than or equal to the first threshold value;
determining a second weight value of each pixel point according to the pixel value of each pixel point in the first ultrasonic image to form a second weight map, wherein when the pixel value of the pixel point is greater than a second threshold value, the second weight value is reduced along with the increase of the pixel value;
and multiplying the weight values of the corresponding pixel points of the first weight map and the second weight map to obtain the first target weight value so as to form the first target weight map.
Illustratively, the processor is further configured to:
and multiplying the pixel value of each pixel point of the second ultrasonic image by the corresponding first target weight value in the first target weight map so as to update the pixel value of each pixel point of the second ultrasonic image.
Illustratively, the pixel value of each pixel point of the target ultrasound image is the sum of the pixel value of the corresponding pixel point of the updated first ultrasound image and the pixel value of the corresponding pixel point of the updated second ultrasound image.
Illustratively, each said second target weight value in the second target weight map is greater than or equal to 1.
Illustratively, the processor is further configured to:
and carrying out gray mapping processing on the target ultrasonic image.
Illustratively, the target ultrasound image includes a one-dimensional ultrasound image or a two-dimensional ultrasound image.
According to the ultrasound imaging method and system of the present application, a first ultrasonic wave of a first frequency is transmitted to the scanning target; a first ultrasound image is obtained based on an ultrasonic echo of the first ultrasonic wave; a second ultrasonic wave of a second frequency is transmitted to the scanning target at least once; a second ultrasound image is obtained based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasound image at least comprises compensation information for compensating the information missing from the sound shadow region in the first ultrasound image, the compensation information comprising pixel value information of the pixel points of the sound shadow region; and the first and second ultrasound images are compounded to obtain a target ultrasound image. In this way, the sound shadow in the resulting target ultrasound image is removed or weakened without affecting the image quality of the non-shadow portions, thereby improving the diagnostic confidence and efficiency of a physician working from the target ultrasound image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
FIG. 1 is a schematic block diagram of an ultrasound imaging system of one embodiment of the present application;
FIG. 2 shows a flow chart of an ultrasound imaging method of an embodiment of the present application;
FIG. 3 illustrates a schematic view of the identification of sound shadows along a scan line according to one embodiment of the present application;
FIG. 4 shows a flow chart of an ultrasound imaging method of another embodiment of the present application;
FIG. 5 shows a schematic diagram of a first weight map of an embodiment of the present application;
FIG. 6 shows a schematic diagram of a second weight map of an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail below with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the application described in the application without inventive step, shall fall within the scope of protection of the application.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present application. It will be apparent, however, to one skilled in the art, that the present application may be practiced without one or more of these specific details. In other instances, well-known features of the art have not been described in order to avoid obscuring the present application. It is to be understood that the present application is capable of implementation in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present application, a detailed structure will be presented in the following description in order to explain the technical solutions presented in the present application. Alternative embodiments of the present application are described in detail below, however, the present application may have other implementations in addition to these detailed descriptions.
In particular, the ultrasound imaging method and system of the present application are described in detail below with reference to the accompanying drawings. The features of the following examples and embodiments may be combined with each other without conflict.
First, FIG. 1 shows a schematic block diagram of an ultrasound imaging system in one embodiment of the present application. As shown in FIG. 1, the ultrasound imaging system generally includes: a probe 1, a transmission circuit 2, a transmission/reception selection switch 3, a receiving circuit 4, a beamforming circuit 5, a processor 6, a display 9, and the like.
The processor 6 may be a central processing unit (CPU), a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the ultrasound imaging system to perform desired functions. For example, the processor 6 can include one or more embedded processors, processor cores, microprocessors, logic circuits, hardware finite state machines (FSMs), digital signal processors (DSPs), graphics processing units (GPUs), or a combination thereof.
In the ultrasound imaging process, the transmission circuit 2 sends a delay-focused transmit pulse having a certain amplitude and polarity to the probe 1 through the transmission/reception selection switch 3. The probe 1 is excited by the transmit pulse, transmits an ultrasonic wave to a scanning target (for example, an organ, tissue, or blood vessel in a human or animal body, not shown in the figure), receives, after a certain delay, an ultrasonic echo carrying information about the scanning target that is reflected and/or scattered from the target region, and converts the ultrasonic echo back into an electrical signal. The receiving circuit 4 receives the electrical signals generated by the probe 1, obtains ultrasonic echo signals, and sends them to the beamforming circuit 5. The beamforming circuit 5 performs focusing delay, weighting, channel summation, and other processing on the ultrasonic echo signals, and then sends them to the processor 6 for related signal processing; for example, the processor 6 includes a signal processing unit 7 (e.g., a digital signal processor), and the ultrasonic echo signals are sent to the signal processing unit 7 for related signal processing.
The ultrasonic echo signals processed by the signal processing unit 7 are sent to an image processing unit 8 (e.g., a graphics processing unit (GPU)). The image processing unit 8 processes the signals differently according to the imaging mode required by the user to obtain image data of different modes, and then processes the image data through logarithmic compression, dynamic range adjustment, digital scan conversion, and the like to form ultrasound images of different modes, such as a B image, a C image, or a D image.
The ultrasound image generated by the image processing unit 8 is sent to the display 9 for display.
The probe 1 typically comprises an array of a plurality of array elements. At each transmission of the ultrasound wave, all or a part of all the elements of the probe 1 participate in the transmission of the ultrasound wave. At this time, each array element or each part of array elements participating in ultrasonic wave transmission is excited by the transmission pulse and respectively transmits ultrasonic waves, the ultrasonic waves respectively transmitted by the array elements are superposed in the transmission process to form a synthesized ultrasonic wave beam transmitted to a scanning target, and the direction of the synthesized ultrasonic wave beam is the ultrasonic transmission direction mentioned in the text.
The array elements participating in ultrasonic wave transmission can be simultaneously excited by the transmission pulse; alternatively, there may be a delay between the times at which the elements participating in the ultrasound transmission are excited by the transmit pulse. The propagation direction of the above-mentioned composite ultrasound beam can be changed by controlling the time delay between the times at which the array elements participating in the transmission of the ultrasound wave are excited by the transmit pulse, as will be explained in detail below.
By controlling the time delay between the times at which the array elements participating in the transmission are excited by the transmit pulse, the ultrasonic waves transmitted by the respective elements can be superimposed at a predetermined position so that the intensity of the ultrasonic wave is maximized there; that is, the waves are "focused" at that position, which is referred to as the "focal point". The resulting ultrasound beam is a beam focused at the focal point, referred to herein as a "focused ultrasound wave". Here, the elements involved in the transmission operate with a predetermined transmit delay (i.e., there is a predetermined delay between the times at which they are excited by the transmit pulse), and the waves transmitted by the elements are focused at the focal point to form a focused ultrasound beam. When the focused ultrasound beam is used for imaging, because the beam is focused at the focal point, only one or a few scan lines can be obtained per transmission; all the scan lines in the imaging region are obtained after multiple transmissions and are combined to produce one frame of a two-dimensional ultrasound image of the imaging region. When only one scan line is used, one frame of a one-dimensional ultrasound image can be obtained from that scan line.
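The transmit-delay focusing described above reduces to simple geometry: each element is delayed by the difference between its travel time to the focal point and that of the farthest element. The sketch below assumes a linear array with (x, z) coordinates in metres and the conventional soft-tissue sound speed of 1540 m/s; it ignores lens and electronics delays.

```python
import math

def focus_delays(element_positions, focal_point, speed_of_sound=1540.0):
    """Per-element transmit delays (seconds) so that the waves from all
    elements arrive at the focal point simultaneously.

    Elements farther from the focus fire first (zero extra delay); closer
    elements wait for the difference in travel time.
    """
    fx, fz = focal_point
    distances = [math.hypot(fx - x, fz - z) for x, z in element_positions]
    longest = max(distances)
    return [(longest - d) / speed_of_sound for d in distances]
```

For a three-element aperture focused straight ahead, the two outer elements fire first and the centre element receives a small positive delay, which is exactly the curved delay profile a beamformer applies.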
Alternatively, by controlling the time delay between the times at which the array elements participating in the transmission are excited by the transmit pulse, the ultrasonic waves transmitted by the respective elements can be made neither to focus nor to fully diverge during propagation, instead forming a wavefront that is substantially planar as a whole. Such an unfocused plane wave may also be referred to as a "plane ultrasound beam".
Or, by controlling the time delay between the times at which the array elements participating in the transmission are excited by the transmit pulse, the ultrasonic waves transmitted by the respective elements diverge during propagation, forming a divergent wave as a whole. This divergent form of ultrasound may also be referred to as a "divergent ultrasound beam".
However, in ultrasound scanning, a dark area often appears behind strongly echogenic tissue. For example, in an obstetric examination, the strong reflection of the fetal skull in the cranial section or of the vertebral column in the fetal spine section causes the ultrasound image information behind that tissue to be lost, producing a dark area. This phenomenon is known in engineering as a sound shadow. The sound shadow causes diagnostic information in the ultrasound image to be lost, greatly affecting the diagnostic confidence and efficiency of the physician and causing considerable inconvenience to the clinician.
Therefore, in order to solve the problem that the existence of the sound shadow causes the information of the ultrasound image to be missing, the present document provides an ultrasound imaging method comprising: transmitting a first ultrasonic wave of a first frequency to a scanning target; obtaining a first ultrasonic image based on an ultrasonic echo of the first ultrasonic wave; transmitting a second ultrasonic wave of a second frequency at least once to the scanning target; obtaining a second ultrasonic image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasonic image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, and the compensation information comprises pixel value information of pixel points of the sound shadow region; the first and second ultrasound images are compounded to obtain a target ultrasound image.
With this ultrasound imaging method, the sound shadow in the finally obtained target ultrasound image is removed or weakened without affecting the image quality of the non-sound-shadow portion, thereby improving the diagnostic confidence and efficiency of a doctor working from the target ultrasound image.
Hereinafter, the ultrasound imaging method according to the embodiments of the present application will be explained and described in detail with continued reference to the drawings.
First, as shown in fig. 2, in step S201, a first ultrasonic wave of a first frequency is emitted to a scan target.
For example, as shown in FIG. 1, a transmit circuit excites the probe to transmit a first ultrasonic wave of a first frequency toward the scan target. Each array element in the probe 1 is provided with a corresponding delay line, and beam steering and dynamic focusing are achieved by changing the delay time of each array element in the probe 1, so as to obtain different types of synthesized ultrasonic beams or different ultrasound propagation directions. Optionally, the first ultrasonic wave comprises a focused ultrasonic wave, a non-focused ultrasonic wave, or a broad-beam ultrasonic wave, wherein the non-focused ultrasonic wave comprises at least one of a plane wave and a divergent wave. The transmission of plane waves and divergent waves has been described above and is not repeated here.
Alternatively, the first frequency may be set reasonably according to actual needs, provided that it meets the image-quality requirement of the non-sound-shadow region in the first ultrasound image obtained at the first frequency; for example, the first frequency may be a frequency higher than a set threshold frequency.
The scan target may be an organ, tissue, blood vessel, or the like in a human or animal body; for example, a fetal craniocerebral section or a fetal spine section.
Next, as shown in fig. 2, in step S202, a first ultrasound image is obtained based on the ultrasound echo of the first ultrasound wave.
Specifically, the receiving circuit receives the echoes of the first ultrasonic waves transmitted in the above step: one set of first ultrasonic echo signals is obtained for each transmission of the first ultrasonic wave, so that multiple transmissions yield multiple sets of first ultrasonic echo signals.
The method of obtaining the first ultrasound image based on the ultrasound echo of the first ultrasonic wave may be any suitable method currently used in the art or developed in the future, and is not particularly limited herein. For example, the electric signals produced by the probe 1 may be received by the receiving circuit to obtain ultrasonic echo signals, which are sent to the beam forming circuit 5. The beam forming circuit 5 performs focusing delay, weighting, channel summation, and other processing on the ultrasonic echo signals, and then sends them to the processor 6 for relevant signal processing; for example, the processor 6 includes a signal processing unit 7 (e.g., a digital signal processor) to which the echo signals are sent. The ultrasonic echo signals processed by the signal processing unit 7 are then sent to an image processing unit 8 (e.g., a graphics processing unit (GPU)). The image processing unit processes the signals differently according to the imaging modes required by the user to obtain image data of different modes, and then applies logarithmic compression, dynamic range adjustment, digital scan conversion, and the like to form ultrasound images of different modes, such as a B image, a C image, or a D image.
As noted above, a dark area often appears behind strongly echogenic tissue, for example behind the fetal skull and spine during an obstetric examination, because the ultrasound image information behind those tissues is lost due to strong reflection; this phenomenon is called a sound shadow in engineering. A large part of the cause is excessive attenuation resulting from a wavelength that is too short and a frequency that is too high, so a sound shadow exists in the first ultrasound image.
Next, with continued reference to fig. 2, in step S203, a second ultrasonic wave of a second frequency is emitted to the scan target at least once.
In one example, the transmit energy of the second ultrasonic wave may be made higher than that of the first ultrasonic wave in order to reduce or eliminate the sound shadow. For example, the first frequency may be made higher than the second frequency: the low-frequency second ultrasonic wave reduces or eliminates the sound shadow caused by excessive attenuation at short wavelengths and high frequencies, yielding tissue information of the sound shadow region for compensating the missing information in the first ultrasound image.
Alternatively, the transmit energy of the second ultrasonic wave may be made higher than that of the first ultrasonic wave by changing another parameter of the second ultrasonic wave; for example, the focal position of the second ultrasonic wave may differ from that of the first ultrasonic wave, the transmit aperture size of the second ultrasonic wave may differ from that of the first ultrasonic wave, or the transmit waveform of the second ultrasonic wave may be longer than that of the first ultrasonic wave.
In one example, transmitting a second ultrasonic wave of a second frequency to the scan target at least once further comprises: identifying a sound shadow region in the scan target; and transmitting the second ultrasonic wave of the second frequency at least once toward the sound shadow region. The sound shadow region in the scan target may be identified by any suitable method, for example based on the intensity information of the echo signals of the scan lines of the first ultrasonic wave, or based on the first ultrasound image. By identifying the sound shadow region, a second ultrasonic wave is additionally transmitted toward (but not necessarily limited to) the sound shadow region, so that the second ultrasound image obtained from it includes compensation information for compensating the missing information of the sound shadow region in the first ultrasound image, the compensation information including pixel value information of the pixel points of the sound shadow region.
The sound shadow region in the scan target may also be identified based on the first ultrasound image by any method known to those skilled in the art; for example, each column of pixel points of the first ultrasound image may be scanned and prior knowledge used to determine whether each column contains a sound shadow, thereby determining the sound shadow region in the first ultrasound image.
In one example, identifying the sound shadow region in the scan target based on the intensity information of the echo signals of the scan lines of the first ultrasonic wave comprises: detecting the intensity information of the echo signal of a scan line of the first ultrasonic wave; and, as shown in fig. 3, when the signal intensity of a critical point in the echo signal is detected to be over-saturated (over-saturated meaning that the signal intensity is greater than a threshold intensity, whose range depends on the type of ultrasound imaging system, different systems having different requirements, so the threshold intensity is not specifically limited here) and the signal intensities of the echo signals beyond the critical point along the scan line are all smaller than a threshold signal intensity (for example, all very small and close to 0), determining the region beyond the critical point along the scan line to be the sound shadow region. An over-saturated signal intensity at the critical point indicates a strong reflection there, and echo signal intensities below the threshold signal intensity beyond the critical point indicate a low-echo or echo-free region after it, so that region may be determined to be a sound shadow region. The threshold signal intensity can be set reasonably from prior experience; when the signal intensity of the echo signal is smaller than the threshold signal intensity, the corresponding region can generally be judged to be a sound shadow region.
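The critical-point criterion described above can be sketched as follows. The intensity values and both thresholds are hypothetical placeholders, since the text notes that the threshold intensity depends on the type of ultrasound imaging system.

```python
import numpy as np

def find_shadow_start(line, sat_thresh, low_thresh):
    """Return the index after which a scan line is considered shadowed,
    or None if no sound shadow is detected.

    A sample is a 'critical point' when its echo intensity exceeds
    sat_thresh (over-saturated, i.e. a strong reflection) and every
    sample beyond it stays below low_thresh (low/no echo).
    """
    for i, v in enumerate(line):
        if v > sat_thresh and np.all(line[i + 1:] < low_thresh):
            # The region after the critical point is the sound shadow.
            return i + 1
    return None
```

For instance, an echo line of [10, 20, 250, 2, 1, 0, 1] with a saturation threshold of 200 and a low-echo threshold of 5 is flagged as shadowed from index 3 onward.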
Or, a second ultrasonic wave may be emitted to the same imaging region scanned by the first ultrasonic wave, so as to obtain information of the sound shadow region and ensure that an ultrasonic image of the same imaging region is obtained by subsequent processing.
In this context, the imaging region refers to a region of a scan target to which ultrasound imaging is required, for example, the scan target may be an organ, a tissue, a blood vessel, or the like in a human or animal body. The imaging region (i.e., scanning region) of the first ultrasonic wave scan and the imaging region (scanning region) of the second ultrasonic wave refer to regions corresponding to beams received by the probe for subsequent image processing.
Next, with continued reference to fig. 2, in step S204, a second ultrasound image is obtained based on the ultrasound echo of the second ultrasonic wave, where the second ultrasound image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasound image, the compensation information comprising pixel value information of the pixel points of the sound shadow region.
The method of obtaining the second ultrasound image based on the ultrasound echo of the second ultrasonic wave may be any suitable method currently used in the art or developed in the future, and is not particularly limited herein.
Because the second frequency of the second ultrasonic wave is lower than the first frequency of the first ultrasonic wave, or the transmit energy of the second ultrasonic wave is higher than that of the first ultrasonic wave, the second ultrasonic wave can penetrate the sound shadow region and return echo signals. The second ultrasound image obtained from these echo signals therefore at least includes compensation information for compensating the missing information of the sound shadow region in the first ultrasound image, and the compensation information may include pixel value information of the pixel points of the sound shadow region.
It should be noted that, in this document, the second ultrasonic wave may be transmitted to the scan target once or multiple times. When it is transmitted multiple times, a set of sub-ultrasound images may be obtained from each transmission; the multiple sets of sub-ultrasound images obtained over the transmissions are superimposed and compounded to yield the final second ultrasound image.
Next, with continued reference to fig. 2, in step S205, the first ultrasound image and the second ultrasound image are compounded to obtain a target ultrasound image. The first ultrasonic image has higher imaging quality for the non-sound shadow area, and the second ultrasonic image has higher imaging quality for the sound shadow area, so that the target ultrasonic image obtained by compounding the first ultrasonic image and the second ultrasonic image has high quality.
For example, the first ultrasound image and the second ultrasound image may be composited according to different imaging modes required by a user by using an image processing unit 8 (e.g., a Graphics Processing Unit (GPU)) shown in fig. 1 to obtain a target ultrasound image.
In one example, the method of compounding the first and second ultrasound images to obtain a target ultrasound image may further include: identifying the pixel points in the first ultrasound image and the second ultrasound image that are located in the sound shadow region; and compounding at least the information of the pixel points in the sound shadow region of the second ultrasound image (for example, their pixel value information) with the first ultrasound image to obtain the target ultrasound image.
The pixel points located in the sound shadow region in the first and second ultrasound images may be identified by any suitable method. For example, they may be identified from the difference between the pixel values of corresponding pixel points in the second and first ultrasound images: pixel points whose pixel-value difference falls within a predetermined threshold interval are identified as belonging to the sound shadow region. Alternatively, they may be identified from the pixel values of the first ultrasound image alone: pixel points whose pixel values in the first ultrasound image are lower than a threshold pixel value can generally be identified as belonging to the sound shadow region. The pixel value information of the pixel points corresponding to the sound shadow region is then composited into the first ultrasound image, while the pixel value information of the non-sound-shadow region of the first ultrasound image is retained, so that the missing information of the sound shadow region in the first ultrasound image is compensated and a high-quality target ultrasound image is obtained.
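Both identification strategies described above can be sketched as follows; the threshold interval, the pixel-value threshold, and the sample pixel values are illustrative assumptions.

```python
import numpy as np

def shadow_mask_from_difference(img1, img2, lo, hi):
    """Mark a pixel as sound shadow when the difference between the
    second (low-frequency) and first (high-frequency) images falls
    inside a predetermined threshold interval [lo, hi].

    In the shadow region the first image is dark while the second
    retains tissue echoes, so the difference is large but bounded;
    extreme differences are treated as erroneous pixels.
    """
    diff = img2.astype(np.int32) - img1.astype(np.int32)
    return (diff >= lo) & (diff <= hi)

def shadow_mask_from_first_image(img1, pixel_thresh):
    """Alternative: pixels darker than pixel_thresh in the first image
    are taken to belong to the sound shadow region."""
    return img1 < pixel_thresh
```

On a toy pair of images, both strategies pick out the dark-in-the-first, bright-in-the-second pixels as the shadow region.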
In this context, the first and second ultrasound images may be Low Dynamic Range (LDR) ultrasound images, and the target ultrasound image may be a High Dynamic Range (HDR) ultrasound image.
In one example, compounding the first and second ultrasound images to obtain a target ultrasound image may further include steps S1 to S3:
first, in step S1, a first target weight value of each pixel point in the second ultrasound image is determined to generate a first target weight map, where the first target weight value corresponding to a pixel point of the sound shadow region in the second ultrasound image is greater than that corresponding to a pixel point of the non-sound-shadow region. In this way, the pixel value information of the sound shadow region in the second ultrasound image can compensate the information missing from the sound shadow region in the first ultrasound image without affecting the image quality of the non-sound-shadow region in the first ultrasound image.
In one example, as shown in fig. 4, determining a first target weight value for each pixel point within at least the sound shadow region in the second ultrasound image to generate a first target weight map includes the following steps A1 to A3:
in step A1, a first weight value of each pixel point is determined based on the difference between the pixel values of corresponding pixel points in the first and second ultrasound images, forming a first weight map as shown in fig. 5. When the difference is smaller than a first threshold, the first weight value increases with the difference; when the difference is greater than or equal to the first threshold, the first weight value decreases as the difference increases. The first threshold may be set reasonably from prior knowledge: pixel points whose difference is greater than or equal to the first threshold can generally be identified as obviously erroneous pixel points, and assigning them a smaller first weight value filters out these erroneous points. Optionally, pixel points whose difference lies within a threshold interval may be assigned the maximum weight value: such pixel points can generally be identified as belonging to the sound shadow region, so giving them the maximum weight ensures that their pixel value information in the second ultrasound image is compensated into the first ultrasound image during subsequent compounding. For pixel points whose difference is below the threshold interval, the first weight value increases with the difference; pixel points whose difference is above the threshold interval can be identified as obviously erroneous, and their first weight value decreases as the difference increases, filtering out the erroneous points.
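One possible shape for the first weight curve of step A1 (rising below the threshold interval, maximal inside it, falling above it) is sketched below. The linear ramps and the breakpoints are assumptions, as the text does not fix the exact curve.

```python
import numpy as np

def first_weight(diff, t_lo, t_hi):
    """First weight as a function of the per-pixel difference between
    the second and first ultrasound images.

    Rises linearly while diff < t_lo, is maximal (1.0) inside the
    threshold interval [t_lo, t_hi] (pixels taken as sound shadow),
    and falls back toward 0 for diff > t_hi (likely erroneous pixels).
    The ramp width beyond t_hi is assumed equal to t_lo.
    """
    diff = np.asarray(diff, dtype=float)
    # Rising ramp up to t_lo, then plateau at 1.0.
    w = np.where(diff < t_lo, diff / t_lo, 1.0)
    # Beyond t_hi, decay linearly back to zero.
    over = np.clip((diff - t_hi) / t_lo, 0.0, 1.0)
    return np.clip(w - over, 0.0, 1.0)
```

With t_lo = 50 and t_hi = 150, a difference of 100 (inside the interval) receives the maximum weight 1.0, while a difference of 400 (an obvious error) receives weight 0.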
In step A2, a second weight value of each pixel point is determined from the pixel value of that pixel point in the first ultrasound image, forming a second weight map as shown in fig. 6. When the pixel value is less than or equal to a second threshold, the second weight value may be a fixed weight value, for example roughly between 0.7 and 1 or in another suitable range: a pixel point whose pixel value in the first ultrasound image is less than or equal to the second threshold can essentially be judged to belong to the sound shadow region, so it is given a higher weight. When the pixel value is greater than the second threshold, the second weight value decreases as the pixel value increases, for example falling gradually from the fixed weight value to zero: such pixel points can essentially be judged to belong to the non-sound-shadow region, so the weight assigned to them is gradually reduced. The second weight value of pixel points whose pixel values are greater than or equal to a third threshold (the third threshold being greater than the second threshold) may be set to zero, because such pixel points necessarily belong to the non-sound-shadow region. In this way, pixel points of the sound shadow region receive larger weight values and pixel points of the non-sound-shadow region receive smaller ones.
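The second weight curve of step A2 might be sketched as follows, with a fixed weight of 0.85 (within the 0.7 to 1 range mentioned) at and below the second threshold and a linear fall to zero at the third threshold; the thresholds themselves are placeholders.

```python
import numpy as np

def second_weight(pixel, t2, t3, w_fixed=0.85):
    """Second weight as a function of a pixel value in the first image.

    Pixels at or below t2 (dark, likely sound shadow) get the fixed
    high weight w_fixed; between t2 and t3 the weight falls linearly
    to zero; at or above t3 (bright, certainly non-shadow) it is zero.
    Requires t2 < t3.
    """
    p = np.asarray(pixel, dtype=float)
    # 1.0 at p <= t2, linearly down to 0.0 at p >= t3.
    ramp = np.clip((t3 - p) / (t3 - t2), 0.0, 1.0)
    return w_fixed * ramp
```

For example, with t2 = 60 and t3 = 160, a dark pixel of value 10 keeps the full fixed weight while a bright pixel of value 160 is weighted zero.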
In step A3, the weight values of corresponding pixel points in the first weight map and the second weight map are multiplied to obtain the first target weight value of each pixel point; the first target weight values of all the pixel points together form the first target weight map.
Then, in step S2, as shown in fig. 4, the pixel value of each pixel point in the second ultrasound image is updated based on the first target weight map; for example, the pixel value of each pixel point in the second ultrasound image is multiplied by the corresponding first target weight value in the first target weight map. In the first target weight map obtained through the above steps, the first target weight values of pixel points in the sound shadow region are clearly higher than those of pixel points in the non-sound-shadow region. After updating, the pixel points of the sound shadow region in the second ultrasound image therefore have higher pixel values, compensating the missing information of the sound shadow region in the first ultrasound image, while the pixel points of the non-sound-shadow portion, having been given relatively smaller weights, have smaller pixel values, which reduces or avoids any influence on the non-sound-shadow information of the first ultrasound image during subsequent compounding.
Next, in step S3, the updated second ultrasound image and the first ultrasound image are compounded to obtain a target ultrasound image.
In one example, the updated second ultrasound image and the first ultrasound image may be compounded directly. Alternatively, continuing to refer to fig. 4, a second target weight value of each pixel point of the first ultrasound image is determined to generate a second target weight map. Each second target weight value may be set reasonably according to actual needs; for example, each second target weight value may be greater than or equal to 1, and preferably each may equal 1, i.e., the original first ultrasound image is compounded with the second ultrasound image without any processing. Alternatively, a larger second target weight value may be assigned to pixel points in the non-sound-shadow region and a smaller one to pixel points in the sound shadow region, so as to better retain the pixel value information of the non-sound-shadow region of the first ultrasound image. The pixel value of each pixel point of the first ultrasound image is then updated based on the second target weight map, i.e., multiplied by the corresponding second target weight value, to obtain the updated pixel values. The updated first ultrasound image and the updated second ultrasound image are then compounded to obtain the target ultrasound image; optionally, the pixel value of each pixel point of the target ultrasound image is the sum of the pixel values of the corresponding pixel points of the updated first and second ultrasound images.
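Steps S2 and S3 — scaling each image by its target weight map and summing pixel-wise — can be sketched as follows. The all-ones default for the second target weight map follows the "equal to 1" option mentioned above; the sample images and weights are illustrative.

```python
import numpy as np

def compound(img1, img2, w1_map, w2_map=None):
    """Compound the first (high-frequency) and second (low-frequency)
    ultrasound images.

    img2 is scaled by its first-target-weight map; img1 by its
    second-target-weight map (all ones when omitted, i.e. the first
    image is used as-is); the target image is their pixel-wise sum.
    """
    img1 = np.asarray(img1, dtype=float)
    img2 = np.asarray(img2, dtype=float)
    if w2_map is None:
        w2_map = np.ones_like(img1)
    return img1 * w2_map + img2 * w1_map
```

A pixel that is bright in the first image gets a near-zero first target weight and so stays unchanged, while a shadowed (zero-valued) pixel is filled in entirely from the second image.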
Because the weight value of each pixel point is independent, the compounded image weakens or eliminates the sound shadow without affecting the image quality of the non-sound-shadow region.
For example, the first ultrasound image and the second ultrasound image may be composited according to different imaging modes required by a user by using an image processing unit 8 (e.g., a Graphics Processing Unit (GPU)) as shown in fig. 1, so as to obtain a target ultrasound image of one frame.
Furthermore, the method of compounding the first ultrasound image (or the updated first ultrasound image) and the second ultrasound image (or the updated second ultrasound image) to obtain the target ultrasound image may be any suitable method used in the art, now or in the future.
The target ultrasound image may include a one-dimensional ultrasound image, a two-dimensional ultrasound image, or an ultrasound image of another dimension.
In one example, as shown in fig. 4, the method further comprises performing gray mapping processing on the target ultrasound image. The gray mapping may use any suitable method, for example a tone mapping algorithm, to map the target ultrasound image into a grayscale target ultrasound image with gray values from 0 to 255. Adjusting the gray levels of the image in this way makes the processed image more comfortable to the human eye and better expresses the information and features of the original image.
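A minimal stand-in for the gray mapping step might look like the following. A global gamma curve is assumed here purely for illustration; a real system could use a more elaborate local tone-mapping operator.

```python
import numpy as np

def tone_map(hdr, gamma=0.5):
    """Map a high-dynamic-range compound image to 0-255 gray levels.

    Normalizes to [0, 1], applies a global gamma curve to lift dark
    (formerly shadowed) regions, then quantizes to 8-bit gray values.
    """
    hdr = np.asarray(hdr, dtype=float)
    norm = hdr / max(hdr.max(), 1e-12)  # normalize to [0, 1]
    return np.round(255.0 * norm ** gamma).astype(np.uint8)
```

With gamma below 1, mid-range values are pulled upward: a pixel at one quarter of the maximum maps to the middle of the 0-255 range rather than a quarter of it.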
It should be noted that the order of the steps shown in fig. 2 may be changed as appropriate; for example, steps S203 and S204 may be placed before step S201. At least some of the steps in fig. 2 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times and in different orders, alternately or at least partially interleaved with other steps or with the sub-steps or stages of other steps.
Based on the foregoing ultrasound imaging method, an embodiment of the present application further provides an ultrasound imaging system, as shown in fig. 1, including:
a probe 1;
a transmitting circuit 2 for exciting the probe 1 to transmit a first ultrasonic wave of a first frequency to a scan target and a second ultrasonic wave of a second frequency to the scan target at least once;
a receiving circuit 4 and a beam forming circuit 5 for receiving the ultrasonic echo of the first ultrasonic wave transmitted each time and receiving the ultrasonic echo of the second ultrasonic wave transmitted each time, respectively;
the processor 6 is configured to: obtaining a first ultrasonic image based on an ultrasonic echo of the first ultrasonic wave; obtaining a second ultrasonic image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasonic image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, and the compensation information comprises pixel value information of pixel points of the sound shadow region; compositing the first and second ultrasound images to obtain a target ultrasound image, optionally the target ultrasound image comprises a one-dimensional ultrasound image or a two-dimensional ultrasound image.
In one example, the ultrasound imaging system further includes a display (not shown) for displaying information input by or provided to the user and the various graphical user interfaces of the ultrasound imaging apparatus, which may be composed of graphics, text, icons, video, and any combination thereof. In this embodiment, the display may show the target ultrasound image. The display may include a display panel, which may optionally be configured as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
In one example, the ultrasound imaging system further includes a storage device (not shown), which may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. On which one or more computer program instructions may be stored that may be executed by processor 6 to implement the functions of the embodiments of the application described herein (as implemented by the processor) and/or other desired functions. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
In one example, the ultrasound imaging system further includes an input device (not shown) which may be a device used by a user to input instructions and may include one or more of a keyboard, mouse, microphone, touch screen, and the like.
In one embodiment of the present application, the processor 6 is further configured to: identifying a sound shadow region in the scan target based on intensity information of echo signals of a scan line of the first ultrasonic wave. Illustratively, the processor is specifically configured to: and detecting intensity information of echo signals of a scanning line of the first ultrasonic wave, wherein when the signal intensity of a critical point in the echo signals is detected to be over-saturated, and the signal intensities of the echo signals in the direction along the scanning line after the critical point are all smaller than a threshold signal intensity, an area in the direction along the scanning line after the critical point is determined to be the sound shadow area.
In one embodiment of the present application, the first frequency is greater than the second frequency. Alternatively, the emission energy of the second ultrasonic wave is higher than the emission energy of the first ultrasonic wave, for example, the emission waveform length of the second ultrasonic wave is longer than the emission waveform length of the first ultrasonic wave.
In one embodiment of the present application, the processor 6 is further configured to: identify the pixel points in the first ultrasound image and the second ultrasound image that are located in the sound shadow region; and compound at least the information of the pixel points in the sound shadow region of the second ultrasound image with the first ultrasound image to obtain the target ultrasound image. More specifically, the processor 6 is configured to identify the pixel points located in the sound shadow region based on the difference between the pixel values of corresponding pixel points in the second and first ultrasound images, determining the pixel points whose pixel-value difference is within a predetermined threshold to be the pixel points corresponding to the sound shadow region.
In one embodiment of the present application, the processor 6 is further configured to: determining a first target weight value of each pixel point in the second ultrasonic image to generate a first target weight map; updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map; compounding the updated second ultrasound image and the first ultrasound image to obtain a target ultrasound image.
In one embodiment of the present application, the processor 6 is further configured to: determining a second target weight value of each pixel point of the first ultrasonic image to generate a second target weight map; updating the pixel value of each pixel point of the first ultrasonic image based on the second target weight map; and compounding the updated first ultrasonic image and the updated second ultrasonic image to obtain a target ultrasonic image, wherein optionally, the pixel value of each pixel point of the target ultrasonic image is the sum of the pixel value of the corresponding pixel point of the updated first ultrasonic image and the pixel value of the corresponding pixel point of the updated second ultrasonic image.
In one embodiment of the present application, the processor 6 is further configured to: determining a first weight value of each pixel point based on the difference between the pixel values of corresponding pixel points in the first ultrasonic image and the second ultrasonic image to form a first weight map, wherein the first weight value increases with the difference when the difference is smaller than a first threshold, and decreases with the difference when the difference is greater than or equal to the first threshold; determining a second weight value of each pixel point according to the pixel value of each pixel point in the first ultrasonic image to form a second weight map, wherein, when the pixel value of a pixel point is greater than a second threshold, the second weight value decreases as the pixel value increases (optionally, each second target weight value in the second target weight map described above is greater than or equal to 1); and finally multiplying the weight values of the corresponding pixel points of the first weight map and the second weight map to obtain the first target weight values, so as to form the first target weight map.
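The two-stage weighting can be sketched as follows. The patent only fixes the monotonic behavior around the two thresholds, so the triangular/linear curve shapes and the threshold values `t1`/`t2` below are assumptions:

```python
import numpy as np

def first_target_weight_map(img_hi, img_lo, t1=64.0, t2=180.0):
    """Build the first target weight map for the second (low-frequency)
    image. w1 rises with the inter-image difference below t1 and falls
    above it, so moderate differences (likely shadow) get the largest
    weight; w2 shrinks where the first image is already bright, so
    well-imaged tissue is left mostly untouched."""
    diff = np.abs(img_lo.astype(np.float32) - img_hi.astype(np.float32))
    # w1: increases up to t1, then decreases (clipped to [0, 1])
    w1 = np.where(diff < t1, diff / t1,
                  np.clip(1.0 - (diff - t1) / t1, 0.0, 1.0))
    # w2: 1 below the brightness threshold t2, decaying linearly above it
    pix = img_hi.astype(np.float32)
    w2 = np.where(pix > t2,
                  np.clip(1.0 - (pix - t2) / (255.0 - t2), 0.0, 1.0), 1.0)
    # elementwise product of the two weight maps
    return w1 * w2
```

The product structure means a pixel must both show a meaningful inter-image difference *and* be dark in the first image before the second image contributes strongly there.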
In one embodiment of the present application, the processor 6 is further configured to:
multiplying the pixel value of each pixel point of the second ultrasonic image by the corresponding first target weight value in the first target weight map, so as to update the pixel value of each pixel point of the second ultrasonic image.
In one embodiment of the present application, the processor 6 is further configured to: carrying out gray mapping processing on the target ultrasonic image.
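The patent leaves the gray mapping curve unspecified. A gamma lookup table is one common choice in B-mode post-processing and is used here purely as an illustrative stand-in:

```python
import numpy as np

def gray_map(img, gamma=0.7):
    """Apply an assumed gamma-curve gray mapping to an 8-bit image via a
    256-entry lookup table. gamma < 1 brightens mid-tones, which helps
    after compounding pulls shadow pixels up from near-black."""
    lut = (255.0 * (np.arange(256) / 255.0) ** gamma + 0.5).astype(np.uint8)
    return lut[img]  # integer-array indexing applies the LUT per pixel
```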
In addition, an embodiment of the present application further provides a computer storage medium on which a computer program is stored. One or more computer program instructions may be stored on the computer-readable storage medium and executed by a processor to implement the functions of the embodiments of the present application described herein and/or other desired functions, for example to perform the corresponding steps of the ultrasound imaging method according to the embodiments of the present application. Various applications and various data, such as data used and/or generated by those applications, may also be stored in the computer-readable storage medium.
For example, the computer storage medium may include a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a portable compact disc read-only memory (CD-ROM), a USB memory, or any combination of the above storage media.
In summary, according to the ultrasound imaging method and system of the embodiments of the present application, a first ultrasonic wave of a first frequency is transmitted to a scan target; a first ultrasonic image is obtained based on an ultrasonic echo of the first ultrasonic wave; a second ultrasonic wave of a second frequency is transmitted at least once to the scan target; a second ultrasonic image is obtained based on an ultrasonic echo of the second ultrasonic wave, the second ultrasonic image at least comprising compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, the compensation information comprising pixel value information of pixel points of the sound shadow region; and the first ultrasonic image and the second ultrasonic image are compounded to obtain a target ultrasonic image. In this way, the sound shadow in the final target ultrasonic image is removed or weakened without affecting the image quality of the non-shadow portions, improving the confidence and efficiency of a doctor's diagnosis based on the target ultrasonic image.
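The summarized flow can be exercised end to end on a synthetic shadow. Everything numeric here (thresholds `t1`/`t2`, the weight-curve shapes, the all-ones second target weight map) is an assumption; the patent fixes only the qualitative behavior:

```python
import numpy as np

def shadow_compensate(img_hi, img_lo, t1=64.0, t2=180.0):
    """Sketch of the full pipeline: weight the low-frequency image where
    it carries information the high-frequency image lacks (likely
    shadow), keep the high-frequency image at unit weight (>= 1, per the
    optional embodiment), and sum."""
    hi = img_hi.astype(np.float32)
    lo = img_lo.astype(np.float32)
    diff = np.abs(lo - hi)
    # first target weight map for the low-frequency image
    w1 = np.where(diff < t1, diff / t1,
                  np.clip(1.0 - (diff - t1) / t1, 0.0, 1.0))
    w2 = np.where(hi > t2,
                  np.clip(1.0 - (hi - t2) / (255.0 - t2), 0.0, 1.0), 1.0)
    w_lo = w1 * w2
    # second target weight map: unit weight preserves well-imaged parts
    w_hi = np.ones_like(w_lo)
    return np.clip(hi * w_hi + lo * w_lo, 0.0, 255.0).astype(np.uint8)

# synthetic demo: columns 2-3 are shadowed (black) in the high-frequency
# image, while the more penetrating low-frequency image still sees tissue
hi = np.full((4, 4), 120, dtype=np.uint8); hi[:, 2:] = 0
lo = np.full((4, 4), 110, dtype=np.uint8)
out = shadow_compensate(hi, lo)
```

In the demo, the shadowed columns are lifted from zero toward the low-frequency brightness, while the unshadowed columns stay at or above their original level.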
Although the example embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above-described example embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present application. All such changes and modifications are intended to be included within the scope of the present application as claimed in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the present application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features that are included in other embodiments but not others, combinations of features of different embodiments are meant to be within the scope of the application and to form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some of the modules according to embodiments of the present application. The present application may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any ordering; these words may be interpreted as names.

Claims (34)

1. An ultrasound imaging method, characterized in that it comprises:
transmitting a first ultrasonic wave of a first frequency to a scanning target;
obtaining a first ultrasonic image based on an ultrasonic echo of the first ultrasonic wave;
transmitting a second ultrasonic wave of a second frequency at least once to the scanning target;
obtaining a second ultrasonic image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasonic image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, and the compensation information comprises pixel value information of pixel points of the sound shadow region;
the first and second ultrasound images are compounded to obtain a target ultrasound image.
2. The ultrasound imaging method of claim 1, wherein said transmitting at least one second ultrasound wave of a second frequency to said scan target further comprises:
identifying sound shadow areas in the scan target;
and transmitting second ultrasonic waves of a second frequency to the sound shadow area at least once.
3. The ultrasound imaging method of claim 2, wherein said identifying sound shadow regions in said scan target comprises:
identifying a sound shadow region in the scan target based on intensity information of echo signals of a scan line of the first ultrasonic wave.
4. The ultrasonic imaging method according to claim 3, wherein identifying a sound shadow region in the scan target based on the intensity information of the echo signal of each scan line of the first ultrasonic wave comprises:
detecting intensity information of echo signals of a scan line of the first ultrasonic wave, wherein, when the signal intensity at a critical point in the echo signals is detected to be oversaturated and the signal intensities of the echo signals after the critical point in the direction along the scan line are all smaller than a threshold signal intensity, the area after the critical point in the direction along the scan line is determined to be the sound shadow area.
5. An ultrasound imaging method according to any of claims 1 to 4, wherein the first frequency is greater than the second frequency.
6. An ultrasound imaging method according to any of claims 1 to 5, characterized in that the emission energy of the second ultrasound waves is higher than the emission energy of the first ultrasound waves.
7. The ultrasonic imaging method according to claim 6, wherein a transmission waveform length of the second ultrasonic wave is longer than a transmission waveform length of the first ultrasonic wave.
8. The ultrasound imaging method of claim 1, wherein the compounding the first ultrasound image and the second ultrasound image to obtain a target ultrasound image comprises:
identifying the pixel points in the first ultrasonic image and the second ultrasonic image that are located in the sound shadow area;
compounding the information of at least the pixel points in the sound shadow area of the second ultrasonic image with the first ultrasonic image to obtain the target ultrasonic image.
10. The ultrasound imaging method of claim 8, wherein identifying the pixel points in the first ultrasonic image and the second ultrasonic image that are located in the sound shadow area comprises:
identifying the pixel points located in the sound shadow area based on the difference between the pixel values of corresponding pixel points in the second ultrasonic image and the first ultrasonic image, and determining the pixel points whose pixel-value difference is within a preset threshold as the pixel points corresponding to the sound shadow area.
10. The ultrasound imaging method of claim 1, wherein the compounding the first ultrasound image and the second ultrasound image to obtain a target ultrasound image comprises:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
compounding the updated second ultrasound image and the first ultrasound image to obtain a target ultrasound image.
11. The ultrasound imaging method of claim 1, wherein the compounding the first ultrasound image and the second ultrasound image to obtain a target ultrasound image comprises:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
determining a second target weight value of each pixel point of the first ultrasonic image to generate a second target weight map;
updating the pixel value of each pixel point of the first ultrasonic image based on the second target weight map;
compounding the updated first ultrasound image and the updated second ultrasound image to obtain a target ultrasound image.
12. The ultrasound imaging method of claim 10, wherein determining a first target weight value for each pixel point in the second ultrasound image to generate a first target weight map comprises:
determining a first weight value of each pixel point based on a difference value of pixel values of corresponding pixel points in the first ultrasonic image and the second ultrasonic image to form a first weight map, wherein the first weight value increases with the increase of the difference value when the difference value is smaller than a first threshold value, and the first weight value decreases with the increase of the difference value when the difference value is greater than or equal to the first threshold value;
determining a second weight value of each pixel point according to the pixel value of each pixel point in the first ultrasonic image to form a second weight map, wherein when the pixel value of the pixel point is greater than a second threshold value, the second weight value is reduced along with the increase of the pixel value;
and multiplying the weight values of the corresponding pixel points of the first weight map and the second weight map to obtain the first target weight value so as to form the first target weight map.
13. The ultrasound imaging method of claim 10, wherein updating the pixel value of each pixel point in the second ultrasound image based on the first target weight map comprises:
multiplying the pixel value of each pixel point of the second ultrasonic image by the corresponding first target weight value in the first target weight map, so as to update the pixel value of each pixel point of the second ultrasonic image.
14. The ultrasound imaging method of claim 11, wherein
the pixel value of each pixel point of the target ultrasonic image is the sum of the pixel value of the corresponding pixel point of the updated first ultrasonic image and the pixel value of the corresponding pixel point of the updated second ultrasonic image.
15. The method of ultrasound imaging according to claim 11, wherein each of said second target weight values in the second target weight map is greater than or equal to 1.
16. The ultrasound imaging method of claim 1, further comprising:
carrying out gray mapping processing on the target ultrasonic image.
17. The ultrasound imaging method of claim 1, wherein the target ultrasound image comprises a one-dimensional ultrasound image or a two-dimensional ultrasound image.
18. An ultrasound imaging system, comprising:
a probe;
the transmitting circuit is used for exciting the probe to transmit first ultrasonic waves with a first frequency to a scanning target and transmit second ultrasonic waves with a second frequency to the scanning target at least once;
a processor to:
obtaining a first ultrasonic image based on an ultrasonic echo of the first ultrasonic wave;
obtaining a second ultrasonic image based on an ultrasonic echo of the second ultrasonic wave, wherein the second ultrasonic image at least comprises compensation information for compensating the missing information of the sound shadow region in the first ultrasonic image, and the compensation information comprises pixel value information of pixel points of the sound shadow region;
the first and second ultrasound images are compounded to obtain a target ultrasound image.
19. The ultrasound imaging system of claim 18, wherein the processor is further configured to identify sound shadow regions in the scan target;
the transmitting circuit is also used for exciting the probe to transmit second ultrasonic waves of a second frequency to the sound shadow area at least once.
20. The ultrasound imaging system of claim 19, wherein the processor is specifically configured to:
identifying a sound shadow region in the scan target based on intensity information of echo signals of a scan line of the first ultrasonic wave.
21. The ultrasound imaging system of claim 20, wherein the processor is specifically configured to:
detecting intensity information of echo signals of a scan line of the first ultrasonic wave, wherein, when the signal intensity at a critical point in the echo signals is detected to be oversaturated and the signal intensities of the echo signals after the critical point in the direction along the scan line are all smaller than a threshold signal intensity, the area after the critical point in the direction along the scan line is determined to be the sound shadow area.
22. The ultrasound imaging system of any of claims 18 to 21, wherein the first frequency is greater than the second frequency.
23. The ultrasound imaging system of any of claims 18 to 22, wherein the second ultrasound waves have a higher emission energy than the first ultrasound waves.
24. The ultrasound imaging system of claim 23, wherein
the transmission waveform length of the second ultrasonic wave is longer than the transmission waveform length of the first ultrasonic wave.
25. The ultrasound imaging system of claim 18, wherein the processor is specifically configured to:
identifying the pixel points in the first ultrasonic image and the second ultrasonic image that are located in the sound shadow area;
compounding the information of at least the pixel points in the sound shadow area of the second ultrasonic image with the first ultrasonic image to obtain the target ultrasonic image.
26. The ultrasound imaging system of claim 25, wherein the processor is specifically configured to:
identifying the pixel points located in the sound shadow area based on the difference between the pixel values of corresponding pixel points in the second ultrasonic image and the first ultrasonic image, and determining the pixel points whose pixel-value difference is within a preset threshold as the pixel points corresponding to the sound shadow area.
27. The ultrasound imaging system of claim 18, wherein the processor is specifically configured to:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
compounding the updated second ultrasound image and the first ultrasound image to obtain a target ultrasound image.
28. The ultrasound imaging system of claim 18, wherein the processor is specifically configured to:
determining a first target weight value of each pixel point of at least the sound shadow area in the second ultrasonic image to generate a first target weight map;
updating the pixel value of each pixel point in the second ultrasonic image based on the first target weight map;
determining a second target weight value of each pixel point of the first ultrasonic image to generate a second target weight map;
updating the pixel value of each pixel point of the first ultrasonic image based on the second target weight map;
compounding the updated first ultrasound image and the updated second ultrasound image to obtain a target ultrasound image.
29. The ultrasound imaging system of claim 27, wherein the processor is further configured to:
determining a first weight value of each pixel point based on a difference value of pixel values of corresponding pixel points in the first ultrasonic image and the second ultrasonic image to form a first weight map, wherein the first weight value increases with the increase of the difference value when the difference value is smaller than a first threshold value, and the first weight value decreases with the increase of the difference value when the difference value is greater than or equal to the first threshold value;
determining a second weight value of each pixel point according to the pixel value of each pixel point in the first ultrasonic image to form a second weight map, wherein when the pixel value of the pixel point is greater than a second threshold value, the second weight value is reduced along with the increase of the pixel value;
and multiplying the weight values of the corresponding pixel points of the first weight map and the second weight map to obtain the first target weight value so as to form the first target weight map.
30. The ultrasound imaging system of claim 27, wherein the processor is further configured to:
multiplying the pixel value of each pixel point of the second ultrasonic image by the corresponding first target weight value in the first target weight map, so as to update the pixel value of each pixel point of the second ultrasonic image.
31. The ultrasound imaging system of claim 30, wherein
the pixel value of each pixel point of the target ultrasonic image is the sum of the pixel value of the corresponding pixel point of the updated first ultrasonic image and the pixel value of the corresponding pixel point of the updated second ultrasonic image.
32. The ultrasound imaging system of claim 28, wherein each of the second target weight values in the second target weight map is greater than or equal to 1.
33. The ultrasound imaging system of claim 19, wherein the processor is further configured to:
carrying out gray mapping processing on the target ultrasonic image.
34. The ultrasound imaging system of claim 19, wherein the target ultrasound image comprises a one-dimensional ultrasound image or a two-dimensional ultrasound image.
CN201910785078.1A 2019-08-23 2019-08-23 Ultrasound imaging method and system Pending CN112401934A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910785078.1A CN112401934A (en) 2019-08-23 2019-08-23 Ultrasound imaging method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910785078.1A CN112401934A (en) 2019-08-23 2019-08-23 Ultrasound imaging method and system

Publications (1)

Publication Number Publication Date
CN112401934A true CN112401934A (en) 2021-02-26

Family

ID=74780097

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910785078.1A Pending CN112401934A (en) 2019-08-23 2019-08-23 Ultrasound imaging method and system

Country Status (1)

Country Link
CN (1) CN112401934A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117994185A * 2022-10-27 2024-05-07 数坤(深圳)智能网络科技有限公司 Ultrasound image acquisition method, device, medical scanning equipment and storage medium
CN115670520A * 2023-01-06 2023-02-03 深圳微创踪影医疗装备有限公司 Intravascular ultrasonic imaging method and device, computer equipment and storage medium
WO2024146622A1 * 2023-01-06 2024-07-11 深圳微创踪影医疗装备有限公司 Intravascular ultrasound imaging method and apparatus, computer device and storage medium

Similar Documents

Publication Publication Date Title
US9123139B2 (en) Ultrasonic image processing with directional interpolation in order to increase the resolution of an image
KR102014504B1 (en) Shadow suppression in ultrasound imaging
JP7199972B2 (en) Information processing device, information processing method, program
US8343054B1 (en) Methods and apparatus for ultrasound imaging
US20140050048A1 (en) Harmonic Ultrasound Imaging Using Synthetic Aperture Sequential Beamforming
JP2016093277A (en) Medical image processing apparatus, ultrasonic diagnostic apparatus, medical image processing method and medical image processing program
CN113017682A (en) Ultrasonic imaging equipment and method
CN112401934A (en) Ultrasound imaging method and system
CN102209496B (en) Ultrasonographic device and method for processing signal of ultrasonographic device
CN112294354B (en) Ultrasound imaging method and system
JP2006122666A (en) Ultrasonic imaging apparatus
CN110731795B (en) Processing method and device for spatial compound imaging
US20100113931A1 (en) Ultrasound System And Method For Providing Three-Dimensional Ultrasound Images
JP2014161478A (en) Ultrasonic diagnostic apparatus and control program for the same
US20120216617A1 (en) Method and system for nondestructive ultrasound testing
US20140276045A1 (en) Method and apparatus for processing ultrasound data using scan line information
US9757091B2 (en) Ultrasound diagnosis apparatus, medical image-processing apparatus, and method of processing medical images
JP2008220652A (en) Ultrasonic diagnostic apparatus and ultrasonic image generation program
US20110218440A1 (en) Ultrasonic diagnostic apparatus and signal processing method in ultrasonic diagnostic apparatus
JP2007222264A (en) Ultrasonograph
US20230305126A1 (en) Ultrasound beamforming method and device
CN113260315B (en) Contrast imaging
US11953591B2 (en) Ultrasound imaging system with pixel extrapolation image enhancement
US11224410B2 (en) Methods and systems for filtering ultrasound image clutter
CN113177930A (en) Ultrasonic image frequency compounding method and device, ultrasonic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination