CN111415308A - Ultrasonic image processing method and communication terminal - Google Patents

Ultrasonic image processing method and communication terminal

Info

Publication number: CN111415308A (granted as CN111415308B)
Application number: CN202010177729.1A
Authority: CN (China)
Prior art keywords: edge, value, pixel, image, preset
Other languages: Chinese (zh)
Inventors: 付传卿, 丁勇, 孙会超, 于倩蕾
Current Assignee: Qingdao Hisense Medical Equipment Co Ltd
Application filed by Qingdao Hisense Medical Equipment Co Ltd; priority to CN202010177729.1A
Legal status: Granted; Active

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration › G06T5/70 Denoising; Smoothing
    • G06T5/00 Image enhancement or restoration › G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T2207/00 Indexing scheme for image analysis or image enhancement › G06T2207/10 Image acquisition modality › G06T2207/10132 Ultrasound image
    • G06T2207/00 Indexing scheme for image analysis or image enhancement › G06T2207/20 Special algorithmic details › G06T2207/20212 Image combination › G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an ultrasound image processing method and a communication terminal. The method comprises: dividing an ultrasound image to be processed into a non-edge region and an edge region; for each pixel in the image, determining the dispersion value of the pixel values in each preset direction within a preset neighborhood of the pixel; if the pixel lies in the non-edge region, filtering the pixel according to its dispersion values in the preset directions; and if the pixel lies in the edge region, determining the edge direction from the dispersion values and gradient values in the preset directions and performing directional smoothing and edge enhancement along the determined edge direction. The method improves the distinguishability and continuity of tissues and organs and addresses at least the sharp points, flickering points, and image blurring caused by edge enhancement in the related art.

Description

Ultrasonic image processing method and communication terminal
Technical Field
The embodiments of the application relate to the field of image processing, and in particular to an ultrasound image processing method and a communication terminal.
Background
An ultrasonic imaging system acquires ultrasound images of tissues and organs from the ultrasonic return signals it receives. Because tissues and organs are non-uniform and ultrasonic signals interfere with one another, the acquired ultrasound images contain obvious speckle noise, and the edges of tissues and organs are not distinct enough. These problems not only impair the distinguishability and continuity of the tissues and organs but also reduce the accuracy and intelligibility of the ultrasound images.
In the related art, pixels in the edge region are given directional enhancement based on gradient values on top of conventional isotropic filtering, which produces sharp points or flickering points after processing. Likewise, the related art applies isotropic mean filtering to pixels in the non-edge region, which blurs the image.
Disclosure of Invention
The embodiments of the application provide an ultrasound image processing method and a terminal to solve at least one of the following problems: directional enhancement based only on gradient values easily produces sharp points or flickering points, and isotropic mean filtering of non-edge pixels easily blurs the image.
According to one aspect of the exemplary embodiments, an ultrasound image processing method is provided, including:
dividing an ultrasound image to be processed into a non-edge region and an edge region;
for each pixel in the ultrasound image to be processed, determining the dispersion value of the pixel values in each preset direction within a preset neighborhood of the pixel;
if the pixel lies in the non-edge region, filtering the pixel according to its dispersion values in the preset directions; and
if the pixel lies in the edge region, determining the edge direction from the dispersion values and gradient values in the preset directions, and performing directional smoothing and edge enhancement along the determined edge direction.
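For orientation, the following Python sketch shows how these four steps might fit together. It is a minimal sketch, not the patent's implementation: the four step implementations are injected as callables, and concrete sketches of each appear later in this text.

```python
import numpy as np
from typing import Callable, List

def process_ultrasound_image(
    img: np.ndarray,
    split_regions: Callable[[np.ndarray], np.ndarray],
    dispersion_values: Callable[[np.ndarray, int, int], List[float]],
    filter_non_edge: Callable[..., float],
    smooth_and_enhance: Callable[..., float],
) -> np.ndarray:
    """Hypothetical top-level flow of the claimed method."""
    edge_mask = split_regions(img)               # step 1: edge / non-edge division
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for y in range(1, h - 1):                    # interior pixels only
        for x in range(1, w - 1):
            disp = dispersion_values(img, y, x)  # step 2: per-direction dispersion
            if not edge_mask[y, x]:
                out[y, x] = filter_non_edge(img, y, x, disp)     # step 3
            else:
                out[y, x] = smooth_and_enhance(img, y, x, disp)  # step 4
    return out
```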
This embodiment yields the following benefits. The ultrasound image to be processed is first divided into an edge region and a non-edge region, so each region can be processed in a targeted way that preserves its own characteristics. Filtering non-edge pixels according to their dispersion values ensures a smooth transition to neighboring pixels, avoiding abrupt points and flickering points without blurring the image. For edge pixels, the dispersion values and gradient values together select the edge direction well, and directional smoothing along that direction likewise keeps edge pixels free of sharp points and flickering points. Combining this with edge enhancement of the edge-region pixels improves the accuracy and intelligibility of the ultrasound image.
In some exemplary embodiments, the method further comprises:
determining, for each pixel, the number of pixels within its preset neighborhood that have been assigned to the edge region;
if that number falls within a specified range, determining that the pixel lies at the junction of the non-edge region and the edge region, the lower limit of the specified range being a first number threshold and the upper limit being a second number threshold;
for each pixel at the junction, filtering the pixel according to its dispersion values in the preset directions to obtain a non-edge pixel value;
determining the edge direction from the dispersion values and gradient values in the preset directions, and performing directional smoothing and edge enhancement along the determined edge direction to obtain an edge pixel value; and
weighting and summing the non-edge pixel value and the edge pixel value with a set weight coefficient to obtain the processed pixel value, as sketched below.
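A minimal sketch of that junction fusion, assuming a single fixed weight coefficient `w_edge` (the patent only requires a set coefficient, so the value 0.5 used here is an assumption):

```python
def fuse_junction_pixel(edge_value: float, non_edge_value: float,
                        w_edge: float = 0.5) -> float:
    # Weighted sum of the edge-processed and the non-edge-processed results;
    # w_edge is the set weight coefficient, assumed here to lie in [0, 1].
    return w_edge * edge_value + (1.0 - w_edge) * non_edge_value
```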
This embodiment yields the following benefit: it specifies how the junction of the edge and non-edge regions is fused, so that junction pixels combine edge and non-edge characteristics. This matches the nature of junction pixels and avoids the visible seam that appears when the junction is processed purely as edge or purely as non-edge.
In some exemplary embodiments, dividing the ultrasound image to be processed into the non-edge region and the edge region includes one of three modes.
Mode one: determine the gradient value of every pixel in the ultrasound image to be processed and obtain the preset edge-area ratio for that image; assign the pixels whose gradient values fall within the preset ratio to a first edge region, and take the first edge region as the edge region of the image. The gradient values of the pixels within the preset ratio are all higher than those of the pixels outside it.
This embodiment yields the following benefit: it divides edge and non-edge regions using a preset ratio together with gradient values. Because the preset ratio can be set per type of ultrasound image, the division can be adapted to each ultrasound imaging subdomain (i.e., different kinds of ultrasound images), making the division of edge and non-edge regions more accurate.
Mode two: assign the pixels whose gray values exceed a specified gray threshold to a second edge region, and take the second edge region as the edge region of the image.
This embodiment yields the following benefit: it divides the regions using the gray-value characteristics of edges; the division is simple to implement and improves the accuracy of the edge/non-edge division.
Mode three: take the union of the pixels in the first edge region and the second edge region as the edge region of the ultrasound image to be processed.
This embodiment yields the following benefit: it combines the gradient-based division of mode one with a further gray-value search for edge pixels. Identifying edge pixels from two different characteristics (gradient values and gray values) makes the division of edge and non-edge regions more accurate still.
In some exemplary embodiments, filtering a pixel according to its dispersion values in the preset directions includes:
assigning each preset direction a weight according to the pixel's dispersion value in that direction, where the weights over all preset directions sum to a specified value and each direction's weight is negatively correlated with its dispersion value; and
taking the weighted sum of the pixel values in the preset directions as the pixel's filtered value.
This embodiment yields the following benefit: non-edge pixels are filtered using the dispersion values in the preset directions of each pixel's preset neighborhood. This avoids the blurring and false edges that the related art produces with linear or nonlinear filtering, so non-edge pixels are filtered more effectively.
In some exemplary embodiments, determining the edge direction from the pixel's dispersion values and gradient values in the preset directions includes:
determining the first direction, i.e. the preset direction with the maximum dispersion value, and the second direction, i.e. the direction with the maximum gradient value;
when the first and second directions coincide, taking that direction as the edge direction, the edge direction being the direction in which the pixel has the maximum dispersion value and/or maximum gradient value; and
when the first and second directions differ but are not perpendicular to each other, taking both the first direction and the second direction as edge directions.
This embodiment yields the following benefit: it specifies how the edge direction is determined within the edge region, providing the basis for the subsequent directional smoothing and edge enhancement.
In some exemplary embodiments, the dispersion value is a variance-to-mean ratio: the ratio of the variance of the pixel values to their mean in each preset direction.
This embodiment yields the following benefit: the variance-to-mean ratio is a preferred realization of the dispersion value that is easy to implement and cheap to compute, and it supports the edge/non-edge division and the corresponding processing of each pixel.
In some exemplary embodiments, performing directional smoothing along the determined edge direction includes:
smoothing the pixel using the pixels of the preset neighborhood that lie along the edge direction, and determining a first pixel value after the smoothing.
Performing edge enhancement along the determined edge direction includes:
determining the one-dimensional second-order differential value of the pixel in the edge direction; obtaining the variance of the pixel values along the edge direction within the preset neighborhood and a preset multiplier factor;
determining a second pixel value after edge enhancement from the one-dimensional second-order differential value, the variance, and the preset multiplier factor, each of which is positively correlated with the second pixel value; and
taking the sum of the first pixel value and the second pixel value as the pixel's value after directional smoothing and edge enhancement.
This embodiment yields the following benefit: it specifies how directional smoothing and edge enhancement are applied in the edge region, improving the quality of the processed ultrasound image.
In some exemplary embodiments, the second pixel value is determined according to the following formula:
R_out2 = laplace * σ * α
where R_out2 is the second pixel value, laplace is the one-dimensional second-order differential value in the edge direction, σ is the variance, and α is the preset multiplier factor.
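As a hedged illustration of this formula, the sketch below computes R_out2 for the centre pixel of a short 1-D profile sampled along the edge direction; the value of α is an assumed placeholder, since the patent leaves the multiplier factor preset:

```python
import numpy as np

def second_pixel_value(line: np.ndarray, alpha: float = 0.3) -> float:
    """R_out2 = laplace * sigma * alpha for the centre pixel of `line`,
    a 1-D profile taken along the edge direction (alpha = 0.3 is assumed)."""
    c = len(line) // 2
    laplace = line[c - 1] - 2.0 * line[c] + line[c + 1]  # 1-D second-order difference
    sigma = float(np.var(line))        # variance of the values along the direction
    return float(laplace) * sigma * alpha
```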
This embodiment yields the following benefit: the edge enhancement formula improves the enhancement of image edges while suppressing the point-flicker that edge enhancement otherwise introduces into real-time ultrasound display.
In some exemplary embodiments, acquiring the image to be processed before dividing it into the non-edge region and the edge region includes:
acquiring an original ultrasound image and performing multi-scale decomposition on it to obtain images of at least two scales, each of which serves as an ultrasound image to be processed.
The method then further comprises:
after each ultrasound image to be processed has been processed, performing multi-scale reconstruction and fusion on the processed images, and outputting the fused ultrasound image.
This embodiment yields the following benefit: because the diagnostically relevant information in an ultrasound image usually appears in details at multiple scales, decomposing the original image into multiple scales and fusing the per-scale results preserves the image information of every scale.
In some exemplary embodiments, the multi-scale decomposition is a pyramid decomposition, and the multi-scale reconstruction and fusion of the processed ultrasound images includes:
fusing the images of all scales into the single image of the largest scale, proceeding from the smallest scale to the largest;
when two adjacent scales are fused, the image of the smaller scale is first enlarged to the size of the larger scale, and the pixel values at the same positions in the two images are then weight-summed to fuse the smaller-scale image into the larger-scale one, as sketched below.
This embodiment yields the following benefit: after the original ultrasound image is decomposed into ultrasound images of multiple scales and each is processed, the corresponding multi-scale reconstruction and fusion produce the final processed ultrasound image.
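A minimal sketch of that adjacent-scale fusion, assuming nearest-neighbour upscaling and an assumed weight `w_coarse` for the smaller-scale layer (the patent leaves both the interpolation and the weights unspecified):

```python
import numpy as np

def fuse_adjacent_scales(coarse: np.ndarray, fine: np.ndarray,
                         w_coarse: float = 0.5) -> np.ndarray:
    # Enlarge the smaller-scale image to the size of the larger-scale one ...
    up = coarse.repeat(2, axis=0).repeat(2, axis=1)  # nearest-neighbour 2x upscale
    pad_h = max(fine.shape[0] - up.shape[0], 0)      # pad/crop to handle odd sizes
    pad_w = max(fine.shape[1] - up.shape[1], 0)
    up = np.pad(up, ((0, pad_h), (0, pad_w)), mode='edge')
    up = up[:fine.shape[0], :fine.shape[1]]
    # ... then weight-sum the pixel values at the same positions.
    return w_coarse * up + (1.0 - w_coarse) * fine

def reconstruct(layers: list) -> np.ndarray:
    """Fuse all scales into the largest-scale image, smallest scale first."""
    ordered = sorted(layers, key=lambda a: a.size)   # small -> large
    fused = ordered[0]
    for nxt in ordered[1:]:
        fused = fuse_adjacent_scales(fused, nxt)
    return fused
```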
According to another aspect of the exemplary embodiments, there is provided a communication terminal including:
an input/output unit configured to receive an ultrasound image formed from an ultrasound return signal and to output the ultrasound image for display;
a display panel configured to display the display interface of an application program showing the ultrasound image;
a backlight assembly positioned at the rear surface of the display panel and comprising a plurality of backlight partitions, each of which can emit light of a different brightness; and
a processor connected to the input/output unit, the display panel, and the backlight assembly, configured to:
divide an ultrasound image to be processed into a non-edge region and an edge region;
for each pixel in the ultrasound image to be processed, determine the dispersion value of the pixel values in each preset direction within a preset neighborhood of the pixel;
if the pixel lies in the non-edge region, filter the pixel according to its dispersion values in the preset directions; and
if the pixel lies in the edge region, determine the edge direction from the dispersion values and gradient values in the preset directions, and perform directional smoothing and edge enhancement along the determined edge direction.
In some exemplary embodiments, the processor is further configured to:
determine, for each pixel, the number of pixels within its preset neighborhood that have been assigned to the edge region;
if that number falls within a specified range, determine that the pixel lies at the junction of the non-edge region and the edge region, the lower limit of the specified range being a first number threshold and the upper limit being a second number threshold;
for each pixel at the junction, filter the pixel according to its dispersion values in the preset directions to obtain a non-edge pixel value;
determine the edge direction from the dispersion values and gradient values in the preset directions, and perform directional smoothing and edge enhancement along the determined edge direction to obtain an edge pixel value; and
weight and sum the non-edge pixel value and the edge pixel value with a set weight coefficient to obtain the processed pixel value.
In some exemplary embodiments, when dividing the ultrasound image to be processed into a non-edge region and an edge region, the processor is configured to:
determine the gradient value of every pixel in the ultrasound image to be processed and obtain the preset edge-area ratio for that image; assign the pixels whose gradient values fall within the preset ratio to a first edge region, the gradient values of those pixels all being higher than the gradient values of the pixels outside the preset ratio; and take the first edge region as the edge region of the image; or
assign the pixels whose gray values exceed a specified gray threshold to a second edge region, and take the second edge region as the edge region of the image; or
take the union of the pixels in the first edge region and the second edge region as the edge region of the ultrasound image to be processed.
In some exemplary embodiments, when filtering a pixel according to its dispersion values in the preset directions, the processor is configured to:
assign each preset direction a weight according to the pixel's dispersion value in that direction, the weights over all preset directions summing to a specified value and each direction's weight being negatively correlated with its dispersion value; and
take the weighted sum of the pixel values in the preset directions as the pixel's filtered value.
In some exemplary embodiments, when determining the edge direction from the pixel's dispersion values and gradient values in the preset directions, the processor is configured to:
determine the first direction, i.e. the preset direction with the maximum dispersion value, and the second direction, i.e. the direction with the maximum gradient value;
when the first and second directions coincide, take that direction as the edge direction, the edge direction being the direction in which the pixel has the maximum dispersion value and/or maximum gradient value; and
when the first and second directions differ but are not perpendicular to each other, take both the first direction and the second direction as edge directions.
In some exemplary embodiments, the dispersion value is a variance-to-mean ratio: the ratio of the variance of the pixel values to their mean in each preset direction.
In some exemplary embodiments, when performing directional smoothing along the determined edge direction, the processor is configured to:
smooth the pixel using the pixels of the preset neighborhood that lie along the edge direction, and determine a first pixel value after the smoothing.
When performing edge enhancement along the determined edge direction, the processor is configured to:
determine the one-dimensional second-order differential value of the pixel in the edge direction; obtain the variance of the pixel values along the edge direction within the preset neighborhood and a preset multiplier factor;
determine a second pixel value after edge enhancement from the one-dimensional second-order differential value, the variance, and the preset multiplier factor, each of which is positively correlated with the second pixel value; and
take the sum of the first pixel value and the second pixel value as the pixel's value after directional smoothing and edge enhancement.
In some exemplary embodiments, the second pixel value is determined according to the following formula:
R_out2 = laplace * σ * α
where R_out2 is the second pixel value, laplace is the one-dimensional second-order differential value in the edge direction, σ is the variance, and α is the preset multiplier factor.
In some exemplary embodiments, before dividing the ultrasound image to be processed into the non-edge region and the edge region, the processor is configured to:
acquire an original ultrasound image and perform multi-scale decomposition on it to obtain images of at least two scales, each of which serves as an ultrasound image to be processed.
The processor is further configured to:
after each ultrasound image to be processed has been processed, perform multi-scale reconstruction and fusion on the processed images, and output the fused ultrasound image.
In some exemplary embodiments, the multi-scale decomposition is a pyramid decomposition, and when performing multi-scale reconstruction and fusion on the processed ultrasound images, the processor is configured to:
fuse the images of all scales into the single image of the largest scale, proceeding from the smallest scale to the largest;
when two adjacent scales are fused, first enlarge the image of the smaller scale to the size of the larger scale, and then weight-sum the pixel values at the same positions in the two images to fuse the smaller-scale image into the larger-scale one.
According to a further aspect of the exemplary embodiments, there is provided a computer storage medium having stored therein computer program instructions which, when run on a computer, cause the computer to perform the ultrasound image processing method as described above.
The ultrasound image processing method and communication terminal provided in the embodiments of the present application comprise: dividing an ultrasound image to be processed into a non-edge region and an edge region; for each pixel in the image, determining the dispersion value of the pixel values in each preset direction within a preset neighborhood of the pixel; if the pixel lies in the non-edge region, filtering the pixel according to its dispersion values in the preset directions; and if the pixel lies in the edge region, determining the edge direction from the dispersion values and gradient values in the preset directions, and performing directional smoothing and edge enhancement along the determined edge direction. The method improves the distinguishability and continuity of tissues and organs while largely avoiding image blurring, the "step" effect, and similar problems.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Evidently, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a communication terminal according to an embodiment of the present application;
fig. 2 is a schematic view of an application scenario of an ultrasound image processing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an application principle according to an embodiment of the present application;
fig. 4 is a flowchart illustrating an ultrasound image processing method according to an embodiment of the present application;
FIG. 5 is a schematic view of the preset directions according to an embodiment of the present application;
fig. 6 is an effect diagram of an ultrasound image processing method according to an embodiment of the present application;
fig. 7 is a schematic flowchart of another ultrasound image processing method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a communication terminal according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments that those skilled in the art can derive from them without creative effort fall within the protection scope of the present application.
Fig. 1 shows a schematic configuration of a communication terminal 100.
The following embodiment is described taking the communication terminal 100 as a specific example. It should be understood that the communication terminal 100 shown in fig. 1 is only an example; it may have more or fewer components than shown in fig. 1, may combine two or more components, or may have a different component configuration. The various components shown may be implemented in hardware, software, or a combination of the two, including one or more signal-processing and/or application-specific integrated circuits.
A block diagram of a hardware configuration of a communication terminal 100 according to an exemplary embodiment is exemplarily shown in fig. 1.
As shown in fig. 1, the communication terminal 100 may include, for example: RF (radio frequency) circuit 110, memory 120, display unit 130, ultrasound image acquisition device 140, sensor 150, audio circuit 160, Wireless Fidelity (Wi-Fi) module 170, processor 180, bluetooth module 181, and power supply 190. In the embodiment of the present application, the input/output unit may be at least one of the audio circuit 160, the bluetooth module 181, the Wi-Fi module 170, and the ultrasound image capturing device 140.
The RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 180 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 120 may be used to store software programs and data. The processor 180 performs the various functions of the communication terminal 100 and processes data by running the software programs or data stored in the memory 120. The memory 120 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The memory 120 stores an operating system that enables the communication terminal 100 to operate, and may also store various application programs as well as code implementing the terminal data processing method of the embodiments of the present application.
The display unit 130 may be used to display input numbers, characters, or image information and to generate signal inputs related to the user settings and function control of the communication terminal 100. Specifically, the display unit 130 may include a touch screen 131 disposed on the front of the communication terminal 100, which collects the user's touch operations on or near it, such as clicking a button or dragging a scroll box.
The display unit 130 may also display the display interface of an application program, for example a photographing screen. Specifically, the display unit 130 may include a display screen 132 disposed on the front surface of the communication terminal 100, configured, for example, as a liquid crystal display or a light-emitting diode display. The display unit 130 may display the interfaces of the various embodiments described in this application.
The touch screen 131 may cover the display screen 132, or the two may be integrated to implement the input and output functions of the communication terminal 100; after integration they may be referred to simply as a touch display screen. In the present application, the display unit 130 may display the application programs and the corresponding operation steps.
As an input/output device, the ultrasound image acquisition apparatus 140 may be used to capture still images or video. An object forms an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor; it converts the optical signal into an electrical signal, which is then passed to the processor 180 for conversion into a digital image signal. The audio circuit 160, Bluetooth module 181, Wi-Fi module 170, and the like may interact with other devices (e.g., medical instruments or other components of the communication terminal), for example to receive an image or output a captured image to another device.
The communication terminal 100 may further comprise at least one sensor 150, such as an acceleration sensor 151, a distance sensor 152, a fingerprint sensor 153, a temperature sensor 154. The communication terminal 100 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, optical sensor, motion sensor, and the like.
The audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between a user and the communication terminal 100. The audio circuit 160 may convert received audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output. The communication terminal 100 may also be provided with a volume button for adjusting the volume of the sound signal. Conversely, the microphone 162 converts collected sound into an electrical signal, which the audio circuit 160 receives and converts into audio data; the audio data is then output to the RF circuit 110 for transmission to, for example, another terminal, or to the memory 120 for further processing. In this application, the microphone 162 may capture the user's voice.
Wi-Fi is a short-range wireless transmission technology; through the Wi-Fi module 170, the communication terminal 100 may help the user send and receive e-mail, browse webpages, access streaming media, and so on, providing wireless broadband Internet access.
The processor 180 is a control center of the communication terminal 100, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the communication terminal 100 and processes data by running or executing software programs stored in the memory 120 and calling data stored in the memory 120.
In some embodiments, processor 180 may include one or more processing units; the processor 180 may also integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a baseband processor, which mainly handles wireless communications. It will be appreciated that the baseband processor described above may also be integrated into the processor 180. In the present application, the processor 180 may run an operating system, an application program, a user interface display, and a touch response, and the processing method described in the embodiments of the present application. In addition, the processor 180 is coupled with the input-output unit and the display unit.
And the bluetooth module 181 is configured to perform information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the communication terminal 100 may establish a bluetooth connection with a device (e.g., a medical device) having a bluetooth module through the bluetooth module 181, so as to perform data interaction.
The communication terminal 100 also includes a power supply 190 (such as a battery) to power the various components. The power supply may be logically connected to the processor 180 through a power management system to manage charging, discharging, power consumption, etc. through the power management system. The communication terminal 100 may also be configured with power buttons for powering the terminal on and off, and for locking the screen.
Referring to fig. 2, an application scenario of the ultrasound image processing method provided in the embodiment of the present application is schematically illustrated, where the scenario includes a user 200, a terminal 201, a backend server 202, and a probe 203.
In the method, ultrasonic waves are first transmitted through the probe 203. After the background server 202 receives the signals returned in response, it processes them and generates an original ultrasound image. Because a directly acquired ultrasound image contains obvious speckle noise, and in order to output a more accurate and intelligible ultrasound image, the embodiments of the application divide the pixels of the ultrasound image into an edge region and a non-edge region and then process the pixels of each region accordingly, based on each pixel's dispersion values in the preset directions of its preset neighborhood. Finally, the user 200 can view the processed ultrasound image on the display panel of the terminal 201. The processed image preserves the distinguishability and continuity of tissues and organs, improving the accuracy and intelligibility of the ultrasound image.
The terminal 201 and the backend server 202 may be communicatively connected through a communication network, which may be a local area network, a wide area network, or the like.
It should be noted that the application scenario shown in fig. 2 is only an example, and the embodiment of the present application does not limit this.
An ultrasonic imaging system acquires ultrasound image information of tissues and organs from the ultrasonic return signals it receives. Because tissues and organs are non-uniform and ultrasonic signals interfere with one another, the acquired ultrasound images contain obvious speckle noise, and the edges of tissues and organs are not distinct enough. These problems not only impair the distinguishability and continuity of the tissues and organs but also reduce the accuracy and intelligibility of the ultrasound images.
In the related art, conventional isotropic filtering considers only the gradient values of pixels when applying directional enhancement in the edge region, which produces abrupt points or flickering points after processing, while isotropic mean filtering of non-edge pixels blurs the image. Alternatively, anisotropic diffusion filtering based on partial differential equations is used, but its diffusion process can cause problems such as the "step" effect.
Fig. 3 is a schematic diagram of an application principle according to an embodiment of the present application. This part can be implemented by some of the modules or functional components of the communication terminal shown in fig. 1; only the main components are described below, and other components, such as the memory, controller, and control circuit, are not repeated here.
As shown in fig. 3, the application environment may include a user interface 310, provided via the input/output unit and awaiting user operation; a display unit 320 for displaying the user interface; and a processor 330 controlling the display of the user interface.
The display unit 320 may include a display panel 321, a backlight assembly 322. The display panel 321 is configured to display an image, the backlight assembly 322 is disposed at the back of the display panel 321, and the backlight assembly 322 may include a plurality of backlight partitions (not shown), each of which may emit light with different brightness to illuminate the display panel 321.
The processor 330 may be configured to control the backlight brightness of each backlight partition in the backlight assembly 322 and cause each backlight partition in the backlight assembly to light the display panel 321 according to the corresponding backlight brightness.
The processor 330 may include an analysis unit 331 and a transmission unit 332. The analysis unit 331 may be configured to divide the ultrasound image to be processed into a non-edge region and an edge region; determine, for each pixel in the image, the dispersion value of the pixel values in each preset direction within a preset neighborhood of the pixel; if the pixel lies in the non-edge region, filter the pixel according to its dispersion values in the preset directions; and if the pixel lies in the edge region, determine the edge direction from the dispersion values and gradient values in the preset directions and perform directional smoothing and edge enhancement along the determined edge direction. The transmission unit 332 is configured to transmit the processed ultrasound image to the display unit 320 for display.
In view of the above, the present application provides an ultrasound image processing method that mainly comprises: first, dividing the pixels of the ultrasound image to be processed into a non-edge region and an edge region; and then, according to the dispersion value of the pixel values in each preset direction within each pixel's preset neighborhood, filtering the pixels of the non-edge region and applying directional smoothing and edge enhancement to the pixels of the edge region.
Based on the above description, fig. 4 is a schematic flowchart illustrating in detail a flow of an ultrasound image processing method according to an embodiment of the present application, where the flow specifically includes:
step 401: dividing the ultrasonic image to be processed into a non-edge area and an edge area.
In an alternative embodiment, the acquired original ultrasound image itself is taken as the ultrasound image to be processed. In another alternative embodiment, because the diagnostically relevant information in an ultrasound image usually appears in details at multiple scales, the original ultrasound image is acquired and decomposed into images of at least two scales, each of which serves as an ultrasound image to be processed. The multi-scale decomposition may be implemented as a pyramid decomposition, a wavelet multi-scale decomposition, or the like; this application does not limit the choice.
In the above embodiment, if the original ultrasound image is decomposed into several images of different scales by pyramid decomposition, the length and width of each next-layer image are half the length and width of the previous layer, yielding images of multiple scales, each carrying different detail information of the original ultrasound image. For example, assuming the original ultrasound image has dimensions M x N, the dimensions of the successive layers after pyramid decomposition are shown in Table 1:
TABLE 1
Layer 1 (original image): M x N
Layer 2: M/2 x N/2
Layer 3: M/4 x N/4
Layer k: M/2^(k-1) x N/2^(k-1)
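A minimal pyramid-style decomposition consistent with Table 1, using plain 2x2 block averaging as an assumed stand-in for the pyramid's low-pass-and-downsample step (the patent does not fix the kernel):

```python
import numpy as np

def pyramid_decompose(img: np.ndarray, levels: int = 3) -> list:
    """Each layer halves the length and width of the previous one (Table 1)."""
    layers = [img.astype(np.float64)]
    for _ in range(levels - 1):
        a = layers[-1]
        h, w = (a.shape[0] // 2) * 2, (a.shape[1] // 2) * 2  # trim odd edges
        a = a[:h, :w]
        # 2x2 block average: an assumed low-pass before the 2x downsampling.
        down = (a[0::2, 0::2] + a[0::2, 1::2] +
                a[1::2, 0::2] + a[1::2, 1::2]) / 4.0
        layers.append(down)
    return layers
```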
After the ultrasound images to be processed have been determined, each is divided into a non-edge region and an edge region. The division can be made in three possible ways:
the first method is as follows: firstly, determining gradient values of all pixel points in the ultrasonic image to be processed, and acquiring a preset ratio of an edge area corresponding to the ultrasonic image to be processed. During implementation, the gradient value of each pixel point is determined, and optionally, the gradient value of each pixel point is determined through a Sobel operator (Sobel operator), which is assumed to be G; wherein G is a non-negative value; in addition, the embodiment of determining the gradient map in the present application is not limited to the sobel operator, and for example, an optimization method of the sobel operator may be adopted.
And the preset occupation ratio of the edge region is acquired and assumed to be gamma, wherein gamma is a proportional value and is expressed as the proportional value of the number of the pixel points belonging to the edge region in all the pixel points in the whole ultrasonic image. It should be noted that the ultrasound images of different parts of the body have different preset ratios of the peripheral regions, and the preset ratio is an empirical value, for example, the ultrasound images of the stomach and the ultrasound images of the chest have different preset ratios.
After the gradient value and the preset ratio are determined, dividing pixel points of the gradient value in the preset ratio range into a first edge area of the ultrasonic image to be processed. In implementation, because the pixel points in the edge region have the characteristic of higher gradient, the pixel points in the gradient value which belong to the preset ratio range are determined as a first edge region; for example, if the preset ratio is 30%, and the ultrasound image to be processed has 100 pixel points in total, 30 pixel points are sequentially taken from the highest gradient value as the pixel points of the edge region. Therefore, the gradient values of the pixels in the preset ratio range are all higher than the gradient values of the pixels outside the preset ratio range, that is, the gradient values of the pixels belonging to the edge region are higher than the gradient values of the pixels belonging to the non-edge region.
Then, the first edge area is determined as an edge area in the ultrasonic image to be processed.
Mode two: assign the pixels of the ultrasound image whose gray values exceed a specified gray threshold to a second edge region. Because the brighter regions of the image are generally edge regions, a relatively high gray threshold T is specified so that no edge information is missed, and every pixel whose gray value exceeds T is assigned to the edge region.
The second edge region is then taken as the edge region of the ultrasound image to be processed.
Mode three: to ensure that the pixels identified as the edge region match the actual edge region more closely, the union of the pixels in the first edge region and the second edge region is taken as the edge region of the ultrasound image to be processed. Determining the edge region as the union of the pixels selected by the gradient-value mode and by the gray-value mode improves the accuracy of the edge/non-edge division, as sketched below.
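A sketch of mode three under stated assumptions: a hand-rolled 3x3 Sobel magnitude, a quantile cut for the preset ratio γ, and example values for γ and the gray threshold T (both are empirical, per-image presets in the patent):

```python
import numpy as np

def divide_edge_region(img: np.ndarray, gamma: float = 0.3,
                       gray_threshold: float = 200.0) -> np.ndarray:
    """Boolean edge mask: union of mode one (top-gamma gradients) and
    mode two (gray value > T). gamma = 0.3 and T = 200 are assumed values."""
    f = img.astype(np.float64)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    pad = np.pad(f, 1, mode='edge')
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    for dy in range(3):                      # 3x3 cross-correlation
        for dx in range(3):
            win = pad[dy:dy + f.shape[0], dx:dx + f.shape[1]]
            gx += kx[dy, dx] * win
            gy += ky[dy, dx] * win
    grad = np.hypot(gx, gy)                  # non-negative gradient value G
    first_edge = grad >= np.quantile(grad, 1.0 - gamma)  # mode one
    second_edge = f > gray_threshold                     # mode two
    return first_edge | second_edge                      # mode three: union
```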
Step 402: for each pixel in the ultrasound image to be processed, determine the dispersion value of the pixel values in each preset direction within the pixel's preset neighborhood.
In one possible implementation, the dispersion value is the variance-to-mean ratio of the pixel values in each preset direction of the pixel's preset neighborhood, i.e. the ratio of the variance of the pixel values in that direction to their mean. In implementation, suppose the preset neighborhood of the pixel is a 3 x 3 neighborhood containing 4 preset directions (see fig. 5). The variance-to-mean ratio of each preset direction j is determined by the formula
ratio_j = ((1/N) * Σ_i (x_i - mean_j)^2) / mean_j
where j denotes a preset direction, i indexes the pixels in that direction, N is the number of pixels in the direction, x_i is the pixel value of each pixel in the direction, and mean_j is the average pixel value in the direction, determined by
mean_j = (1/N) * Σ_i x_i
For example, for direction 1 in fig. 5, the mean of the corresponding pixels is mean_1, the average of the pixel values lying along direction 1, and the corresponding variance-to-mean ratio is ratio_1 = ((1/N) * Σ_i (x_i - mean_1)^2) / mean_1.
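A sketch of this computation. The layout of fig. 5 is not reproduced in this text, so the direction numbering below is an assumption: the four preset directions are taken at 45-degree steps (horizontal, 45 degrees, vertical, 135 degrees), which makes directions 1 and 3 the perpendicular pair, consistent with the worked example in step 404:

```python
import numpy as np

def variance_mean_ratio(values: np.ndarray) -> float:
    """ratio_j: variance of the pixel values along one preset direction,
    divided by their mean (a tiny epsilon guards against a zero mean)."""
    mean = float(np.mean(values))
    var = float(np.mean((values - mean) ** 2))
    return var / max(mean, 1e-12)

# Assumed layout of the 4 preset directions in a 3x3 neighbourhood.
DIRECTION_OFFSETS = {
    1: [(0, -1), (0, 0), (0, 1)],    # direction 1: horizontal (0 degrees)
    2: [(1, -1), (0, 0), (-1, 1)],   # direction 2: 45-degree diagonal
    3: [(-1, 0), (0, 0), (1, 0)],    # direction 3: vertical (90 degrees)
    4: [(-1, -1), (0, 0), (1, 1)],   # direction 4: 135-degree diagonal
}

def dispersion_values(img: np.ndarray, y: int, x: int) -> list:
    """Variance-to-mean ratio of the centre pixel in each preset direction."""
    ratios = []
    for j in sorted(DIRECTION_OFFSETS):
        line = np.array([img[y + dy, x + dx]
                         for dy, dx in DIRECTION_OFFSETS[j]], dtype=np.float64)
        ratios.append(variance_mean_ratio(line))
    return ratios
```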
Step 403: if the pixel lies in the non-edge region, filter the pixel according to its dispersion values in the preset directions.
In implementation, each preset direction is assigned a weight according to the pixel's dispersion value in that direction; the weights over all preset directions sum to a specified value, and each direction's weight is negatively correlated with its dispersion value. The weighted sum of the pixel values in the preset directions is then taken as the pixel's filtered value.
Suppose the pixel's preset neighborhood contains 4 preset directions with variance-to-mean ratios ratio_1, ratio_2, ratio_3, and ratio_4, and that ratio_1 > ratio_2 > ratio_3 > ratio_4. The corresponding weights then satisfy w_1 < w_2 < w_3 < w_4, where the weights w_j of the preset directions sum to 1. Given the weight of each preset direction, the weighted sum of the pixel values is determined by the formula
Value = w_1 * value_1 + w_2 * value_2 + w_3 * value_3 + w_4 * value_4
where Value is the pixel's value after filtering and value_j is the pixel value of the pixel in preset direction j of the preset neighborhood.
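The patent constrains the weights only to sum to 1 and to fall as the dispersion rises; inverse-ratio normalisation, used in the sketch below, is one assumed scheme that satisfies both constraints:

```python
import numpy as np

def filter_non_edge(direction_values: np.ndarray, ratios: np.ndarray) -> float:
    """Value = sum_j w_j * value_j, with w_j decreasing in ratio_j and
    sum(w_j) == 1. direction_values holds the per-direction pixel values
    (e.g. the mean along each preset direction)."""
    inv = 1.0 / (np.asarray(ratios, dtype=np.float64) + 1e-12)
    w = inv / inv.sum()                  # normalised inverse-dispersion weights
    return float(np.dot(w, direction_values))
```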
Step 404: if the pixel lies in the edge region, determine the edge direction from the dispersion values and gradient values in the preset directions, and perform directional smoothing and edge enhancement along the determined edge direction.
In implementation, a first direction corresponding to the maximum discrete degree value among the discrete degree values in the preset directions is determined, and a second direction corresponding to the maximum gradient value among the gradient values in the preset directions is determined. When the first direction and the second direction are the same direction, that direction is determined to be the edge direction; wherein the edge direction represents the direction in which the pixel point has the maximum discrete degree value and/or the maximum gradient value. When the first direction and the second direction are different directions that are not perpendicular to each other, both the first direction and the second direction are determined to be edge directions.
To more clearly understand how to determine the edge direction, see table 2 below:
TABLE 2
Relationship between the first direction and the second direction | Whether an edge direction is determined
Same direction | The first direction is determined as the edge direction
Different directions, not perpendicular to each other | Both the first direction and the second direction are determined as edge directions
Different directions, perpendicular to each other | No edge direction is determined
For example, referring to the preset direction diagram in fig. 5, if the first direction corresponding to the maximum discrete degree value is direction 1 and the second direction corresponding to the maximum gradient value is also direction 1, the first direction and the second direction are the same direction, and direction 1 is determined to be the edge direction. Alternatively, if the first direction is direction 1 and the second direction is direction 2, the two are different directions but not perpendicular to each other, so both direction 1 and direction 2 are regarded as edge directions. Finally, if the first direction is direction 1 and the second direction is direction 3, the two are different directions and perpendicular to each other, so neither is taken as the edge direction.
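The decision logic of Table 2 can be sketched as follows; the pairing of directions 1/3 and 2/4 as the perpendicular pairs follows the fig. 5 convention assumed earlier.

```python
# Perpendicular pairs among the preset directions, per the assumed fig. 5
# convention (1 is perpendicular to 3, 2 is perpendicular to 4).
PERPENDICULAR = {1: 3, 3: 1, 2: 4, 4: 2}

def edge_directions(ratios, gradients):
    """ratios, gradients: {direction: value}; returns the list of edge directions."""
    first = max(ratios, key=ratios.get)         # direction of maximum discrete degree value
    second = max(gradients, key=gradients.get)  # direction of maximum gradient value
    if first == second:
        return [first]                          # same direction: it is the edge direction
    if PERPENDICULAR[first] == second:
        return []                               # perpendicular: no edge direction determined
    return [first, second]                      # otherwise both are edge directions
```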
After determining the edge direction, the two aspects of directional smoothing and edge enhancement processing are performed on the edge direction, which are described below:
(I) directional smoothing
Image smoothing is performed on the pixel point according to the pixel points in the preset neighborhood that lie in the direction perpendicular to the edge direction, and a first pixel value after the image smoothing is determined. In implementation, the adopted image smoothing method is optional: the first pixel value of the center pixel point after the image smoothing processing may be determined as the average value of the pixel values of the pixel points in the direction perpendicular to the edge direction, and is denoted R_out1 below. In other embodiments of the image smoothing process, those pixel values may instead be weighted and averaged; the present application does not limit the implementation of the directional smoothing processing.
For example, referring to fig. 5, if the edge direction is direction 1, the average value of the pixels in direction 3 is used as the first pixel value R_out1 of the directional smoothing of the pixel point. Alternatively, if the edge directions are direction 1 and direction 2, it is first determined that the direction perpendicular to direction 1 is direction 3 and the direction perpendicular to direction 2 is direction 4; then, optionally, the first pixel value R_out1 of the directional smoothing of the pixel point is determined as the average value of the pixels in direction 3 and direction 4.
(II) edge enhancement treatment
A one-dimensional second-order differential value of the pixel point in the edge direction is determined; in implementation, the one-dimensional second-order differential value of the pixel point in the edge direction is determined by the following formula:

laplace = x_{i+1} + x_{i-1} - 2 * x_i

wherein x_i represents the pixel value of the pixel point itself, namely the center pixel point; x_{i+1} and x_{i-1} are the pixel values of the two pixel points adjacent to the center pixel point in the edge direction.
A variance value of the pixel values of the pixel points in the edge direction within the preset neighborhood and a preset multiplier factor are acquired, and a second pixel value after the edge enhancement processing is determined according to the one-dimensional second-order differential value, the variance value and the preset multiplier factor; the one-dimensional second-order differential value, the variance value and the preset multiplier factor are each positively correlated with the second pixel value. The sum of the first pixel value and the second pixel value is determined as the pixel value of the pixel point after the directional smoothing processing and the edge enhancement processing.
In the above embodiment, the second pixel value is determined according to the following formula:
R_out2=laplace*σ*α
wherein R_out2 represents the second pixel value, σ represents the variance value, and α represents the preset multiplier factor.
The sum of the first pixel value and the second pixel value is determined as the pixel value of the pixel point after the directional smoothing processing and the edge enhancement processing; that is, the final result of processing a pixel point in the edge region is determined by the following formula:
R_out=R_out1+R_out2
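A combined sketch of the two edge-region steps follows, reusing DIRECTIONS and PERPENDICULAR from the sketches above; the default value of the multiplier factor alpha is an assumption, and when two edge directions exist the sketch simply uses the first one.

```python
def process_edge_pixel(img, r, c, edge_dir, alpha=0.5):
    """Directional smoothing plus edge enhancement for an edge-region pixel."""
    # (I) Directional smoothing: R_out1 is the mean of the pixels on the
    # line perpendicular to the edge direction.
    perp_vals = [float(img[r + dr, c + dc])
                 for dr, dc in DIRECTIONS[PERPENDICULAR[edge_dir]]]
    r_out1 = sum(perp_vals) / len(perp_vals)

    # (II) Edge enhancement: one-dimensional second-order difference along
    # the edge direction, scaled by the directional variance and alpha.
    (dr0, dc0), _, (dr2, dc2) = DIRECTIONS[edge_dir]
    x_prev = float(img[r + dr0, c + dc0])
    x_mid = float(img[r, c])
    x_next = float(img[r + dr2, c + dc2])
    laplace = x_prev + x_next - 2.0 * x_mid

    line = [x_prev, x_mid, x_next]
    mean = sum(line) / 3.0
    sigma = sum((v - mean) ** 2 for v in line) / 3.0  # variance along the edge direction
    r_out2 = laplace * sigma * alpha

    return r_out1 + r_out2  # R_out = R_out1 + R_out2
```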
In addition, since a pixel point at the junction between the edge region and the non-edge region carries information from both regions, an obvious trace of the region division may appear there after processing. In implementation, it is therefore necessary to perform fusion processing on the pixel points located in the junction region.
In this embodiment, after each pixel point in the ultrasonic image to be processed is divided into the non-edge region or the edge region, the pixel points located at the junction are determined according to the division result: for each pixel point, the number of pixel points divided into the edge region within its preset neighborhood is determined, and if that number falls within a specified range, the pixel point is determined to be located at the junction between the non-edge region and the edge region, wherein the lower limit of the specified range is a first number threshold and the upper limit is a second number threshold.
For each pixel point located at the junction, image filtering processing is performed on the pixel point according to its discrete degree values in the preset directions to obtain a non-edge pixel value of the pixel point, denoted P_1; an edge direction is determined according to the discrete degree values and gradient values of the pixel point in the preset directions, and directional smoothing processing and edge enhancement processing are performed according to the determined edge direction to obtain an edge pixel value of the pixel point, denoted P_2. The non-edge pixel value and the edge pixel value are then weighted and summed according to a set weight coefficient α to obtain the processed pixel value P_out of the pixel point, determined according to the following formula:

P_out = P_1 * α + P_2 * (1 - α)
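A minimal sketch of this fusion follows, reusing the two processing functions above. Note that the patent reuses the symbol α both for the multiplier factor and for this fusion weight; in the sketch, alpha denotes the fusion weight only.

```python
def process_junction_pixel(img, r, c, edge_dir, alpha=0.5):
    """Blend the non-edge and edge results for a pixel at the region junction."""
    p1 = filter_non_edge_pixel(img, r, c)         # non-edge pixel value P_1
    p2 = process_edge_pixel(img, r, c, edge_dir)  # edge pixel value P_2
    return p1 * alpha + p2 * (1.0 - alpha)        # P_out = P_1*α + P_2*(1-α)
```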
In a possible application scenario, a rotation button is provided on the display unit 130 shown in fig. 1 for adjusting the weight coefficient α, so that fusion effect maps under different weight coefficients α can be obtained; the weight coefficient α corresponding to the effect map with the most suitable fusion result can then be determined from the obtained effect maps.
It should be noted that, in one possible embodiment, the fusion processing is performed directly on the pixel points determined to be located in the junction region. Alternatively, in another possible embodiment, the image is first divided into the edge region and the non-edge region and the corresponding image processing operations are performed, since the processing of the edge region mainly enhances edge information while the processing of the non-edge region mainly reduces noise at its pixel points; after the edge region and the non-edge region have been divided and processed separately, the above fusion processing can additionally be performed on the pixel points of the junction region in order to avoid obvious artificial traces.
If the images to be processed are the scale images obtained by multi-scale decomposition, each ultrasonic image to be processed is divided into an edge region and a non-edge region and the corresponding processing is performed respectively; multi-scale reconstruction and fusion are then performed on the processed ultrasonic images, and the fused ultrasonic image is output. The images of the scales are fused into one layer of image of the largest scale in order from the smallest scale to the largest: when the ultrasonic images of two adjacent scales are fused, the image of the smaller scale is first enlarged to the size of the image of the larger scale, and the pixel values of the pixel points at the same positions in the two images are then weighted and summed to obtain the result of fusing the smaller-scale image into the larger-scale image.
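A minimal sketch of the reconstruction follows. The use of OpenCV's resize for the enlargement and the equal blend weights of 0.5 are assumptions; the patent specifies only enlargement of the smaller-scale image followed by a pixelwise weighted sum.

```python
import cv2

def reconstruct_and_fuse(scale_images, w_fine=0.5):
    """scale_images: processed layers ordered from the largest scale to the smallest."""
    fused = scale_images[-1]                    # start from the smallest scale
    for finer in reversed(scale_images[:-1]):   # fuse upward, small to large
        h, w = finer.shape[:2]
        enlarged = cv2.resize(fused, (w, h), interpolation=cv2.INTER_LINEAR)
        fused = cv2.addWeighted(finer, w_fine, enlarged, 1.0 - w_fine, 0)
    return fused
```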
Referring to fig. 6, an effect diagram of an ultrasound image processing method according to an embodiment of the present application is shown. In fig. 6, region a is an ultrasound image to be processed containing speckle noise, and region b is the ultrasound image processed by the method of the present application. It can be seen that the processed ultrasound image achieves an obvious noise reduction effect and edge enhancement effect. For example, in contrast region "1" of fig. 6, the speckle noise is obviously weakened after processing, which corresponds to the image filtering processing of the non-edge region pixels. In contrast region "2", the processed image enhances the edge and smooths the region beside it, so that the edge region in the ultrasound image is well preserved and enhanced and blurring of tissues or organs or discontinuity of edges is avoided, which corresponds to the directional smoothing and edge enhancement processing of the edge region; contrast region "3" can likewise be understood as showing the processing of the edge region. Furthermore, no obvious trace of the division between the edge region and the non-edge region appears at their junction.
It should be noted that the original ultrasound image is obtained in real time; that is, the ultrasound image formed from the return signal received by the probe after it transmits ultrasound waves is directly processed by the method of the present application to output the processed ultrasound image. The method is mainly used for ultrasonic analysis of tissues and organs in medicine.
For better understanding of the present application, referring to fig. 7, another schematic flow chart of an ultrasound image processing method provided in an embodiment of the present application is shown, including:
step 701: and carrying out multi-scale decomposition on the obtained original ultrasonic image to obtain at least two multi-scale images, and taking each scale image as an ultrasonic image to be processed.
If the ultrasound image to be processed is the original ultrasound image, the process is directly executed from step 702.
Step 702: and determining a first edge area according to the gradient value of each pixel point in the ultrasonic image to be processed and the preset occupation ratio of the corresponding edge area.
Step 703: and determining a second edge region according to the gray value of each pixel point in the ultrasonic image to be processed, and taking the collection of the first edge region and the second edge region as the edge region.
In this embodiment, the edge region may alternatively be determined from the first edge region alone or from the second edge region alone.
After the ultrasonic image to be processed is divided into the edge region and the non-edge region, steps 704a and 704b are performed for the non-edge region and the edge region, respectively.
Step 704a: for the pixel points in the non-edge region, image filtering processing is performed on each pixel point according to its discrete degree values in the preset directions.

Step 704b: for the pixel points in the edge region, the edge direction is determined according to the discrete degree values and gradient values of each pixel point in the preset directions, and directional smoothing processing and edge enhancement processing are performed according to the determined edge direction.

Step 704c: for the pixel points in the edge region for which no edge direction can be determined according to the discrete degree values and gradient values in the preset directions, no processing is performed.

Step 704d: for the pixel points located at the junction of the non-edge region and the edge region, the processing of the non-edge region and that of the edge region are both performed to obtain a non-edge pixel value and an edge pixel value, which are weighted and summed to obtain the processed pixel value of the pixel point.

Steps 704a, 704b, 704c and 704d thus determine a targeted processing mode for each type of pixel point in the ultrasonic image to be processed; a combined sketch of these steps is given below.
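Tying the steps together, the following sketch dispatches each interior pixel according to its region, reusing the helpers from the earlier sketches. The per-direction gradient (absolute difference of the two outer pixels on each preset line) is an assumption, and the junction fusion of step 704d is omitted for brevity; it would call process_junction_pixel on pixels whose neighborhood edge count falls in the specified range.

```python
def directional_gradients(img, r, c):
    """Per-direction gradient: |difference of the two outer pixels on each line|."""
    grads = {}
    for j, ((dr0, dc0), _, (dr2, dc2)) in DIRECTIONS.items():
        grads[j] = abs(float(img[r + dr2, c + dc2]) - float(img[r + dr0, c + dc0]))
    return grads

def process_image(img, gray_threshold=180):
    """Steps 702-704c for the interior pixels of a single-scale image."""
    out = img.astype(float).copy()
    edge_mask = divide_edge_region(img, gray_threshold=gray_threshold)
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            if not edge_mask[r, c]:
                out[r, c] = filter_non_edge_pixel(img, r, c)        # step 704a
                continue
            dirs = edge_directions(variance_to_mean_ratios(img, r, c),
                                   directional_gradients(img, r, c))
            if dirs:
                out[r, c] = process_edge_pixel(img, r, c, dirs[0])  # step 704b
            # else: step 704c, no edge direction determined, no processing
    return out
```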
Step 705: and determining an ultrasonic image processed for the ultrasonic image to be processed.
Step 706: and performing multi-scale reconstruction and fusion on the processed ultrasonic image, and outputting the fused ultrasonic image.
If the ultrasonic image to be processed is the original ultrasonic image itself, this step does not need to be executed. Otherwise, in this step, multi-scale reconstruction and fusion are performed on the processed images of the scales obtained by the multi-scale decomposition of the original ultrasonic image, so as to obtain the processed ultrasonic image corresponding to the original ultrasonic image.
Based on the same technical concept, fig. 8 exemplarily shows a communication terminal 800 provided in an embodiment of the present application, where the communication terminal 800 specifically includes:
an input-output unit 810 configured to receive an ultrasound image formed from the ultrasound return signal and output the ultrasound image for display;
a display panel 820 configured to display a display interface of an application program for displaying the ultrasound image;
a backlight assembly 830 configured to be positioned at a rear surface of the display panel, the backlight assembly including a plurality of backlight partitions, each of which may emit light of different brightness;
a processor 840, respectively connected to the input/output unit 810, the display panel 820, and the backlight assembly 830, and configured to:
dividing an ultrasonic image to be processed into a non-edge area and an edge area;
determining the discrete degree value of the pixel value in each preset direction in the preset neighborhood of the pixel point for each pixel point in the ultrasonic image to be processed;
if the pixel point is the pixel point in the non-edge region, carrying out image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction;
and if the pixel points are pixel points in the edge region, determining the edge direction according to the discrete degree value and the gradient value of the pixel points in each preset direction, and performing directional smoothing processing and edge enhancement processing according to the determined edge direction.
In some exemplary embodiments, the processor 840 is further configured to:
determining the number of the pixel points which are divided into the edge regions in the preset neighborhood of each pixel point;
if the number of the pixel points divided into the edge areas belongs to a specified range, determining the pixel points to be positioned at the junction of the non-edge area and the edge area; wherein the lower limit of the specified range is a first number threshold, and the upper limit of the specified range is a second number threshold;
for each pixel point located at the junction, carrying out image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction to obtain a non-edge pixel value of the pixel point; and,
determining an edge direction according to the discrete degree value and the gradient value of the pixel point in each preset direction, and performing directional smoothing processing and edge enhancement processing according to the determined edge direction to obtain an edge pixel value of the pixel point;
and carrying out weighted summation on the non-edge pixel value and the edge pixel value according to a set weight coefficient to obtain a pixel value processed by the pixel point.
In some exemplary embodiments, the processor 840 is configured to, when dividing the ultrasound image to be processed into the non-edge region and the edge region, perform:
determining gradient values of all pixel points in the ultrasonic image to be processed, and acquiring a preset ratio of an edge area corresponding to the ultrasonic image to be processed; dividing the pixel points of the gradient values in the preset ratio range into a first edge area of the ultrasonic image to be processed; determining the first edge area as an edge area in the ultrasonic image to be processed; the gradient values of the pixels in the preset ratio range are all higher than the gradient values of the pixels outside the preset ratio range; alternatively,
dividing pixel points of which the gray values are greater than a specified gray threshold value in the ultrasonic image to be processed into a second edge area of the ultrasonic image to be processed; determining the second edge area as an edge area in the ultrasonic image to be processed; alternatively,
and taking the pixel point collection in the first edge region and the second edge region as the edge region in the ultrasonic image to be processed.
In some exemplary embodiments, the processor 840 is configured to, when performing image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction, perform:
configuring corresponding weight values for each preset direction according to the discrete degree values of the pixel points in each preset direction; the sum of the weighted values configured in each preset direction is a designated value, and the discrete degree value of each preset direction and the corresponding weighted value form a negative correlation relationship;
and taking the weighted summation result of the pixel values in each preset direction as the pixel value of the pixel point after image filtering processing.
In some exemplary embodiments, the processor 840 is configured to, when determining the edge direction according to the discrete degree value and the gradient value of the pixel point in each preset direction, perform:
determining a first direction corresponding to the maximum discrete degree value in the discrete degree values in all preset directions, and determining a second direction corresponding to the maximum gradient value in the gradient values in all directions;
when the first direction and the second direction are the same direction, determining that the direction perpendicular to the first direction is an edge direction; the edge direction represents a preset direction with the smallest included angle with the edge extension direction in all preset directions in a preset neighborhood of the pixel point;
and when the first direction and the second direction are different directions and are not perpendicular to each other, determining that the directions respectively perpendicular to the first direction and the second direction are edge directions.
In some exemplary embodiments, the discrete magnitude value is a variance-to-mean ratio; the variance-mean ratio is the ratio of the pixel value variance and the pixel value mean of the pixel point in each preset direction.
In some exemplary embodiments, the processor 840 is configured to perform, when performing the directional smoothing processing according to the determined edge direction:
performing image smoothing on the pixel points according to pixel points in the edge direction in the preset neighborhood, and determining a first pixel value after the image smoothing;
the processor 840 is configured to, when performing edge enhancement processing according to the determined edge direction, perform:
determining a one-dimensional second order differential value of the pixel point in the edge direction; acquiring a variance value and a preset multiplier factor of pixel values of the pixel points in the edge direction in the preset neighborhood;
determining a second pixel value after edge enhancement processing according to the one-dimensional second-order differential value, the variance value and the preset multiplier factor; the one-dimensional second-order differential value, the variance value and the preset multiplier factor respectively form positive correlation with the second pixel value;
and determining the sum of the first pixel value and the second pixel value as the pixel value of the pixel point after directional smoothing processing and edge enhancement processing.
In some exemplary embodiments, the second pixel value is determined according to the following formula:
R_out2=laplace*σ*α
wherein R_out2 represents the second pixel value, σ represents the variance value, and α represents the multiplier factor.
In some exemplary embodiments, the processor 840 is configured to, when acquiring the image to be processed before dividing the ultrasound image to be processed into the non-edge region and the edge region, perform:
acquiring an original ultrasonic image, and performing multi-scale decomposition on the original ultrasonic image to obtain images of at least two scales, wherein the image of each scale is respectively used as the ultrasonic image to be processed;
the processor 840 is further configured to perform:
after the ultrasonic images to be processed are processed, performing multi-scale reconstruction and fusion on the processed ultrasonic images;
and outputting the fused ultrasonic image.
In some exemplary embodiments, the multi-scale decomposition is a pyramidal decomposition;
the processor 840 is configured to perform, when performing multi-scale reconstruction and fusion on each processed ultrasound image:
fusing the images of all scales into a layer of image with the largest scale according to the order of the scales from small to large; wherein,
when the ultrasonic images of two adjacent scales are fused, the image of the next scale is enlarged to the image of the previous scale; and then carrying out weighted summation on pixel values of pixel points at the same positions in the ultrasonic images of the two adjacent scales to obtain a fusion result of fusing the image of the next scale into the image of the previous scale.
The details of the above-mentioned communication terminal and its function implementation can be referred to the above related description in conjunction with fig. 1-7, and are not repeated herein.
In some possible implementations, various aspects of the methods provided by the embodiments of the present application may also be implemented in the form of a program product including program code for causing a computer device to perform the steps of the methods for data processing according to various exemplary implementations of the present application described in the present specification when the program code runs on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A program product for executing data processing according to an embodiment of the present application may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a server apparatus. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium other than a readable storage medium that can transmit, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, as well as conventional procedural programming languages, such as the "C" language or similar programming languages.
An embodiment of the present application further provides a computing-device-readable storage medium for the data processing method, that is, a storage medium whose content is not lost after power failure. The storage medium stores a software program comprising program code which, when read and executed by one or more processors of a computing device, implements any of the data processing aspects of the embodiments of the present application described above.
The present application is described above with reference to block diagrams and/or flowchart illustrations of methods, apparatus (systems) and/or computer program products according to embodiments of the application. It will be understood that one block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, and/or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the subject application may also be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.). Furthermore, the present application may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. In the context of this application, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to include such modifications and variations.

Claims (20)

1. A method of ultrasound image processing, the method comprising:
dividing an ultrasonic image to be processed into a non-edge area and an edge area;
determining the discrete degree value of the pixel value in each preset direction in the preset neighborhood of the pixel point for each pixel point in the ultrasonic image to be processed;
if the pixel point is the pixel point in the non-edge region, carrying out image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction;
and if the pixel points are pixel points in the edge region, determining the edge direction according to the discrete degree value and the gradient value of the pixel points in each preset direction, and performing directional smoothing processing and edge enhancement processing according to the determined edge direction.
2. The method of claim 1, further comprising:
determining the number of the pixel points which are divided into the edge regions in the preset neighborhood of each pixel point;
if the number of the pixel points divided into the edge areas belongs to a specified range, determining the pixel points to be positioned at the junction of the non-edge area and the edge area; wherein the lower limit of the specified range is a first number threshold, and the upper limit of the specified range is a second number threshold;
for each pixel point located at the joint, carrying out image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction to obtain a non-edge pixel value of the pixel point; and the number of the first and second electrodes,
determining an edge direction according to the discrete degree value and the gradient value of the pixel point in each preset direction, and performing directional smoothing processing and edge enhancement processing according to the determined edge direction to obtain an edge pixel value of the pixel point;
and carrying out weighted summation on the non-edge pixel value and the edge pixel value according to a set weight coefficient to obtain a pixel value processed by the pixel point.
3. The method of claim 1, wherein the dividing the ultrasound image to be processed into a non-border region and a border region comprises:
determining gradient values of all pixel points in the ultrasonic image to be processed, and acquiring a preset ratio of an edge area corresponding to the ultrasonic image to be processed; dividing the pixel points of the gradient values in the preset ratio range into a first edge area of the ultrasonic image to be processed; determining the first edge area as an edge area in the ultrasonic image to be processed; the gradient values of the pixels in the preset ratio range are all higher than the gradient values of the pixels outside the preset ratio range; alternatively,
dividing pixel points of which the gray values are greater than a specified gray threshold value in the ultrasonic image to be processed into a second edge area of the ultrasonic image to be processed; determining the second edge area as an edge area in the ultrasonic image to be processed; alternatively,
and taking the pixel point collection in the first edge region and the second edge region as the edge region in the ultrasonic image to be processed.
4. The method according to claim 1 or 2, wherein the performing image filtering processing on the pixel points according to the discrete degree values of the pixel points in each preset direction includes:
configuring corresponding weight values for each preset direction according to the discrete degree values of the pixel points in each preset direction; the sum of the weighted values configured in each preset direction is a designated value, and the discrete degree value of each preset direction and the corresponding weighted value form a negative correlation relationship;
and taking the weighted summation result of the pixel values in each preset direction as the pixel value of the pixel point after image filtering processing.
5. The method according to claim 1 or 2, wherein the determining the edge direction according to the discrete degree value and the gradient value of the pixel point in each preset direction comprises:
determining a first direction corresponding to the maximum discrete degree value in the discrete degree values in all preset directions, and determining a second direction corresponding to the maximum gradient value in the gradient values in all directions;
when the first direction and the second direction are the same direction, determining the first direction as an edge direction; wherein the edge direction indicates that the pixel point has a maximum discrete degree value and/or a maximum gradient value in the direction;
and when the first direction and the second direction are different directions and are not perpendicular to each other, determining that the first direction and the second direction are both edge directions.
6. The method of claim 1, wherein the discrete magnitude value is a variance-to-mean ratio; the variance-mean ratio is the ratio of the pixel value variance and the pixel value mean of the pixel point in each preset direction.
7. The method according to claim 1 or 2, wherein performing directional smoothing according to the determined edge direction comprises:
performing image smoothing on the pixel point according to pixel points in the preset neighborhood which are perpendicular to the edge direction, and determining a first pixel value after the image smoothing;
performing edge enhancement processing according to the determined edge direction, including:
determining a one-dimensional second order differential value of the pixel point in the edge direction; acquiring a variance value and a preset multiplier factor of pixel values of the pixel points in the edge direction in the preset neighborhood;
determining a second pixel value after edge enhancement processing according to the one-dimensional second-order differential value, the variance value and the preset multiplier factor; the one-dimensional second-order differential value, the variance value and the preset multiplier factor respectively form positive correlation with the second pixel value;
and determining the sum of the first pixel value and the second pixel value as the pixel value of the pixel point after directional smoothing processing and edge enhancement processing.
8. The method of claim 7, wherein the second pixel value is determined according to the following formula:
R_out2=laplace*σ*α
wherein R_out2 represents the second pixel value, σ represents the variance value, and α represents the multiplier factor.
9. The method according to claim 1, wherein before the dividing the ultrasound image to be processed into the non-edge region and the edge region, acquiring the ultrasound image to be processed comprises:
acquiring an original ultrasonic image, and performing multi-scale decomposition on the original ultrasonic image to obtain images of at least two scales, wherein the image of each scale is respectively used as the ultrasonic image to be processed;
the method further comprises the following steps:
after the ultrasonic images to be processed are processed, performing multi-scale reconstruction and fusion on the processed ultrasonic images;
and outputting the fused ultrasonic image.
10. The method of claim 9, wherein the multi-scale decomposition is a pyramidal decomposition;
the multi-scale reconstruction and fusion of the processed ultrasonic images comprises the following steps:
fusing the images of all scales into a layer of image with the largest scale according to the order of the scales from small to large; wherein,
when the ultrasonic images of two adjacent scales are fused, the image of the next scale is enlarged to the image of the previous scale; and then carrying out weighted summation on pixel values of pixel points at the same positions in the ultrasonic images of the two adjacent scales to obtain a fusion result of fusing the image of the next scale into the image of the previous scale.
11. A communication terminal, comprising:
an input-output unit configured to receive an ultrasound image formed from an ultrasound return signal and output the ultrasound image for display;
a display panel configured to display a display interface of an application program for displaying the ultrasound image;
a backlight assembly configured to be positioned at a rear surface of the display panel, the backlight assembly including a plurality of backlight partitions, each of which may emit light of different brightness;
a processor respectively connected with the input and output unit, the display panel and the backlight assembly, and configured to:
dividing an ultrasonic image to be processed into a non-edge area and an edge area;
determining the discrete degree value of the pixel value in each preset direction in the preset neighborhood of the pixel point for each pixel point in the ultrasonic image to be processed;
if the pixel point is the pixel point in the non-edge region, carrying out image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction;
and if the pixel points are pixel points in the edge region, determining the edge direction according to the discrete degree value and the gradient value of the pixel points in each preset direction, and performing directional smoothing processing and edge enhancement processing according to the determined edge direction.
12. The terminal of claim 11, wherein the processor is further configured to:
determining the number of the pixel points which are divided into the edge regions in the preset neighborhood of each pixel point;
if the number of the pixel points divided into the edge areas belongs to a specified range, determining the pixel points to be positioned at the junction of the non-edge area and the edge area; wherein the lower limit of the specified range is a first number threshold, and the upper limit of the specified range is a second number threshold;
for each pixel point located at the junction, carrying out image filtering processing on the pixel point according to the discrete degree value of the pixel point in each preset direction to obtain a non-edge pixel value of the pixel point; and,
determining an edge direction according to the discrete degree value and the gradient value of the pixel point in each preset direction, and performing directional smoothing processing and edge enhancement processing according to the determined edge direction to obtain an edge pixel value of the pixel point;
and carrying out weighted summation on the non-edge pixel value and the edge pixel value according to a set weight coefficient to obtain a pixel value processed by the pixel point.
13. The terminal according to claim 11, wherein the processor is configured to perform, when dividing the ultrasound image to be processed into a non-edge region and an edge region, performing:
determining gradient values of all pixel points in the ultrasonic image to be processed, and acquiring a preset ratio of an edge area corresponding to the ultrasonic image to be processed; dividing the pixel points of the gradient values in the preset ratio range into a first edge area of the ultrasonic image to be processed; determining the first edge area as an edge area in the ultrasonic image to be processed; the gradient values of the pixels in the preset ratio range are all higher than the gradient values of the pixels outside the preset ratio range; alternatively,
dividing pixel points of which the gray values are greater than a specified gray threshold value in the ultrasonic image to be processed into a second edge area of the ultrasonic image to be processed; determining the second edge area as an edge area in the ultrasonic image to be processed; alternatively,
and taking the pixel point collection in the first edge region and the second edge region as the edge region in the ultrasonic image to be processed.
14. The terminal according to claim 11 or 12, wherein the processor is configured to, when performing image filtering processing on the pixel points according to the discrete degree values of the pixel points in each preset direction, perform:
configuring corresponding weight values for each preset direction according to the discrete degree values of the pixel points in each preset direction; the sum of the weighted values configured in each preset direction is a designated value, and the discrete degree value of each preset direction and the corresponding weighted value form a negative correlation relationship;
and taking the weighted summation result of the pixel values in each preset direction as the pixel value of the pixel point after image filtering processing.
15. The terminal according to claim 11 or 12, wherein the processor is configured to, when determining the edge direction according to the discrete degree value and the gradient value of the pixel point in each preset direction, perform:
determining a first direction corresponding to the maximum discrete degree value in the discrete degree values in all preset directions, and determining a second direction corresponding to the maximum gradient value in the gradient values in all directions;
when the first direction and the second direction are the same direction, determining the first direction as an edge direction; the edge direction represents that the pixel point has the maximum discrete degree value and/or the maximum gradient value in the direction;
and when the first direction and the second direction are different directions and are not perpendicular to each other, determining that the directions respectively perpendicular to the first direction and the second direction are edge directions.
16. The terminal of claim 11, wherein the discrete magnitude value is a variance-to-mean ratio; the variance-mean ratio is the ratio of the pixel value variance and the pixel value mean of the pixel point in each preset direction.
17. The terminal according to claim 11 or 12, wherein the processor is configured to perform, when performing directional smoothing processing according to the determined edge direction:
performing image smoothing on the pixel point according to pixel points in the preset neighborhood which are perpendicular to the edge direction, and determining a first pixel value after the image smoothing;
the processor is configured to perform edge enhancement processing according to the determined edge direction, and execute:
determining a one-dimensional second order differential value of the pixel point in the edge direction; acquiring a variance value and a preset multiplier factor of pixel values of the pixel points in the edge direction in the preset neighborhood;
determining a second pixel value after edge enhancement processing according to the one-dimensional second-order differential value, the variance value and the preset multiplier factor; the one-dimensional second-order differential value, the variance value and the preset multiplier factor respectively form positive correlation with the second pixel value;
and determining the sum of the first pixel value and the second pixel value as the pixel value of the pixel point after directional smoothing processing and edge enhancement processing.
18. A terminal according to claim 17, wherein the second pixel value is determined according to the formula:
R_out2=laplace*σ*α
wherein R_out2 represents the second pixel value, σ represents the variance value, and α represents the multiplier factor.
19. The terminal according to claim 11, wherein the processor is configured to execute, when acquiring the image to be processed before dividing the ultrasound image to be processed into the non-edge region and the edge region:
acquiring an original ultrasonic image, and performing multi-scale decomposition on the original ultrasonic image to obtain images of at least two scales, wherein the image of each scale is respectively used as the ultrasonic image to be processed;
the processor is further configured to perform:
after the ultrasonic images to be processed are processed, performing multi-scale reconstruction and fusion on the processed ultrasonic images;
and outputting the fused ultrasonic image.
20. The terminal of claim 19, wherein the multi-scale decomposition is a pyramidal decomposition;
the processor is configured to perform, when performing multi-scale reconstruction and fusion on each processed ultrasound image:
fusing the images of all scales into a layer of image with the largest scale according to the order of the scales from small to large; wherein,
when the ultrasonic images of two adjacent scales are fused, the image of the next scale is enlarged to the image of the previous scale; and then carrying out weighted summation on pixel values of pixel points at the same positions in the ultrasonic images of the two adjacent scales to obtain a fusion result of fusing the image of the next scale into the image of the previous scale.
CN202010177729.1A 2020-03-13 2020-03-13 Ultrasonic image processing method and communication terminal Active CN111415308B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010177729.1A CN111415308B (en) 2020-03-13 2020-03-13 Ultrasonic image processing method and communication terminal

Publications (2)

Publication Number Publication Date
CN111415308A true CN111415308A (en) 2020-07-14
CN111415308B CN111415308B (en) 2023-04-28

Family

ID=71494499

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010177729.1A Active CN111415308B (en) 2020-03-13 2020-03-13 Ultrasonic image processing method and communication terminal

Country Status (1)

Country Link
CN (1) CN111415308B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306377A (en) * 2011-09-21 2012-01-04 深圳市理邦精密仪器股份有限公司 Method and device for reducing noise in ultrasound image
CN104732493A (en) * 2015-03-18 2015-06-24 西安电子科技大学 SAR image de-noising algorithm based on Primal Sketch classification and SVD domain improvement MMSE estimation
CN109242782A (en) * 2017-07-11 2019-01-18 深圳市道通智能航空技术有限公司 Noise processing method and processing device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
付传卿 等: "图像梯度引导的窗口形状自适应双边滤波" *
王海菊 等: "自适应高斯滤波图像去噪算法" *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113012185A (en) * 2021-03-26 2021-06-22 影石创新科技股份有限公司 Image processing method, image processing device, computer equipment and storage medium
CN113012185B (en) * 2021-03-26 2023-08-29 影石创新科技股份有限公司 Image processing method, device, computer equipment and storage medium
CN113892967A (en) * 2021-09-07 2022-01-07 青岛海信医疗设备股份有限公司 Ultrasonic image processing method and ultrasonic instrument
CN113892967B (en) * 2021-09-07 2023-10-27 青岛海信医疗设备股份有限公司 Ultrasonic image processing method and ultrasonic instrument

Also Published As

Publication number Publication date
CN111415308B (en) 2023-04-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant