CN116616817B - Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium - Google Patents


Info

Publication number: CN116616817B
Application number: CN202310900911.9A
Authority: CN (China)
Prior art keywords: ultrasonic image, current, ultrasonic, image, corrected
Legal status: Active (an assumption by Google, not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN116616817A
Inventor: 黄灿 (Huang Can)
Current Assignee: Shenzhen Wisonic Medical Technology Co., Ltd.
Original Assignee: Shenzhen Wisonic Medical Technology Co., Ltd.

Events: application filed by Shenzhen Wisonic Medical Technology Co., Ltd.; priority to CN202310900911.9A; publication of CN116616817A; application granted; publication of CN116616817B; legal status Active.

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/02: Measuring pulse or heart rate
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5223: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5269: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
    • A61B 8/5276: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
    • G06T 5/70
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10132: Ultrasound image

Abstract

The invention discloses an ultrasonic heart rate detection method and device, an ultrasonic apparatus, and a storage medium. The method comprises the following steps: acquiring Mi first feature points corresponding to each of N frames of ultrasonic images; performing global motion estimation for each current ultrasonic image relative to a reference ultrasonic image based on the Mi first feature points corresponding to the N frames, and determining the global motion information of each current ultrasonic image; performing image reconstruction based on the global motion information of each current ultrasonic image to obtain a corrected ultrasonic image corresponding to each current ultrasonic image; performing differential processing on the N-1 corrected ultrasonic images against the reference ultrasonic image to obtain the differential averages corresponding to the N-1 corrected ultrasonic images; and performing heart rate conversion on the differential averages corresponding to the N-1 corrected ultrasonic images to obtain a heart rate chart corresponding to the N frames of ultrasonic images. The method achieves a motion-filtering effect, effectively suppresses interference from unstable probe motion, and provides faster, more accurate heart rate chart detection with convenient operation.

Description

Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium
Technical Field
The present invention relates to the field of ultrasonic detection technology, and in particular to an ultrasonic heart rate detection method and device, an ultrasonic apparatus, and a storage medium.
Background
Clinically, a patient's heart rate value effectively helps a doctor evaluate the patient's condition, so obtaining the heart rate value quickly and accurately has long been an important research direction. At present, when an accurate heart rate value is needed in the clinical use of ultrasonic equipment, an external ECG monitoring device is generally connected to acquire multi-lead electrocardiograph data, which is displayed synchronously with the ultrasonic B-mode image so that the doctor can observe the tissue state and read the heart rate value at the same time. However, this approach requires external equipment, which increases the complexity of the devices and cabling and is inconvenient to carry and use. Ultrasonic manufacturers are therefore researching how to detect the heart rate value directly with the ultrasonic equipment itself, without external devices, to ease the work of clinicians.
Currently, there are two ways to obtain the heart rate directly by ultrasonic means. The first uses the Doppler mode of the ultrasonic device to calculate the heart rate value physically from the Doppler effect. This has drawbacks similar to the external-device approach: the doctor must open the Doppler mode, select a sampling gate, and trigger the calculation, and in this mode the ultrasonic image frame rate drops considerably, which hampers the doctor's observation of tissue motion at a high frame rate. The second calculates the heart rate value directly from the conventional B mode; the accuracy of this approach is questionable, the existing technical schemes are rough, and the practical results are not ideal. It faces two main limitations. One is the frame-rate requirement: the processing algorithm must be fast enough not to affect high-frame-rate display, that is, its complexity cannot be too high. The other is effective anti-interference: interference here refers to image instability, mainly because the doctor continuously adjusts the probe position while scanning tissue to find the optimal scanning position, making the B-mode image unstable during that process; achieving effective anti-interference normally requires a more accurate and complex algorithm, which raises algorithmic complexity.
Disclosure of Invention
The embodiments of the present invention provide an ultrasonic heart rate detection method and device, an ultrasonic apparatus, and a storage medium, to solve the problems of inconvenient operation and poor accuracy in the existing ultrasonic detection process.
An ultrasonic heart rate detection method comprising:
acquiring Mi first feature points corresponding to each of N frames of ultrasonic images, wherein the N frames comprise 1 reference ultrasonic image and N-1 current ultrasonic images, and Mi is the number of first feature points in the i-th ultrasonic image;
performing global motion estimation for each current ultrasonic image relative to the reference ultrasonic image based on the Mi first feature points corresponding to the N frames of ultrasonic images, and determining the global motion information of each current ultrasonic image relative to the reference ultrasonic image;
performing image reconstruction on each current ultrasonic image based on its global motion information relative to the reference ultrasonic image, to obtain the corrected ultrasonic image corresponding to each current ultrasonic image;
performing differential processing on the N-1 corrected ultrasonic images against the reference ultrasonic image, to obtain the differential averages corresponding to the N-1 corrected ultrasonic images;
and performing heart rate conversion on the differential averages corresponding to the N-1 corrected ultrasonic images, to obtain a heart rate chart corresponding to the N frames of ultrasonic images.
Preferably, performing global motion estimation for each current ultrasonic image relative to the reference ultrasonic image based on the Mi first feature points corresponding to the N frames of ultrasonic images, and determining the global motion information of each current ultrasonic image relative to the reference ultrasonic image, includes:
determining P second feature points of the N frames of ultrasonic images based on the Mi first feature points corresponding to the N frames of ultrasonic images;
performing affine transformation processing on each current ultrasonic image relative to the reference ultrasonic image based on the P second feature points, to obtain an affine transformation matrix corresponding to each current ultrasonic image;
performing feature screening on the P second feature points based on the affine transformation matrix corresponding to each current ultrasonic image, and determining Bi background feature points corresponding to each current ultrasonic image;
and determining the global motion information of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image.
Preferably, performing feature screening on the P second feature points based on the affine transformation matrix corresponding to each current ultrasonic image, and determining the Bi background feature points corresponding to each current ultrasonic image, includes:
determining the feature distribution probability corresponding to each second feature point based on the affine transformation matrix corresponding to each current ultrasonic image, the coordinates of the P second feature points in each current ultrasonic image, and their coordinates in the reference ultrasonic image;
determining a target distribution probability based on the feature distribution probabilities corresponding to the P second feature points;
and if a feature distribution probability is greater than or equal to the target distribution probability, determining the second feature point corresponding to that probability as one of the Bi background feature points corresponding to each current ultrasonic image.
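As an illustration, the screening step above can be sketched as follows. The Gaussian residual model and the use of the mean probability as the target distribution probability are assumptions of this sketch; the patent does not fix the exact probability formula.

```python
import numpy as np

def screen_background_points(A, pts_ref, pts_cur):
    """Keep the feature points whose motion fits the global affine model.

    A       : 2x3 affine matrix mapping reference coords -> current coords
    pts_ref : (P, 2) second-feature-point coordinates in the reference image
    pts_cur : (P, 2) matching coordinates in the current image
    Returns the (reference, current) coordinates of the Bi background points.
    """
    ones = np.ones((pts_ref.shape[0], 1))
    pred = np.hstack([pts_ref, ones]) @ A.T          # affine prediction of each point
    resid = np.linalg.norm(pred - pts_cur, axis=1)   # per-point model residual
    sigma = resid.std() + 1e-6
    prob = np.exp(-0.5 * (resid / sigma) ** 2)       # "feature distribution probability"
    target = prob.mean()                             # "target distribution probability"
    keep = prob >= target                            # background = model-consistent points
    return pts_ref[keep], pts_cur[keep]
```

Points belonging to moving tissue (the heart) deviate from the global affine model, receive a low probability, and are screened out, leaving only background points for motion estimation.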
Preferably, determining the global motion information of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image includes:
determining Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image;
performing interpolation over all pixel points in the reference ultrasonic image based on the Bi feature point displacements, and determining the pixel displacement corresponding to every pixel point;
and determining the pixel displacements corresponding to all pixel points as the global motion information of each current ultrasonic image relative to the reference ultrasonic image.
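The interpolation from Bi sparse feature-point displacements to a dense per-pixel field could look like the sketch below. Inverse-distance weighting is one simple interpolation choice; the patent only requires that some interpolation be performed over all pixel points.

```python
import numpy as np

def dense_motion_field(bg_pts, bg_disp, height, width):
    """Interpolate sparse background-point displacements to every pixel.

    bg_pts  : (B, 2) background feature coordinates (x, y) in the reference image
    bg_disp : (B, 2) displacement of each feature point (current - reference)
    Returns an (H, W, 2) per-pixel displacement field, i.e. the global
    motion information of one current frame.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)  # (H*W, 2)
    d2 = ((grid[:, None, :] - bg_pts[None, :, :]) ** 2).sum(-1)      # squared distances
    w = 1.0 / (d2 + 1e-6)                                            # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)
    field = w @ bg_disp                                              # weighted displacement average
    return field.reshape(height, width, 2)
```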
Preferably, performing image reconstruction on each current ultrasonic image based on its global motion information relative to the reference ultrasonic image, to obtain the corrected ultrasonic image corresponding to each current ultrasonic image, includes:
correcting all current pixel points in each current ultrasonic image based on the global motion information of each current ultrasonic image relative to the reference ultrasonic image, and determining the corrected pixel points of all current pixel points in the reference ultrasonic image;
performing interpolation on the reference gray values of the reference ultrasonic image based on the corrected pixel points of all current pixel points, to obtain the corrected gray values corresponding to all corrected pixel points;
and obtaining the corrected ultrasonic image corresponding to each current ultrasonic image from the corrected gray values corresponding to all corrected pixel points.
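Backward warping with bilinear interpolation is a standard way to realize this "correct the pixel positions, then interpolate gray values" step; the exact sampling scheme below is an assumption of this sketch, not the patent's prescribed formula.

```python
import numpy as np

def reconstruct_image(current, motion):
    """Warp a current frame back onto the reference grid (bilinear resample).

    current : (H, W) gray image
    motion  : (H, W, 2) per-pixel (dx, dy) displacement relative to the reference
    Returns the corrected image aligned with the reference image.
    """
    H, W = current.shape
    ys, xs = np.mgrid[0:H, 0:W]
    sx = np.clip(xs + motion[..., 0], 0, W - 1)      # source x in the current frame
    sy = np.clip(ys + motion[..., 1], 0, H - 1)      # source y in the current frame
    x0 = np.floor(sx).astype(int); y0 = np.floor(sy).astype(int)
    x1 = np.minimum(x0 + 1, W - 1); y1 = np.minimum(y0 + 1, H - 1)
    fx, fy = sx - x0, sy - y0
    # bilinear blend of the four neighbouring gray values
    top = current[y0, x0] * (1 - fx) + current[y0, x1] * fx
    bot = current[y1, x0] * (1 - fx) + current[y1, x1] * fx
    return top * (1 - fy) + bot * fy
```

After this correction the background of the corrected image overlays the reference image, so the remaining frame-to-frame differences are dominated by cardiac motion.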
Preferably, performing differential processing on the N-1 corrected ultrasonic images against the reference ultrasonic image, to obtain the differential averages corresponding to the N-1 corrected ultrasonic images, includes:
performing differential calculation between the pixel gray values of all pixel points in each corrected ultrasonic image and the gray values of the same pixel points in the reference ultrasonic image, and determining the gray difference values corresponding to all pixel points in each corrected ultrasonic image;
determining the d pixel points with the largest absolute gray difference values as the d target pixel points corresponding to each corrected ultrasonic image;
and averaging the absolute gray difference values of the d target pixel points, to obtain the differential average corresponding to each corrected ultrasonic image.
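The three sub-steps above reduce directly to a few lines; only the value of d is a free parameter.

```python
import numpy as np

def diff_average(corrected, reference, d):
    """Mean of the d largest absolute gray differences vs. the reference frame."""
    diff = np.abs(corrected.astype(float) - reference.astype(float)).ravel()
    top_d = np.sort(diff)[-d:]   # the d pixel points that changed the most
    return top_d.mean()
```

Restricting the average to the d most-changed pixels concentrates the measurement on the moving cardiac region rather than the (motion-corrected, nearly static) background.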
Preferably, performing heart rate conversion on the differential averages corresponding to the N-1 corrected ultrasonic images, to obtain the heart rate chart corresponding to the N frames of ultrasonic images, includes:
constructing a heart motion curve based on the differential averages corresponding to the N-1 corrected ultrasonic images;
sliding a local sliding window sequentially along the heart motion curve, performing local extremum calculation on the curve, and determining the number of extrema corresponding to each local sliding window;
determining the measured heart rate value corresponding to each local sliding window based on its number of extrema;
and obtaining the heart rate chart corresponding to the N frames of ultrasonic images based on the measured heart rate values corresponding to all local sliding windows.
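A minimal sketch of this conversion: each local maximum of the motion curve is taken as one beat, and the peak count inside a window divided by the window duration gives beats per minute. The window length and the simple three-point peak test are illustrative choices; the patent does not fix them.

```python
import numpy as np

def heart_rate_curve(diff_means, frame_rate, win_len):
    """Convert the N-1 differential averages into per-window heart-rate values.

    diff_means : the heart motion curve (one differential average per frame)
    frame_rate : ultrasonic frame rate in frames per second
    win_len    : local sliding-window length in samples
    """
    curve = np.asarray(diff_means, dtype=float)
    rates = []
    for start in range(0, len(curve) - win_len + 1):
        seg = curve[start:start + win_len]
        # count strict local maxima (three-point test) inside the window
        peaks = np.sum((seg[1:-1] > seg[:-2]) & (seg[1:-1] > seg[2:]))
        seconds = win_len / frame_rate
        rates.append(60.0 * peaks / seconds)   # beats per minute
    return rates
```

Plotting `rates` over the window positions yields the heart rate chart described above.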
An ultrasonic heart rate detection device comprising:
a first feature point acquisition module, used for acquiring Mi first feature points corresponding to each of N frames of ultrasonic images, wherein the N frames comprise 1 reference ultrasonic image and N-1 current ultrasonic images, and Mi is the number of first feature points in the i-th ultrasonic image;
a global motion information acquisition module, used for performing global motion estimation for each current ultrasonic image relative to the reference ultrasonic image based on the Mi first feature points corresponding to the N frames of ultrasonic images, and determining the global motion information of each current ultrasonic image relative to the reference ultrasonic image;
a corrected ultrasonic image acquisition module, used for performing image reconstruction on each current ultrasonic image based on its global motion information relative to the reference ultrasonic image, and obtaining the corrected ultrasonic image corresponding to each current ultrasonic image;
a differential average acquisition module, used for performing differential processing on the N-1 corrected ultrasonic images against the reference ultrasonic image, and obtaining the differential averages corresponding to the N-1 corrected ultrasonic images;
and a heart rate map acquisition module, used for performing heart rate conversion on the differential averages corresponding to the N-1 corrected ultrasonic images, and obtaining the heart rate map corresponding to the N frames of ultrasonic images.
Preferably, the global motion information acquisition module includes:
the second feature point determining unit is used for determining P second feature points of the N frames of ultrasonic images based on Mi first feature points corresponding to the N frames of ultrasonic images;
an affine transformation matrix obtaining unit, configured to perform affine transformation processing on each current ultrasonic image with respect to the reference ultrasonic image based on the P second feature points, to obtain an affine transformation matrix corresponding to each current ultrasonic image;
the background feature point determining unit is used for carrying out feature screening on the P second feature points based on an affine transformation matrix corresponding to each current ultrasonic image, and determining Bi background feature points corresponding to each current ultrasonic image;
and the global motion information determining unit is used for determining global motion information of each current ultrasonic image relative to the reference ultrasonic image based on Bi background characteristic points corresponding to each current ultrasonic image.
Preferably, the background feature point determining unit includes:
a feature distribution probability determining subunit, used for determining the feature distribution probability corresponding to each second feature point based on the affine transformation matrix corresponding to each current ultrasonic image, the coordinates of the P second feature points in each current ultrasonic image, and their coordinates in the reference ultrasonic image;
a target distribution probability determining subunit, used for determining the target distribution probability based on the feature distribution probabilities corresponding to the P second feature points;
and a background feature point determining subunit, used for determining, if a feature distribution probability is greater than or equal to the target distribution probability, the corresponding second feature point as one of the Bi background feature points of each current ultrasonic image.
Preferably, the global motion information determining unit includes:
a feature point displacement determining subunit, used for determining Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image;
a pixel displacement determining subunit, used for performing interpolation over all pixel points in the reference ultrasonic image based on the Bi feature point displacements, and determining the pixel displacement corresponding to every pixel point;
and a global motion information determining subunit, used for determining the pixel displacements corresponding to all pixel points as the global motion information of each current ultrasonic image relative to the reference ultrasonic image.
Preferably, the corrected ultrasonic image acquisition module includes:
a corrected pixel point determining unit, used for correcting all current pixel points in each current ultrasonic image based on the global motion information of each current ultrasonic image relative to the reference ultrasonic image, and determining the corrected pixel points of all current pixel points in the reference ultrasonic image;
a corrected gray value acquisition unit, used for performing interpolation on the reference gray values of the reference ultrasonic image based on the corrected pixel points of all current pixel points, to obtain the corrected gray values corresponding to all corrected pixel points;
and a corrected ultrasonic image acquisition unit, used for obtaining the corrected ultrasonic image corresponding to each current ultrasonic image from the corrected gray values corresponding to all corrected pixel points.
Preferably, the differential average acquisition module includes:
a gray difference determining unit, used for performing differential calculation between the pixel gray values of all pixel points in each corrected ultrasonic image and the gray values of the same pixel points in the reference ultrasonic image, and determining the gray difference values corresponding to all pixel points in each corrected ultrasonic image;
a target pixel point determining unit, used for determining the d pixel points with the largest absolute gray difference values as the d target pixel points corresponding to each corrected ultrasonic image;
and a differential average acquisition unit, used for averaging the absolute gray difference values of the d target pixel points, to obtain the differential average corresponding to each corrected ultrasonic image.
Preferably, the heart rate map acquisition module includes:
a heart motion curve construction unit, used for constructing a heart motion curve based on the differential averages corresponding to the N-1 corrected ultrasonic images;
an extremum number determining unit, used for sliding a local sliding window sequentially along the heart motion curve, performing local extremum calculation on the curve, and determining the number of extrema corresponding to each local sliding window;
a measured heart rate value determining unit, used for determining the measured heart rate value corresponding to each local sliding window based on its number of extrema;
and a heart rate map acquisition unit, used for obtaining the heart rate map corresponding to the N frames of ultrasonic images based on the measured heart rate values corresponding to all local sliding windows.
An ultrasonic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the above ultrasonic heart rate detection method when executing the computer program.
A computer-readable storage medium storing a computer program which, when executed by a processor, implements the above ultrasonic heart rate detection method.
According to the ultrasonic heart rate detection method and device, the ultrasonic apparatus, and the storage medium, first feature points are extracted from the N frames of ultrasonic images, and global motion estimation is performed on all of them to determine the global motion information of each current ultrasonic image relative to the reference ultrasonic image; this global motion information effectively reflects the motion of the image as a whole and removes the interference that an unsteady probe imposes on that overall motion. Each current ultrasonic image is then reconstructed based on the global motion information, updating its pixel gray values so that the corrected ultrasonic image has an overall shape similar to the reference ultrasonic image; background motion is thereby effectively filtered out and the foreground features of the ultrasonic image are highlighted. Finally, the N-1 corrected ultrasonic images are differenced against the reference ultrasonic image to determine N-1 differential averages, on which heart rate conversion is performed to determine the corresponding heart rate chart.
This scheme requires no external equipment: heart rate calculation is carried out purely by the ultrasonic device in the most conventional B mode, improving the clinical efficiency of doctors. Because each current ultrasonic image is reconstructed from global motion information, a motion-filtering effect is achieved and unstable probe motion is effectively suppressed, so the doctor need not hold the probe especially still during heart rate chart detection; a more accurate detection result is obtained, operation is convenient, and the doctor can stay focused on the examination.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the invention; other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of an ultrasound device in an embodiment of the invention;
FIG. 2 is a flow chart of an ultrasonic heart rate detection method according to an embodiment of the invention;
FIG. 3 is a flowchart of step S202 in FIG. 2;
FIG. 4 is a flowchart of step S303 in FIG. 3;
FIG. 5 is a flowchart of step S304 in FIG. 3;
FIG. 6 is a flowchart of step S203 in FIG. 2;
FIG. 7 is a flowchart of step S204 in FIG. 2;
FIG. 8 is a flowchart of step S205 in FIG. 2;
FIG. 9 is a schematic diagram of an ultrasonic heart rate detection device according to an embodiment of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The ultrasonic heart rate detection method provided by the embodiments of the invention can be applied to the ultrasonic device shown in FIG. 1, which comprises a main controller and, connected to it, an ultrasonic probe, a beam synthesis processor, an image processor, and a display screen.
The main controller is the controller of the ultrasonic device. It is connected to the other functional modules of the device, including but not limited to the ultrasonic probe, the beam synthesis processor, the image processor, and the display screen, and controls the operation of each functional module.
The ultrasonic probe transmits and receives ultrasonic waves. In this example, to ensure that ultrasonic images at different angles have a large transverse scanning coverage, that is, a large overlapping range, the probe typically comprises a row of elongated piezoelectric transducers of equal size placed at equal intervals (each single piezoelectric transducer is called an array element); alternatively, the piezoelectric transducers are arranged in a two-dimensional array, i.e., the array elements form a two-dimensional matrix. Each piezoelectric transducer converts the voltage pulse excitation applied to it into mechanical vibration, emitting ultrasonic waves outward. As the ultrasonic waves propagate in a medium such as human tissue, echo analog signals such as reflected and scattered waves are generated; each piezoelectric transducer converts these echo analog signals into echo electric signals, amplifies them, converts them from analog to digital, and sends the resulting echo digital signals to the beam synthesis processor.
The beam synthesis processor is connected to the ultrasonic probe and receives the echo digital signals it sends, performs beam synthesis on the echo digital signals of one or more channels to obtain one or more echo-synthesized signals, and sends the echo-synthesized signals to the image processor.
The image processor is connected to the beam synthesis processor, receives the echo-synthesized signal from it, and performs image preprocessing operations such as data analysis, modulo, logarithmic compression, and spatial compounding. Data analysis parses the echo-synthesized signals into a two-dimensional matrix of sampling points by scan lines; since the signal at each position is complex-valued, the modulo operation is applied to obtain the signal energy, which represents the signal at that position for more intuitive imaging. Logarithmic compression applies a logarithmic transformation to the data after the modulo operation, compressing the dynamic range so that the tissue levels of the image are clearer. Afterwards, images of the same area transmitted and received at multiple angles are compounded by spatial compounding, which weakens coherence effects, reduces speckle noise, and improves the resolution of the whole image. After the image preprocessing operations, image post-processing operations such as gain and dynamic transformation are applied to the ultrasonic image; this processing does not differ substantially from conventional ultrasonic imaging and is not described in detail here.
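The modulo and logarithmic compression steps can be illustrated as follows; the 60 dB dynamic range and the normalization to the peak envelope are typical but assumed choices, not values specified in this document.

```python
import numpy as np

def log_compress(iq, dynamic_range_db=60.0):
    """Envelope detection (modulo) followed by logarithmic compression.

    iq : complex-valued beamformed samples (sampling points x scan lines)
    Maps the envelope into [0, 1] over the given dynamic range.
    """
    env = np.abs(iq)                                  # modulo: signal envelope/energy
    env = env / (env.max() + 1e-12)                   # normalize to the peak
    db = 20.0 * np.log10(np.maximum(env, 1e-12))      # to decibels
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
```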
As an example, the image processor may be a graphics processing unit (GPU), a processor designed to perform the mathematical and geometric calculations necessary for complex graphics rendering, which helps increase the efficiency of ultrasonic image generation. Dedicating the image processor to image processing frees the main controller from image-processing tasks so that it can execute more system tasks, improving the overall performance of the ultrasonic device.
In one embodiment, an ultrasonic heart rate detection method is provided. The method is applied, for example, to the image processor shown in fig. 1, and includes:
S201: acquiring Mi first feature points corresponding to each of N frames of ultrasonic images, wherein the N frames of ultrasonic images comprise 1 frame of reference ultrasonic image and N-1 frames of current ultrasonic images, and Mi is the number of first feature points in the i-th ultrasonic image;
S202: performing global motion estimation on each current ultrasonic image relative to the reference ultrasonic image based on the Mi first feature points corresponding to the N frames of ultrasonic images, and determining global motion information of each current ultrasonic image relative to the reference ultrasonic image;
S203: performing image reconstruction on each current ultrasonic image based on its global motion information relative to the reference ultrasonic image, to obtain a corrected ultrasonic image corresponding to each current ultrasonic image;
S204: performing differential processing on the N-1 frames of corrected ultrasonic images and the reference ultrasonic image to obtain a differential average value corresponding to each of the N-1 corrected ultrasonic images;
S205: performing heart rate conversion on the differential average values corresponding to the N-1 corrected ultrasonic images to obtain a heart rate chart corresponding to the N frames of ultrasonic images.
The first feature points are feature points extracted from each ultrasonic image; a feature point is a point with obvious features that facilitates subsequent global motion estimation. The reference ultrasonic image is an unprocessed ultrasonic image that is subsequently used as the reference. The current ultrasonic images are the unprocessed ultrasonic images other than the reference ultrasonic image. N is the number of ultrasonic images that need to be acquired and buffered, and is at least the number of image frames in one cardiac cycle.
As an example, in step S201, the image processor may acquire N frames of ultrasound images, perform feature point extraction on the N frames of ultrasound images, determine Mi first feature points corresponding to each frame of ultrasound image, where Mi is the number of first feature points extracted from the i-th frame of ultrasound image, where i=1, 2, …, N. In this example, in the N frames of ultrasound images, the number Mi of the first feature points extracted by each frame of ultrasound image may be the same or different.
For example, the image processor may perform Harris corner extraction on each ultrasonic image and determine the extracted Harris corners as the first feature points; it may also perform multi-scale Harris corner extraction on each ultrasonic image and determine the extracted multi-scale Harris corners as the first feature points. Both plain and multi-scale Harris corners achieve good results: the image features are obvious, subsequent operations are convenient, and the amount of calculation is small. The Harris corner extraction process is a basic image-processing algorithm and is not repeated here. The image processor may also extract SIFT feature points, or any other feature points with sufficient feature significance to facilitate subsequent global motion estimation, as the first feature points.
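The Harris corner response mentioned above can be sketched in a few lines of numpy; this is a minimal illustration (central-difference gradients and a 3x3 box window rather than the usual Gaussian window), with function name and constants chosen for the example rather than taken from the patent.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)^2 at every pixel.

    img: 2-D float array (a grayscale ultrasound frame).
    k:   empirical Harris sensitivity constant.
    """
    # Image gradients via central differences
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(a):
        # 3x3 box sum as the structure-tensor window (interior pixels only)
        out = np.zeros_like(a)
        out[1:-1, 1:-1] = sum(a[1 + dy:a.shape[0] - 1 + dy,
                                1 + dx:a.shape[1] - 1 + dx]
                              for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace ** 2

# A bright square on a dark background: the corners give the strongest response,
# straight edges give a negative response, flat regions give zero.
img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0
R = harris_response(img)
```

Pixels where `R` exceeds a fraction of `R.max()` would be kept as first feature points; a multi-scale variant repeats this over a pyramid of smoothed images.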
In this example, after acquiring the N frames of ultrasonic images collected and buffered over a period of time, the image processor may determine the 1st frame as the reference ultrasonic image and the other N-1 frames as the current ultrasonic images; other rules may also be used to select the 1 frame of reference ultrasonic image and the N-1 frames of current ultrasonic images.
For example, after obtaining each frame of ultrasonic image, the image processor may perform feature point extraction on it, determine the Mi first feature points corresponding to the i-th frame, and count the current buffered frame number; when the current buffered frame number reaches N frames, the subsequent global motion estimation is performed. Here N is the preset number of ultrasonic image frames to be buffered, at least the number of image frames in one cardiac cycle, and the current buffered frame number is the number of ultrasonic image frames buffered from the 1st frame up to the current time. Generally, the larger the number of buffered frames N, the better the effect, but also the larger the subsequent amount of calculation; the choice of N therefore trades off performance against effect, and in practice the value is selected by experiment on the specific hardware device.
The global motion estimation refers to motion estimation of all pixel points between two frames of images. Global motion information refers to information obtained after global motion estimation is performed on two frames of images.
As an example, in step S202, after buffering the N frames of ultrasonic images and their corresponding Mi first feature points, the image processor needs to use the Mi first feature points to calculate the global motion information of the N-1 frames of current ultrasonic images relative to the reference ultrasonic image; this global motion information reflects the motion characteristics of the whole image and is used to eliminate interference of probe instability with the overall motion of the image. In general, probe instability necessarily produces a uniform change of the whole image, which makes the first feature points of the whole image conform to a definite affine transformation model; global motion estimation is therefore performed on the first feature points of the N frames of ultrasonic images to determine this affine transformation model for subsequent image reconstruction.
The image reconstruction process refers to a process of changing the pixel gray value of an image to acquire a new image.
In step S203, after determining the global motion information of each current ultrasonic image relative to the reference ultrasonic image, the image processor may determine, based on that global motion information, the pixel position in the reference ultrasonic image of each current pixel point in each current ultrasonic image; interpolate the gray value at that pixel position from the pixel gray values of the pixel points in the reference ultrasonic image; and update the pixel gray values in the current ultrasonic image with the interpolated values. Reconstructing the pixel gray value of every pixel point in the current ultrasonic image in this way yields the corrected ultrasonic image corresponding to each current ultrasonic image, so that the overall shape of the corrected ultrasonic image is similar to that of the reference ultrasonic image, the background motion can be effectively filtered out, and the effective image features are retained.
As an example, in step S204, after acquiring the N-1 frames of corrected ultrasonic images whose overall shape is similar to that of the reference ultrasonic image, the image processor may perform differential processing on the N-1 frames of corrected ultrasonic images and the reference ultrasonic image. Specifically, a difference is calculated between the pixel gray value of each pixel point in each corrected ultrasonic image and the pixel gray value of the pixel point at the same position in the reference ultrasonic image, giving the gray difference value for each pixel point; the gray difference values are then averaged to determine the differential average value g corresponding to each corrected ultrasonic image.
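The per-frame differential average g described in step S204 reduces to a single numpy expression. The text does not state whether the signed or the absolute gray difference is averaged; the sketch below assumes the absolute difference, and the function name is illustrative.

```python
import numpy as np

def differential_average(corrected, reference):
    """Differential average g: mean absolute grey-level difference between
    one corrected ultrasound frame and the reference frame (same shape)."""
    diff = np.abs(corrected.astype(float) - reference.astype(float))
    return float(diff.mean())

# Uniform offset of 10 grey levels -> g = 10
g = differential_average(np.full((4, 4), 10.0), np.zeros((4, 4)))
```

Applied to each of the N-1 corrected frames, this produces the sequence of g values that drives the heart-rate conversion in step S205.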
As an example, in step S205, the image processor may perform heart rate conversion on the N-1 differential average values g to obtain a heart rate chart corresponding to the N frames of ultrasonic images. For example, a cardiac motion curve is drawn with the frame indices 1 to N-1 on the horizontal axis and the differential average value g on the vertical axis, and this curve is analysed and processed to form the heart rate chart corresponding to the N frames of ultrasonic images. In this example, the calculated heart rate chart may be displayed on the terminal display in synchronization with the original B-mode image.
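The text says only that the cardiac motion curve is "analysed and processed" into a heart-rate result; one plausible reading is to count the periodic peaks of the g curve against the acquisition frame rate. The sketch below is such an assumed peak-counting conversion, not the patent's specified algorithm; the simple local-maximum test and mean threshold are illustrative choices.

```python
import numpy as np

def heart_rate_bpm(g, frame_rate):
    """Estimate beats per minute from the differential-average curve g by
    counting local maxima that rise above the curve's mean value.

    g:          sequence of N-1 differential averages (one per corrected frame).
    frame_rate: acquisition frame rate in frames per second.
    """
    g = np.asarray(g, dtype=float)
    thresh = g.mean()
    # A peak is a sample strictly above its left neighbour, not below its
    # right neighbour, and above the mean-level threshold.
    peaks = [i for i in range(1, len(g) - 1)
             if g[i] > g[i - 1] and g[i] >= g[i + 1] and g[i] > thresh]
    duration_s = len(g) / frame_rate
    return 60.0 * len(peaks) / duration_s

# Synthetic curve: 2 beats across 60 frames at 30 fps -> 60 bpm
t = np.arange(60)
g = 1.0 + np.sin(2 * np.pi * 2 * t / 60)
bpm = heart_rate_bpm(g, 30.0)
```

A production version would low-pass the curve first and reject peaks spaced closer than a physiologically plausible beat interval.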
In this example, after each current ultrasonic image is reconstructed, the image processor can buffer the reconstructed corrected ultrasonic image; the corrected ultrasonic image is basically consistent with the background of the reference ultrasonic image while the image features of the foreground remain relatively prominent, which already achieves a certain filtering effect. After buffering each frame of corrected ultrasonic image, it may additionally be low-pass filtered in the time direction to reduce noise interference; the low-pass filter is not limited here, and a basic signal-processing filter may be used.
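Since the text leaves the temporal low-pass filter open, a moving-average (box) filter along the time axis is the simplest choice; the sketch below assumes that choice, and the function name, kernel length and zero-padded boundary handling are illustrative.

```python
import numpy as np

def temporal_lowpass(frames, k=3):
    """Length-k moving-average low-pass filter along the time axis of a
    stack of corrected frames (shape: time x H x W). Boundary frames are
    attenuated by the zero padding of 'same' convolution."""
    frames = np.asarray(frames, dtype=float)
    kernel = np.ones(k) / k
    # Filter every pixel's time series independently
    return np.apply_along_axis(
        lambda s: np.convolve(s, kernel, mode='same'), 0, frames)

out = temporal_lowpass(np.ones((5, 2, 2)))  # constant stack stays ~constant
```

Any other basic FIR/IIR low-pass would serve equally; the point is only that filtering runs along time, per pixel, not spatially.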
In this embodiment, first feature points are extracted from the N frames of ultrasonic images, global motion estimation is performed based on all the first feature points, and the global motion information of each current ultrasonic image relative to the reference ultrasonic image is determined, so that the global motion information can effectively reflect the motion characteristics of the whole image and interference of probe instability with the overall motion of the image is eliminated. Then, each current ultrasonic image is reconstructed based on the global motion information to update its pixel gray values, so that the overall shape of the corrected ultrasonic image is similar to that of the reference ultrasonic image, background motion can be effectively filtered out, and the foreground image features of the ultrasonic image are highlighted. Finally, differential processing is performed on the N-1 frames of corrected ultrasonic images and the reference ultrasonic image to determine N-1 differential average values, and heart rate conversion is performed on these N-1 differential average values to determine the corresponding heart rate chart. This scheme requires no external equipment: heart rate calculation can be carried out purely with the ultrasonic device in the most conventional B mode, improving the clinical operating efficiency of doctors. Because the current ultrasonic images are reconstructed based on global motion information, a motion filtering effect is achieved and interference from unstable probe motion is effectively eliminated, so the doctor does not need to hold the probe especially still during heart rate detection; a more accurate detection result can be obtained, the doctor's operation is facilitated, and the doctor's concentration is preserved.
In an embodiment, as shown in fig. 3, step S202, that is, performing global motion estimation on each current ultrasound image with respect to a reference ultrasound image based on Mi first feature points corresponding to N frames of ultrasound images, determines global motion information of each current ultrasound image with respect to the reference ultrasound image, includes:
S301: determining, based on the Mi first feature points corresponding to the N frames of ultrasonic images, P second feature points that exist in all N frames of ultrasonic images;
S302: performing affine transformation processing on each current ultrasonic image relative to the reference ultrasonic image based on the P second feature points to obtain an affine transformation matrix corresponding to each current ultrasonic image;
S303: performing feature screening on the P second feature points based on the affine transformation matrix corresponding to each current ultrasonic image, and determining Bi background feature points corresponding to each current ultrasonic image;
S304: determining global motion information of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image.
The second feature points are the first feature points that exist in all N frames of ultrasonic images.
As an example, in step S301, after acquiring the first feature points of the N frames of ultrasonic images, the image processor needs to perform feature matching on the Mi first feature points of the N frames of ultrasonic images using a feature matching algorithm, to determine the second feature points that exist in all N frames; the number of second feature points is denoted P. For example, if the 1st frame of ultrasonic image contains the first feature points A1, B1, C1 and D1, the 2nd frame contains A2, B2 and C2, and the 3rd frame contains A3, B3, C3, E3 and F3, then after feature matching it can be determined that the P coexisting second feature points are A, B and C.
In this example, the Mi first feature points corresponding to the N ultrasonic images are feature points computed separately from each single ultrasonic image; during motion, some first feature points may move out of the image area or even leave the current scanning plane, so feature matching must be performed on the first feature points of the N frames of ultrasonic images to determine the P second feature points that remain in the image area throughout. Since feature matching is a conventional image-processing technique with many basic algorithms implementing this step, the matching may be performed by, but is not limited to, the SAD (sum of absolute differences) algorithm, the SSD (sum of squared differences) algorithm, the NCC (normalized cross-correlation) algorithm, an optical flow algorithm, and the like, which are not described in detail herein.
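Of the matching options listed above, SSD block matching is the most compact to illustrate: the patch around a feature in one frame is compared against candidate patches in a search window of the other frame, and the minimum sum of squared differences wins. The sketch below is a minimal exhaustive version; the function name, patch size and search radius are illustrative assumptions.

```python
import numpy as np

def match_feature_ssd(ref, cur, pt, patch=3, search=5):
    """Locate the feature at pt=(row, col) of frame `ref` inside frame `cur`
    by exhaustive SSD block matching. Returns the best (row, col) in `cur`."""
    y, x = pt
    tpl = ref[y - patch:y + patch + 1, x - patch:x + patch + 1].astype(float)
    best, best_pos = np.inf, pt
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            win = cur[yy - patch:yy + patch + 1,
                      xx - patch:xx + patch + 1].astype(float)
            if win.shape != tpl.shape:
                continue  # candidate window fell off the image border
            ssd = float(np.sum((win - tpl) ** 2))
            if ssd < best:
                best, best_pos = ssd, (yy, xx)
    return best_pos

# Synthetic check: frame 2 is frame 1 shifted by (2 rows, 1 column),
# so the feature at (10, 10) should be found at (12, 11).
rng = np.random.default_rng(0)
ref = rng.random((24, 24))
cur = np.roll(ref, (2, 1), axis=(0, 1))
```

SAD replaces the squared term with an absolute value; NCC normalizes patches first, which is more robust to the gain differences common in ultrasound.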
Understandably, when feature matching is performed on the first feature points corresponding to the N frames of ultrasonic images to determine the P second feature points existing in all N frames, a given second feature point exists in every ultrasonic image but its position information may be the same or different from frame to frame. Moreover, the number P of second feature points is influenced by the number N of buffered frames: in general, the larger the number of buffered frames N, the smaller the number P of always-present second feature points that can be matched, but the more robust those second feature points are.
The affine transformation matrix is a matrix obtained by fitting the affine transformation parameters between two frames of images based on the second feature points.
As an example, in step S302, after obtaining the P second feature points, the image processor may perform affine transformation processing on each current ultrasonic image relative to the reference ultrasonic image under the basic assumption of an affine transformation, using the position information of the P second feature points, and fit the affine transformation parameters to obtain the affine transformation matrix corresponding to each current ultrasonic image. In this example, the affine transformation matrix may be fitted by, but not limited to, the least squares method to ensure fitting efficiency, or by RANSAC to make the fitting more robust. For example, if the P second feature points existing in all N frames of ultrasonic images are A, B and C, the 1st frame is determined as the reference ultrasonic image, and the 2nd to N-th frames are the current ultrasonic images, then the affine transformation matrix of the 2nd frame can be fitted using the second feature points A1/B1/C1 of the 1st frame and A2/B2/C2 of the 2nd frame, the affine transformation matrix of the 3rd frame can be fitted using A1/B1/C1 of the 1st frame and A3/B3/C3 of the 3rd frame, and so on, so that the affine transformation matrices of all N-1 frames of current ultrasonic images relative to the reference ultrasonic image can be fitted.
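The least-squares fit mentioned above has a closed form: stack the reference coordinates in homogeneous form and solve a linear system for the 2x3 affine matrix. The sketch below assumes that formulation; the function name is illustrative, and RANSAC (the robust option the text mentions) would simply wrap this estimator around random minimal point subsets and keep the consensus fit.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of a 2x3 affine matrix A mapping src -> dst.

    src, dst: (P, 2) arrays of second-feature-point coordinates in the
    reference frame and in one current frame respectively.
    For each point: dst_i ~= A[:, :2] @ src_i + A[:, 2].
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    X = np.hstack([src, np.ones((len(src), 1))])   # homogeneous source coords
    W, *_ = np.linalg.lstsq(X, dst, rcond=None)    # solves X @ W ~= dst
    return W.T                                      # 2x3 affine matrix

# Points related by a pure translation of (+3, +1): the fit recovers it.
src = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
dst = src + np.array([3.0, 1.0])
A = fit_affine(src, dst)
```

With at least three non-collinear correspondences the system is determined; more points over-determine it and the least-squares solution averages out localisation noise.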
The background feature points are feature points for reflecting the overall background motion.
As an example, in step S303, after determining the affine transformation matrix of each current ultrasound image with respect to the reference ultrasound image based on the P second feature points, the image processor may perform feature screening on the P second feature points based on the affine transformation matrix of each current ultrasound image with respect to the reference ultrasound image and the feature point coordinates of the P second feature points in the two ultrasound images, so as to screen Bi background feature points purely belonging to the overall background motion from the P second feature points, where Bi is the number of the background feature points determined in the i-th current ultrasound image. In this example, feature screening analysis may be performed on the P second feature points based on the feature point coordinates of the P second feature points in the reference ultrasound image, the feature point coordinates of the P second feature points in each current ultrasound image, and the affine transformation matrix corresponding thereto, so as to screen out background feature points capable of reflecting the overall background change.
As an example, in step S304, the image processor determines, based on the Bi background feature points, the feature point coordinates of each background feature point in each current ultrasonic image and in the reference ultrasonic image, and calculates the feature point displacements from the coordinates in the two frames. Using these feature point displacements, it then determines the pixel point displacements of all pixel points in the current ultrasonic image relative to the pixel points at the same positions in the reference ultrasonic image, and thus the global motion information of the current ultrasonic image relative to the reference ultrasonic image.
In this embodiment, matching processing is performed on the Mi first feature points corresponding to the N frames of ultrasonic images to determine P second feature points that exist in all the ultrasonic images; because the P second feature points exist in every ultrasonic image, they guarantee that an affine transformation matrix between two frames of ultrasonic images can be calculated. Feature screening is then performed on the P second feature points based on their feature point coordinates in the two frames of ultrasonic images and the affine transformation matrix between the two frames, so as to screen out from the P second feature points the feature points purely belonging to the overall background motion and determine them as background feature points, guaranteeing the stability of the background feature points and removing the interference of other feature points. Finally, according to the Bi background feature points, the pixel point displacements of all pixel points of the N-1 frames of current ultrasonic images relative to the reference ultrasonic image are determined and taken as the global motion information, so that the global motion information effectively reflects the motion of all pixel points in the two frames of ultrasonic images.
In an embodiment, as shown in fig. 4, step S303, that is, performing feature screening on P second feature points based on an affine transformation matrix corresponding to each current ultrasound image, determines Bi background feature points corresponding to each current ultrasound image, includes:
S401: determining feature distribution probability corresponding to each second feature point based on an affine transformation matrix corresponding to each current ultrasonic image, feature point coordinates of P second feature points in each current ultrasonic image and feature point coordinates in a reference ultrasonic image;
S402: determining a target distribution probability based on the feature distribution probabilities corresponding to the P second feature points;
S403: if a feature distribution probability is greater than or equal to the target distribution probability, determining the second feature point corresponding to that feature distribution probability as one of the Bi background feature points corresponding to each current ultrasonic image.
As an example, in step S401, after obtaining the affine transformation matrix corresponding to each frame of current ultrasonic image as determined from the P second feature points, the image processor may determine the feature distribution probability corresponding to each second feature point based on the feature point coordinates of the P second feature points in the reference ultrasonic image and in each current ultrasonic image, combined with the affine transformation matrix of each current ultrasonic image relative to the reference ultrasonic image.
For example, the image processor may process the feature point coordinates of the P second feature points in the reference ultrasonic image and in each frame of current ultrasonic image, together with the affine transformation matrix between them, using a Gaussian distribution, to determine the feature distribution probability corresponding to each second feature point:

p_i = exp( -|| A_k · pos_ref,i - pos_curr,i ||^2 / σ^2 )

wherein A_k is the affine transformation matrix of the k-th frame of current ultrasonic image relative to the reference ultrasonic image; pos_ref,i is the feature point coordinate of the i-th second feature point in the reference ultrasonic image; pos_curr,i is the feature point coordinate of the i-th second feature point in the k-th frame of current ultrasonic image; σ is an algorithm parameter of the Gaussian distribution, which can be determined experimentally; exp(·) is the exponential function, so that the whole expression is a Gaussian distribution function; and p_i is the feature distribution probability of the i-th second feature point. In general, if a second feature point completely satisfies the affine transformation relation, its calculated feature distribution probability p_i is 1; if a second feature point does not satisfy the affine transformation relation at all, its calculated feature distribution probability p_i approaches 0.
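The Gaussian feature-distribution probability can be computed for all P points at once: apply the affine matrix to the reference coordinates, measure the squared residual against the observed current-frame coordinates, and pass it through the exponential. The sketch below assumes the residual is scaled by σ² (the exact normalisation of the exponent is an assumption, since the original formula only specifies a Gaussian form); function name and σ default are illustrative.

```python
import numpy as np

def feature_probabilities(A, pos_ref, pos_curr, sigma=2.0):
    """Feature distribution probability of each second feature point:
        p_i = exp(-||A @ [ref_i, 1] - curr_i||^2 / sigma^2)

    A:        2x3 affine matrix of the current frame vs the reference frame.
    pos_ref:  (P, 2) second-feature-point coordinates in the reference frame.
    pos_curr: (P, 2) coordinates of the same points in the current frame.
    """
    pos_ref = np.asarray(pos_ref, float)
    pos_curr = np.asarray(pos_curr, float)
    ref_h = np.hstack([pos_ref, np.ones((len(pos_ref), 1))])  # homogeneous
    pred = ref_h @ A.T            # where the affine model sends each ref point
    d2 = np.sum((pred - pos_curr) ** 2, axis=1)
    return np.exp(-d2 / sigma ** 2)

# Identity affine: two points obey the model exactly (p = 1),
# one outlier moved far away (p ~ 0).
A = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
pos_ref = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
pos_curr = pos_ref.copy()
pos_curr[2] = [30.0, 30.0]
p = feature_probabilities(A, pos_ref, pos_curr)
```

Points that follow the global affine model (the probe-induced background motion) score near 1; points on independently moving cardiac tissue score near 0, which is exactly what the screening in step S403 exploits.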
The target distribution probability is the distribution probability threshold used to evaluate whether a feature point follows the overall background motion.
As an example, in step S402, the image processor analyses the feature distribution probabilities of the P second feature points to determine a distribution probability standard reflecting the overall background motion. For example, the P feature distribution probabilities may be averaged and the calculated mean determined as the target distribution probability; or the maximum feature distribution probability may be selected from the P feature distribution probabilities and the product of this maximum and a preset proportionality coefficient determined as the target distribution probability.
As an example, in step S403, the image processor may acquire the feature distribution probabilities p_i of the P second feature points in each current ultrasonic image and compare each of them with the calculated target distribution probability. If a feature distribution probability p_i is greater than or equal to the target distribution probability, it meets the distribution probability standard of the overall background motion, and the second feature point corresponding to p_i can be determined as a background feature point; conversely, if p_i is smaller than the target distribution probability, it does not meet the distribution probability standard of the overall background motion, and the corresponding second feature point is not a background feature point. Understandably, the Bi background feature points corresponding to each current ultrasonic image are selected from the P second feature points, and the numbers of background feature points in different current ultrasonic images may be the same or different.
For example, after calculating the feature distribution probability p_i of the i-th second feature point, the condition p_i ≥ ρ · max_j(p_j) may be checked, wherein max_j(p_j) is the maximum of the feature distribution probabilities of the P second feature points; ρ is a preset proportionality coefficient; and ρ · max_j(p_j) is the calculated target distribution probability. If p_i ≥ ρ · max_j(p_j), the second feature point corresponding to this feature distribution probability is determined as a background feature point.
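The screening rule above, with the target distribution probability taken as a fraction of the maximum, is a one-liner over the probability vector. The sketch below assumes that variant; the function name and the 0.5 default for the proportionality coefficient ρ are illustrative, since the text says only that ρ is preset.

```python
import numpy as np

def screen_background_points(probs, rho=0.5):
    """Return the indices of the second feature points whose feature
    distribution probability reaches the target probability
    rho * max_j(p_j), i.e. the background feature points."""
    probs = np.asarray(probs, float)
    target = rho * probs.max()           # target distribution probability
    return np.nonzero(probs >= target)[0]

# Two points track the affine background model well, two do not.
idx = screen_background_points([1.0, 0.9, 0.2, 0.05])
```

With the mean-based variant of step S402, `target` would instead be `probs.mean()`; the comparison logic is unchanged.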
In this embodiment, the feature distribution probability corresponding to each second feature point is determined based on the affine transformation matrices corresponding to the N-1 frames of current ultrasonic images; the corresponding target distribution probability is determined by analysing the feature distribution probabilities of all second feature points, ensuring the objectivity and accuracy of the target distribution probability; and the second feature points whose feature distribution probability is greater than or equal to the target distribution probability are determined as background feature points, screening out the background feature points that reflect the overall background motion.
In one embodiment, as shown in fig. 5, step S304, that is, determining global motion information of each current ultrasound image relative to the reference ultrasound image based on Bi background feature points corresponding to each current ultrasound image, includes:
S501: determining Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image;
S502: performing interpolation processing on all pixel points in the reference ultrasonic image based on the Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image, and determining the pixel point displacement corresponding to each pixel point;
S503: determining the pixel point displacements corresponding to all pixel points as the global motion information of each current ultrasonic image relative to the reference ultrasonic image.
The characteristic point displacement refers to the displacement of each background characteristic point between the characteristic point coordinates in the current ultrasonic image and the reference ultrasonic image.
As an example, in step S501, after determining Bi background feature points reflecting the overall background motion, the image processor may obtain feature point coordinates of each background feature point in the current ultrasound image and feature point coordinates in the reference ultrasound image, perform difference calculation on the feature point coordinates in the current ultrasound image and the feature point coordinates in the reference ultrasound image, and determine a feature point displacement of each background feature point, where the feature point displacement may be understood as a displacement of the background motion and is a relatively stable displacement. For example, feature point coordinates A1/B1/C1/D1 of 4 background feature points are determined in the reference ultrasound image, feature point coordinates A2/B2/C2/D2 of 4 background feature points are determined in the current ultrasound image, feature point displacement V (a) =a2-A1 of the background feature point a, feature point displacement V (B) =b2-B1 of the background feature point B, feature point displacement V (C) =c2-C1 of the background feature point C, and feature point displacement V (D) =d2-D1 of the background feature point D may be calculated.
As an example, in step S502, after obtaining the feature point displacement corresponding to the Bi background feature points of the i-th current ultrasound image relative to the reference ultrasound image, the image processor may perform interpolation processing on all the pixels in the reference ultrasound image based on the Bi background feature points, and specifically may perform interpolation processing by using, but not limited to, a bilinear interpolation algorithm, so as to determine the pixel point displacement of all the pixels in each current ultrasound image relative to all the pixels in the reference ultrasound image.
As an example, in step S503, the image processor determines the pixel point displacements corresponding to all pixel points as the global motion information of the current ultrasonic image relative to the reference ultrasonic image. For example, for two frames of ultrasonic images with a resolution of 640 × 480, after the four feature point displacements V(A), V(B), V(C) and V(D) are determined, the pixel point displacements of all pixel points (rather than only the feature points) in the reference ultrasonic image relative to the i-th current ultrasonic image can be obtained by interpolating from these four feature point displacements, so as to determine to which pixel position each pixel point of the reference ultrasonic image moves in the i-th current ultrasonic image, thereby acquiring the global motion information.
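When, as in the 640 × 480 example above, the four displacements sit at the image corners, the bilinear interpolation of step S502 densifies them into a per-pixel displacement field with pure array arithmetic. The sketch below assumes that corner layout; for arbitrarily placed background feature points, a scattered-data interpolator would be needed instead, and the function name is illustrative.

```python
import numpy as np

def dense_displacement(corner_disp, h, w):
    """Bilinearly interpolate four corner feature-point displacements
    (top-left, top-right, bottom-left, bottom-right, each a (dy, dx) pair)
    over every pixel, returning a dense (h, w, 2) displacement field."""
    vtl, vtr, vbl, vbr = [np.asarray(v, float) for v in corner_disp]
    ys = np.linspace(0.0, 1.0, h)[:, None, None]   # vertical blend weight
    xs = np.linspace(0.0, 1.0, w)[None, :, None]   # horizontal blend weight
    top = vtl * (1 - xs) + vtr * xs                # blend along the top edge
    bottom = vbl * (1 - xs) + vbr * xs             # blend along the bottom edge
    return top * (1 - ys) + bottom * ys            # blend top to bottom

# All four corners displaced by the same (3, 4): the whole field is constant,
# as expected for a pure probe translation.
field = dense_displacement([(3.0, 4.0)] * 4, 5, 6)
```

Each pixel of the resulting field tells where that reference-image pixel lands in the i-th current frame, which is exactly the global motion information consumed by step S203.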
In this embodiment, for the Bi background feature points of each current ultrasonic image relative to the reference ultrasonic image, the feature point coordinates of each background feature point in the current ultrasonic image and in the reference ultrasonic image are used to determine the feature point displacement of each background feature point between the two frames, a displacement which reflects the background motion. Interpolation processing is then performed on all pixel points in the reference ultrasonic image using the Bi feature point displacements, to determine the pixel position that each pixel point of the reference ultrasonic image reaches in the current ultrasonic image, thereby acquiring global motion information that effectively reflects the motion of all pixel points.
In an embodiment, as shown in fig. 6, step S203, that is, performing image reconstruction on each current ultrasound image based on global motion information of each current ultrasound image relative to a reference ultrasound image, obtains a corrected ultrasound image corresponding to each current ultrasound image, includes:
S601: correcting all current pixel points in each current ultrasonic image based on global motion information of each current ultrasonic image relative to a reference ultrasonic image, and determining corrected pixel points of all current pixel points in the reference ultrasonic image;
S602: interpolation processing is carried out on the reference gray values of the reference ultrasonic image based on the correction pixel points of all the current pixel points in the reference ultrasonic image, and correction gray values corresponding to all the correction pixel points are obtained;
S603: and acquiring a corrected ultrasonic image corresponding to each current ultrasonic image according to the corrected gray values corresponding to all the corrected pixel points.
The current pixel point refers to a pixel point in the current ultrasound image. The corrected pixel point is the pixel point in the reference ultrasound image to which a current pixel point is mapped based on the global motion information; because the global motion information includes the pixel point displacements of all pixel points, the corrected pixel point of the current ultrasound image in the reference ultrasound image can be determined from the current pixel point and the pixel point displacement corresponding to the same pixel point between the two frames of images.
As an example, in step S601, after acquiring the global motion information of each current ultrasound image relative to the reference ultrasound image, that is, the pixel point displacements of all pixel points between each current ultrasound image and the reference ultrasound image, the image processor may use these pixel point displacements to correct all current pixel points in the current ultrasound image and determine the corrected pixel point of each current pixel point in the reference ultrasound image. For example, given a background pixel point A1 in the reference ultrasound image and the current pixel point A2 at the same pixel position in the current ultrasound image, the displacement calculated from A1 and A2 is A_{1,2}; the pixel position of the current pixel point A2 in the reference ultrasound image is then calculated from A2 and the displacement A_{1,2}, and the pixel point at that position is determined as the corrected pixel point newA1, where newA1 = A2 - A_{1,2}.
The reference gray value refers to the gray value of each reference pixel point in the reference ultrasound image, where a reference pixel point is a pixel point of the reference ultrasound image. The corrected gray value is the gray value at the position of a corrected pixel point, obtained by interpolation using the reference gray values.
As an example, in step S602, after obtaining the corrected pixel points of all current pixel points in the reference ultrasound image, the image processor may determine several reference pixel points adjacent to each corrected pixel point in the reference ultrasound image and their corresponding reference gray values; then, combining the pixel point coordinates of the corrected pixel point with the coordinates and reference gray values of those adjacent pixel points, interpolation processing is performed on the reference gray values to determine the corrected gray value of the corrected pixel point. For example, if a current pixel point in the current ultrasound image has coordinates A1(5, 3) and its displacement is A_{1,2}(0.5, 0.4), the corrected pixel point of A1(5, 3) in the reference ultrasound image has coordinates newA1(4.5, 2.6). Since newA1(4.5, 2.6) does not fall on an integer pixel position, its gray value cannot be read directly from the reference ultrasound image; an interpolation algorithm, including but not limited to a bilinear interpolation algorithm, is therefore applied to the reference gray values of the four reference pixel points adjacent to newA1(4.5, 2.6) to determine the corrected gray value of the corrected pixel point newA1(4.5, 2.6).
As an example, in step S603, the image processor interpolates the gray value at the position of each corrected pixel point from the reference gray values corresponding to the reference pixel points of the reference ultrasound image, and after determining the corrected gray values of all corrected pixel points, may construct the corrected ultrasound image corresponding to the current ultrasound image based on those corrected gray values. That is, correcting the ultrasound image means updating the gray values of the pixel points in the current ultrasound image so that the overall shape of the image becomes more similar to that of the reference ultrasound image, while the image characteristics of the reference ultrasound image are preserved.
In this embodiment, the corrected pixel point of each current ultrasound image in the reference ultrasound image is determined from the global motion information and the current pixel points of that image; interpolation processing is then performed on the gray value at the position of each corrected pixel point using the reference gray values of the reference ultrasound image, and the corrected gray value of each corrected pixel point is determined; the corrected ultrasound image corresponding to the current ultrasound image is then determined based on the corrected gray values of all corrected pixel points, so that the overall shape of the corrected ultrasound image is similar to that of the reference ultrasound image while the image characteristics of the foreground of the reference ultrasound image are maintained.
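The reconstruction in steps S601 to S603 amounts to a backward warp with bilinear gray-value sampling, which can be sketched as below. This is a hedged illustration in Python with NumPy; the function name and the clipping at image borders are assumptions for the sketch, not details from the original disclosure.

```python
import numpy as np

def reconstruct(field, reference):
    """For each current pixel (y, x), the corrected position in the reference
    image is (y, x) minus its displacement (newA1 = A2 - A_{1,2}); the
    corrected gray value is bilinearly sampled from the reference gray values.

    field     -- (h, w, 2) per-pixel (dy, dx) displacements
    reference -- (h, w) reference gray values
    Returns the (h, w) corrected ultrasound image.
    """
    h, w = reference.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    cy = np.clip(ys - field[..., 0], 0, h - 1)   # corrected sub-pixel rows
    cx = np.clip(xs - field[..., 1], 0, w - 1)   # corrected sub-pixel cols
    y0 = np.floor(cy).astype(int); x0 = np.floor(cx).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    fy, fx = cy - y0, cx - x0
    ref = reference.astype(float)
    # blend the four neighbouring reference gray values
    top = (1 - fx) * ref[y0, x0] + fx * ref[y0, x1]
    bottom = (1 - fx) * ref[y1, x0] + fx * ref[y1, x1]
    return (1 - fy) * top + fy * bottom

ref = np.arange(16, dtype=float).reshape(4, 4)
still = reconstruct(np.zeros((4, 4, 2)), ref)
print(np.allclose(still, ref))   # True: zero displacement reproduces the reference
```

A pixel whose corrected position lands between grid points, such as newA1(4.5, 2.6) in the example above, receives a gray value blended from its four integer-coordinate neighbours, which is exactly the interpolation step S602 describes.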
In one embodiment, as shown in FIG. 7, step S204, i.e. performing differential processing on the N-1 frame corrected ultrasound image and the reference ultrasound image to obtain a differential average value corresponding to the N-1 corrected ultrasound images, includes:
S701: carrying out differential calculation on pixel gray values corresponding to all pixel points in each corrected ultrasonic image and pixel gray values corresponding to the same pixel point in the reference ultrasonic image, and determining gray difference values corresponding to all pixel points in each corrected ultrasonic image;
s702: the first d pixel points with larger absolute values of gray level difference values are determined as d target pixel points corresponding to each corrected ultrasonic image;
S703: and carrying out average value processing on the absolute values of the gray difference values corresponding to the d target pixel points, and obtaining a differential average value corresponding to each corrected ultrasonic image.
As an example, in step S701, after acquiring the N-1 frames of corrected ultrasound images, the image processor may perform differential calculation between the pixel gray value of each pixel point in each corrected ultrasound image and the pixel gray value of the same pixel point in the reference ultrasound image, so as to determine a differential map corresponding to that corrected ultrasound image, where the differential map contains the gray difference values of all pixel points. In this example, after acquiring the differential map corresponding to each current ultrasound image, the image processor may buffer it until the differential map corresponding to the N-th frame of current ultrasound image has been buffered, and then execute the subsequent steps.
As an example, in step S702, after obtaining the gray difference values of all pixel points, the image processor may determine the absolute value of the gray difference value of each pixel point in each corrected ultrasound image; sort the absolute gray difference values of all pixel points; and determine the d pixel points with the largest absolute gray difference values as the d target pixel points corresponding to each corrected ultrasound image, where d is the number of target pixel points, a preset algorithm parameter that can be determined experimentally.
As an example, in step S703, after obtaining the d target pixel points corresponding to each corrected ultrasound image, the image processor may determine the absolute value of the gray difference value of each target pixel point; then calculate the average of the absolute gray difference values of the d target pixel points and take the result as the differential average value g corresponding to that corrected ultrasound image, and so on, until the differential average values g corresponding to all N-1 frames of corrected ultrasound images are determined.
In this embodiment, differential calculation is performed between the pixel gray values of the same pixel point in each corrected ultrasound image and in the reference ultrasound image to determine the gray difference values of all pixel points; the d pixel points with the largest absolute gray difference values are then determined as target pixel points, so as to select the d pixel points whose gray values differ most between the two frames of ultrasound images, which effectively reflect the change of the foreground image characteristics in the ultrasound images; finally, the absolute gray difference values of the d target pixel points are averaged, so that the calculated differential average value g can effectively reflect the cardiac variation.
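Steps S701 to S703 can be condensed into a few lines. The sketch below, in Python with NumPy, is illustrative only; it selects the d largest absolute differences with a partial sort rather than the full sort described in the text, which yields the same result.

```python
import numpy as np

def diff_average(corrected, reference, d):
    """Differential average g for one corrected frame: take the absolute
    gray difference against the reference, keep the d pixels with the
    largest differences (the target pixels), and average them."""
    diff = np.abs(corrected.astype(float) - reference.astype(float)).ravel()
    top_d = np.partition(diff, diff.size - d)[-d:]   # d largest values
    return top_d.mean()

ref = np.zeros((4, 4))
cur = np.zeros((4, 4))
cur[0, 0] = 10   # foreground (cardiac) change
cur[1, 1] = 6
print(diff_average(cur, ref, 2))   # (10 + 6) / 2 = 8.0
```

Applied to each of the N-1 corrected frames in turn, this produces the sequence of differential averages g from which the heart motion curve is built.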
In one embodiment, as shown in fig. 8, step S205, namely performing heart rate conversion on the differential average values corresponding to the N-1 corrected ultrasound images to obtain the heart rate map corresponding to the N frames of ultrasound images, includes:
S801: constructing a heart motion curve based on the difference average value corresponding to the N-1 corrected ultrasonic images;
S802: sliding a local sliding window along the heart motion curve, performing local extremum calculation on the heart motion curve, and determining the number of extrema corresponding to each local sliding window;
S803: determining the measured heart rate value corresponding to each local sliding window based on the number of extrema corresponding to that local sliding window;
s804: and acquiring a heart rate chart corresponding to the N frames of ultrasonic images based on the actually measured heart rate values corresponding to all the local sliding windows.
As an example, in step S801, after obtaining the differential average values g corresponding to the N-1 frames of corrected ultrasound images, the image processor may construct a heart motion curve with the frame index (from 1 to N-1) on the horizontal axis and the differential average value g on the vertical axis. Since background motion in the ultrasound images is effectively filtered out after image reconstruction, the plotted heart motion curve can represent the motion rhythm of the heart, that is, the trend of cardiac variation.
As an example, in step S802, after constructing the heart motion curve, the image processor performs local extremum statistics by a sliding window method. Specifically, a local sliding window slides along the heart motion curve, and local extremum calculation is performed on the curve: when the local sliding window slides on the heart motion curve, the sliding window local area of the curve selected by the window is determined; extremum calculation is performed on the heart motion curve within that sliding window local area to determine all maxima and all minima in the area; and the total count of those maxima and minima is determined as the number of extrema corresponding to the sliding window local area, denoted Q.
In step S803, after determining the number of extrema Q corresponding to each local sliding window, the image processor may substitute Q into a heart rate calculation formula to determine the measured heart rate value corresponding to that sliding window. Consistent with the variables described, the heart rate calculation formula can be written as HR = (60 x frmRate x Q) / (2 x N), where Q is the number of extrema, N is the number of image frames covered by the window, frmRate is the image frame rate of the ultrasound device, HR is the calculated measured heart rate value, and the factor 2 accounts for each cardiac cycle contributing one maximum and one minimum.
As an example, in step S804, the image processor controls the local sliding window to slide along the horizontal axis of the heart motion curve and, at each sliding position, calculates the measured heart rate value corresponding to the sliding window local area defined by the window; after the local sliding window has slid from 0 to N-1 along the horizontal axis, the measured heart rate values corresponding to all sliding window local areas have been obtained; the heart rate map corresponding to the N frames of ultrasound images is then determined based on the measured heart rate values corresponding to all sliding window local areas.
In this embodiment, a heart motion curve is first constructed based on the N-1 differential average values g; a local sliding window then slides along the heart motion curve, and in each sliding step local extremum statistics are performed on the sliding window local area defined by the window to determine the number of extrema; the corresponding measured heart rate value is then calculated from the number of extrema, and finally the heart rate map corresponding to the N frames of ultrasound images is determined based on all measured heart rate values.
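The extremum counting and heart rate conversion above can be sketched as follows. This is an illustrative Python/NumPy implementation under stated assumptions: strict local extrema are counted, and the heart rate formula HR = 60 x frmRate x Q / (2N) is a plausible reconstruction of the garbled formula (each cardiac cycle contributing one maximum and one minimum), not a verbatim quotation of the patent.

```python
import numpy as np

def count_extrema(curve):
    """Number of strict local maxima plus strict local minima
    inside one sliding window of the heart motion curve."""
    c = np.asarray(curve, float)
    interior = c[1:-1]
    maxima = (interior > c[:-2]) & (interior > c[2:])
    minima = (interior < c[:-2]) & (interior < c[2:])
    return int(maxima.sum() + minima.sum())

def heart_rate(curve, frm_rate):
    """Measured heart rate (beats per minute) for one window:
    Q/2 beats occur in len(curve)/frm_rate seconds."""
    q = count_extrema(curve)
    n = len(curve)
    return 60.0 * frm_rate * q / (2.0 * n)

# Synthetic heart motion curve: a 2 Hz oscillation sampled at 40 frames/s,
# which should convert to 120 beats per minute.
t = np.arange(80) / 40.0
curve = np.sin(2 * np.pi * 2.0 * t)
print(heart_rate(curve, 40.0))   # 120.0
```

Sliding this calculation along the curve, one window position at a time, yields the sequence of measured heart rate values from which the heart rate map is drawn.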
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the order of execution of the processes should be determined by their functions and internal logic, and should not limit the implementation process of the embodiments of the present invention.
In an embodiment, an ultrasonic heart rate detection device is provided, where the ultrasonic heart rate detection device corresponds to the ultrasonic heart rate detection method in the above embodiment one by one. As shown in fig. 9, the ultrasonic heart rate detection device includes a first feature point acquisition module 901, a global motion information acquisition module 902, a corrected ultrasonic image acquisition module 903, a differential average acquisition module 904, and a heart rate map acquisition module 905. The functional modules are described in detail as follows:
the first feature point obtaining module 901 is configured to obtain Mi first feature points corresponding to N frames of ultrasound images, where the ultrasound images include 1 frame of reference ultrasound image and N-1 frames of current ultrasound images, and Mi is the number of first feature points in the ith ultrasound image;
the global motion information obtaining module 902 is configured to perform global motion estimation on each current ultrasound image with respect to the reference ultrasound image based on Mi first feature points corresponding to the N frames of ultrasound images, and determine global motion information of each current ultrasound image with respect to the reference ultrasound image;
The corrected ultrasound image obtaining module 903 is configured to perform image reconstruction on each current ultrasound image based on global motion information of each current ultrasound image relative to the reference ultrasound image, so as to obtain a corrected ultrasound image corresponding to each current ultrasound image;
the differential average value obtaining module 904 is configured to perform differential processing on the N-1 frame corrected ultrasound images and the reference ultrasound image, and obtain differential average values corresponding to the N-1 corrected ultrasound images;
the heart rate map obtaining module 905 is configured to perform heart rate conversion on the differential average values corresponding to the N-1 corrected ultrasound images, and obtain a heart rate map corresponding to the N frames of ultrasound images.
Preferably, the global motion information acquisition module 902 includes:
the second characteristic point determining unit is used for determining P second characteristic points existing in the N frames of ultrasonic images based on Mi first characteristic points corresponding to the N frames of ultrasonic images;
the affine transformation matrix acquisition unit is used for carrying out affine transformation processing on each current ultrasonic image relative to the reference ultrasonic image based on the P second characteristic points to acquire an affine transformation matrix corresponding to each current ultrasonic image;
the background feature point determining unit is used for carrying out feature screening on the P second feature points based on the affine transformation matrix corresponding to each current ultrasonic image, and determining Bi background feature points corresponding to each current ultrasonic image;
And the global motion information determining unit is used for determining global motion information of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background characteristic points corresponding to each current ultrasonic image.
Preferably, the background feature point determination unit includes:
the characteristic distribution probability determining subunit is used for determining the characteristic distribution probability corresponding to each second characteristic point based on the affine transformation matrix corresponding to each current ultrasonic image, the characteristic point coordinates of the P second characteristic points in each current ultrasonic image and the characteristic point coordinates in the reference ultrasonic image;
the target distribution probability determining subunit is used for determining the target distribution probability based on the feature distribution probabilities corresponding to the P second feature points;
and the background characteristic point determining subunit is used for determining the second characteristic points corresponding to the characteristic distribution probability as Bi background characteristic points corresponding to each current ultrasonic image if the characteristic distribution probability is greater than or equal to the target distribution probability.
Preferably, the global motion information determining unit includes:
the characteristic point displacement determination subunit is used for determining Bi characteristic point displacements of each current ultrasonic image relative to the reference ultrasonic image based on Bi background characteristic points corresponding to each current ultrasonic image;
The pixel point displacement determination subunit is used for carrying out interpolation processing on all pixel points in the reference ultrasonic image based on Bi characteristic point displacements of each current ultrasonic image relative to the reference ultrasonic image, and determining pixel point displacements corresponding to all pixel points;
and the global motion information determining subunit is used for determining the pixel point displacement corresponding to all the pixel points as global motion information of each current ultrasonic image relative to the reference ultrasonic image.
Preferably, the corrected ultrasound image acquisition module 903 comprises:
the correction pixel point determining unit is used for correcting all current pixel points in each current ultrasonic image based on global motion information of each current ultrasonic image relative to the reference ultrasonic image, and determining correction pixel points of all current pixel points in the reference ultrasonic image;
the correction gray value acquisition unit is used for carrying out interpolation processing on the reference gray value of the reference ultrasonic image based on the correction pixel points of all the current pixel points in the reference ultrasonic image to acquire correction gray values corresponding to all the correction pixel points;
and the corrected ultrasonic image acquisition unit is used for acquiring corrected ultrasonic images corresponding to each current ultrasonic image according to the corrected gray values corresponding to all the corrected pixel points.
Preferably, the differential average value obtaining module 904 includes:
the gray difference value determining unit is used for performing differential calculation on the pixel gray values corresponding to all the pixel points in each corrected ultrasonic image and the pixel gray values corresponding to the same pixel point in the reference ultrasonic image, and determining the gray difference value corresponding to all the pixel points in each corrected ultrasonic image;
the target pixel point determining unit is used for determining the d pixel points with the largest absolute values of gray difference values as the d target pixel points corresponding to each corrected ultrasonic image;
and the differential average value acquisition unit is used for carrying out average value processing on the absolute values of the gray level difference values corresponding to the d target pixel points to acquire a differential average value corresponding to each corrected ultrasonic image.
Preferably, the heart rate map acquisition module 905 comprises:
the heart motion curve construction unit is used for constructing a heart motion curve based on the difference average value corresponding to the N-1 corrected ultrasonic images;
the extremum quantity determining unit is used for sequentially sliding on the heart motion curve by adopting the local sliding windows, carrying out local extremum calculation on the heart motion curve and determining the extremum quantity corresponding to each local sliding window;
the measured heart rate value determining unit is used for determining the measured heart rate value corresponding to each local sliding window based on the number of extrema corresponding to each local sliding window;
And the heart rate graph acquisition unit is used for acquiring the heart rate graph corresponding to the N frames of ultrasonic images based on the actually measured heart rate values corresponding to all the local sliding windows.
For specific limitations of the ultrasonic heart rate detection device, reference may be made to the above limitations of the ultrasonic heart rate detection method, and no further description is given here. The modules in the ultrasonic heart rate detection device can be realized in whole or in part by software, hardware and a combination thereof. The above modules can be embedded in hardware or independent from a processor in the ultrasonic device, or can be stored in a memory in the ultrasonic device in software form, so that the processor can call and execute the operations corresponding to the above modules.
In an embodiment, an ultrasound apparatus is provided, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor executes the computer program to implement the method for detecting an ultrasound heart rate in the above embodiment, for example, S201-S205 shown in fig. 2, or S201-S205 shown in fig. 3-8, which are not repeated herein. Alternatively, the processor may implement the functions of the modules/units in this embodiment of the ultrasonic heart rate detection apparatus when executing the computer program, for example, the functions of the first feature point acquisition module 901, the global motion information acquisition module 902, the corrected ultrasonic image acquisition module 903, the differential average value acquisition module 904, and the heart rate map acquisition module 905 shown in fig. 9, which are not described herein again for avoiding repetition.
In an embodiment, a computer readable storage medium is provided, and a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the method for detecting an ultrasonic heart rate in the above embodiment is implemented, for example, S201 to S205 shown in fig. 2, or S201 to S205 shown in fig. 3 to 8, which are not repeated herein. Alternatively, the functions of the modules/units in this embodiment of the above-mentioned ultrasonic heart rate detection device, such as the functions of the first feature point acquisition module 901, the global motion information acquisition module 902, the corrected ultrasonic image acquisition module 903, the differential average value acquisition module 904, and the heart rate map acquisition module 905 shown in fig. 9, are implemented when the computer program is executed by a processor, and are not repeated herein.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention.

Claims (6)

1. An ultrasonic heart rate detection method, comprising:
acquiring Mi first characteristic points corresponding to N frames of ultrasonic images, wherein the ultrasonic images comprise 1 frame of reference ultrasonic image and N-1 frames of current ultrasonic images, and Mi is the number of the first characteristic points in the ith ultrasonic image;
Global motion estimation is carried out on each current ultrasonic image relative to the reference ultrasonic image based on Mi first feature points corresponding to the ultrasonic images of N frames, and global motion information of each current ultrasonic image relative to the reference ultrasonic image is determined;
based on global motion information of each current ultrasonic image relative to the reference ultrasonic image, carrying out image reconstruction on each current ultrasonic image to obtain a corrected ultrasonic image corresponding to each current ultrasonic image;
performing differential processing on the N-1 frames of corrected ultrasonic images and the reference ultrasonic image to obtain differential average values corresponding to the N-1 corrected ultrasonic images;
performing heart rate conversion on the differential average value corresponding to the N-1 corrected ultrasonic images to obtain heart rate graphs corresponding to N frames of ultrasonic images;
the global motion estimation is performed on each current ultrasonic image relative to the reference ultrasonic image based on Mi first feature points corresponding to the ultrasonic images of the N frames, and global motion information of each current ultrasonic image relative to the reference ultrasonic image is determined, including:
based on Mi first feature points corresponding to the N frames of ultrasonic images, P second feature points of the N frames of ultrasonic images are determined;
Carrying out affine transformation processing on each current ultrasonic image relative to the reference ultrasonic image based on the P second characteristic points to obtain an affine transformation matrix corresponding to each current ultrasonic image;
based on affine transformation matrixes corresponding to the current ultrasonic images, carrying out feature screening on the P second feature points, and determining Bi background feature points corresponding to the current ultrasonic images;
determining global motion information of each current ultrasonic image relative to the reference ultrasonic image based on Bi background feature points corresponding to each current ultrasonic image;
the feature screening is performed on the P second feature points based on the affine transformation matrix corresponding to each current ultrasonic image, and the determination of Bi background feature points corresponding to each current ultrasonic image includes:
determining feature distribution probability corresponding to each second feature point based on an affine transformation matrix corresponding to each current ultrasonic image, feature point coordinates of P second feature points in each current ultrasonic image and feature point coordinates in a reference ultrasonic image;
determining target distribution probability based on the feature distribution probability corresponding to the P second feature points;
If the feature distribution probability is greater than or equal to the target distribution probability, determining second feature points corresponding to the feature distribution probability as Bi background feature points corresponding to each current ultrasonic image;
wherein the performing image reconstruction on each current ultrasonic image based on the global motion information of each current ultrasonic image relative to the reference ultrasonic image, to obtain a corrected ultrasonic image corresponding to each current ultrasonic image, comprises:
correcting all current pixel points in each current ultrasonic image based on the global motion information of each current ultrasonic image relative to the reference ultrasonic image, and determining corrected pixel points of all the current pixel points in the reference ultrasonic image;
performing interpolation processing on the reference gray values of the reference ultrasonic image based on the corrected pixel points of all the current pixel points in the reference ultrasonic image, to obtain corrected gray values corresponding to all the corrected pixel points;
obtaining the corrected ultrasonic image corresponding to each current ultrasonic image according to the corrected gray values corresponding to all the corrected pixel points;
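For illustration only: the reconstruction step maps every current pixel through the global motion field into the reference image and interpolates the reference gray values at the corrected positions. The displacement sign convention and the bilinear kernel in this sketch are assumptions; the claims only require interpolation.

```python
import numpy as np

def reconstruct_corrected_image(motion_y, motion_x, reference):
    """Resample the reference frame at each current pixel's corrected position.

    motion_y, motion_x : (H, W) per-pixel displacements of the current frame
        relative to the reference frame (sign convention assumed)
    reference          : (H, W) gray-level array of the reference frame
    """
    h, w = reference.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # corrected pixel positions in the reference image, clipped to its bounds
    ref_y = np.clip(yy + motion_y, 0, h - 1)
    ref_x = np.clip(xx + motion_x, 0, w - 1)
    # bilinear interpolation of the reference gray values
    y0 = np.floor(ref_y).astype(int)
    x0 = np.floor(ref_x).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy, wx = ref_y - y0, ref_x - x0
    top = reference[y0, x0] * (1 - wx) + reference[y0, x1] * wx
    bot = reference[y1, x0] * (1 - wx) + reference[y1, x1] * wx
    return top * (1 - wy) + bot * wy
```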
wherein the performing differential processing on the N-1 frames of corrected ultrasonic images and the reference ultrasonic image, to obtain differential average values corresponding to the N-1 corrected ultrasonic images, comprises:
performing differential calculation between the pixel gray values corresponding to all pixel points in each corrected ultrasonic image and the pixel gray values corresponding to the same pixel points in the reference ultrasonic image, and determining gray difference values corresponding to all the pixel points in each corrected ultrasonic image;
determining the d pixel points with the largest absolute gray difference values as d target pixel points corresponding to each corrected ultrasonic image;
performing average value processing on the absolute gray difference values corresponding to the d target pixel points, to obtain the differential average value corresponding to each corrected ultrasonic image;
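For illustration only, a minimal numpy sketch of the differential average: keep the d largest absolute gray differences and average them (the value of d is a tunable parameter not fixed by the claims).

```python
import numpy as np

def differential_average(corrected, reference, d=100):
    """Mean of the d largest absolute gray differences between a corrected
    frame and the reference frame."""
    diff = np.abs(corrected.astype(float) - reference.astype(float)).ravel()
    if d >= diff.size:
        top = diff                                   # fewer pixels than d: use all
    else:
        top = diff[np.argpartition(diff, -d)[-d:]]   # d largest values, unordered
    return top.mean()
```

Restricting the average to the d most-changed pixels concentrates the measure on the moving cardiac region rather than the static background.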
wherein the performing heart rate conversion on the differential average values corresponding to the N-1 corrected ultrasonic images, to obtain a heart rate chart corresponding to the N frames of ultrasonic images, comprises:
constructing a heart motion curve based on the differential average values corresponding to the N-1 corrected ultrasonic images;
sliding a local sliding window along the heart motion curve, performing local extremum calculation on the heart motion curve, and determining the number of extremum values corresponding to each local sliding window;
determining a measured heart rate value corresponding to each local sliding window based on the number of extremum values corresponding to that local sliding window;
obtaining the heart rate chart corresponding to the N frames of ultrasonic images based on the measured heart rate values corresponding to all the local sliding windows.
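For illustration only, one way to realize the sliding-window conversion: count interior local maxima per window and scale the count by the window duration. The window length, step, frame rate, and strict-maximum definition are assumptions; the claims only require local extremum calculation per window.

```python
import numpy as np

def heart_rate_per_window(curve, window, step, frame_rate):
    """Count interior local maxima of the heart motion curve in each
    sliding window and convert each count to beats per minute.

    frame_rate : acquisition frame rate in frames/second (assumed known)
    """
    rates = []
    for start in range(0, len(curve) - window + 1, step):
        seg = np.asarray(curve[start:start + window], dtype=float)
        # a local maximum is strictly higher than both neighbours
        peaks = np.sum((seg[1:-1] > seg[:-2]) & (seg[1:-1] > seg[2:]))
        seconds = window / frame_rate
        rates.append(60.0 * peaks / seconds)   # beats per minute
    return rates
```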
2. The ultrasonic heart rate detection method according to claim 1, wherein the determining global motion information of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image comprises:
determining Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image;
performing interpolation processing on all pixel points in the reference ultrasonic image based on the Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image, and determining pixel point displacements corresponding to all the pixel points;
determining the pixel point displacements corresponding to all the pixel points as the global motion information of each current ultrasonic image relative to the reference ultrasonic image.
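For illustration only: claim 2 interpolates sparse feature point displacements into a dense per-pixel motion field. The sketch below uses inverse-distance weighting; the claims do not specify the interpolation kernel, so that choice is an assumption.

```python
import numpy as np

def dense_motion_field(bg_points, displacements, shape, eps=1e-6):
    """Interpolate sparse background-feature displacements to every pixel.

    bg_points     : (B, 2) array of (y, x) feature coordinates
    displacements : (B, 2) array of (dy, dx) feature point displacements
    shape         : (height, width) of the ultrasonic image
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    grid = np.stack([yy.ravel(), xx.ravel()], axis=1).astype(float)  # (H*W, 2)
    # distance from every pixel to every background feature point
    dist = np.linalg.norm(grid[:, None, :] - bg_points[None, :, :], axis=2)
    weights = 1.0 / (dist + eps)                  # inverse-distance weights
    weights /= weights.sum(axis=1, keepdims=True)
    dense = weights @ displacements               # (H*W, 2) per-pixel displacement
    return dense[:, 0].reshape(h, w), dense[:, 1].reshape(h, w)
```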
3. An ultrasonic heart rate detection device, comprising:
a first feature point acquisition module, configured to acquire Mi first feature points corresponding to N frames of ultrasonic images, wherein the N frames of ultrasonic images comprise 1 frame of reference ultrasonic image and N-1 frames of current ultrasonic images, and Mi is the number of first feature points in the i-th ultrasonic image;
a global motion information acquisition module, configured to perform global motion estimation on each current ultrasonic image relative to the reference ultrasonic image based on the Mi first feature points corresponding to the N frames of ultrasonic images, and determine global motion information of each current ultrasonic image relative to the reference ultrasonic image;
a corrected ultrasonic image acquisition module, configured to perform image reconstruction on each current ultrasonic image based on the global motion information of each current ultrasonic image relative to the reference ultrasonic image, to obtain a corrected ultrasonic image corresponding to each current ultrasonic image;
a differential average value acquisition module, configured to perform differential processing on the N-1 frames of corrected ultrasonic images and the reference ultrasonic image, to obtain differential average values corresponding to the N-1 corrected ultrasonic images;
a heart rate chart acquisition module, configured to perform heart rate conversion on the differential average values corresponding to the N-1 corrected ultrasonic images, to obtain a heart rate chart corresponding to the N frames of ultrasonic images;
wherein the global motion information acquisition module comprises:
a second feature point determining unit, configured to determine P second feature points of the N frames of ultrasonic images based on the Mi first feature points corresponding to the N frames of ultrasonic images;
an affine transformation matrix acquisition unit, configured to perform affine transformation processing on each current ultrasonic image relative to the reference ultrasonic image based on the P second feature points, to obtain an affine transformation matrix corresponding to each current ultrasonic image;
a background feature point determining unit, configured to perform feature screening on the P second feature points based on the affine transformation matrix corresponding to each current ultrasonic image, and determine Bi background feature points corresponding to each current ultrasonic image;
a global motion information determining unit, configured to determine global motion information of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image;
wherein the background feature point determining unit comprises:
a feature distribution probability determining subunit, configured to determine a feature distribution probability corresponding to each second feature point based on the affine transformation matrix corresponding to each current ultrasonic image, the feature point coordinates of the P second feature points in each current ultrasonic image, and their feature point coordinates in the reference ultrasonic image;
a target distribution probability determining subunit, configured to determine a target distribution probability based on the feature distribution probabilities corresponding to the P second feature points;
a background feature point determining subunit, configured to determine, if a feature distribution probability is greater than or equal to the target distribution probability, the second feature point corresponding to that feature distribution probability as one of the Bi background feature points corresponding to each current ultrasonic image;
wherein the corrected ultrasonic image acquisition module comprises:
a corrected pixel point determining unit, configured to correct all current pixel points in each current ultrasonic image based on the global motion information of each current ultrasonic image relative to the reference ultrasonic image, and determine corrected pixel points of all the current pixel points in the reference ultrasonic image;
a corrected gray value acquisition unit, configured to perform interpolation processing on the reference gray values of the reference ultrasonic image based on the corrected pixel points of all the current pixel points in the reference ultrasonic image, to obtain corrected gray values corresponding to all the corrected pixel points;
a corrected ultrasonic image acquisition unit, configured to obtain the corrected ultrasonic image corresponding to each current ultrasonic image according to the corrected gray values corresponding to all the corrected pixel points;
wherein the differential average value acquisition module comprises:
a gray difference value determining unit, configured to perform differential calculation between the pixel gray values corresponding to all pixel points in each corrected ultrasonic image and the pixel gray values corresponding to the same pixel points in the reference ultrasonic image, and determine gray difference values corresponding to all the pixel points in each corrected ultrasonic image;
a target pixel point determining unit, configured to determine the d pixel points with the largest absolute gray difference values as d target pixel points corresponding to each corrected ultrasonic image;
a differential average value acquisition unit, configured to perform average value processing on the absolute gray difference values corresponding to the d target pixel points, to obtain the differential average value corresponding to each corrected ultrasonic image;
wherein the heart rate chart acquisition module comprises:
a heart motion curve construction unit, configured to construct a heart motion curve based on the differential average values corresponding to the N-1 corrected ultrasonic images;
an extremum number determining unit, configured to slide a local sliding window along the heart motion curve, perform local extremum calculation on the heart motion curve, and determine the number of extremum values corresponding to each local sliding window;
a measured heart rate value determining unit, configured to determine a measured heart rate value corresponding to each local sliding window based on the number of extremum values corresponding to that local sliding window;
a heart rate chart acquisition unit, configured to obtain the heart rate chart corresponding to the N frames of ultrasonic images based on the measured heart rate values corresponding to all the local sliding windows.
4. The ultrasonic heart rate detection device according to claim 3, wherein the global motion information determining unit comprises:
a feature point displacement determining subunit, configured to determine Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image based on the Bi background feature points corresponding to each current ultrasonic image;
a pixel point displacement determining subunit, configured to perform interpolation processing on all pixel points in the reference ultrasonic image based on the Bi feature point displacements of each current ultrasonic image relative to the reference ultrasonic image, and determine pixel point displacements corresponding to all the pixel points;
a global motion information determining subunit, configured to determine the pixel point displacements corresponding to all the pixel points as the global motion information of each current ultrasonic image relative to the reference ultrasonic image.
5. An ultrasound device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the ultrasound heart rate detection method according to any one of claims 1 to 2 when executing the computer program.
6. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the ultrasonic heart rate detection method according to any one of claims 1 to 2.
CN202310900911.9A 2023-07-21 2023-07-21 Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium Active CN116616817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310900911.9A CN116616817B (en) 2023-07-21 2023-07-21 Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN116616817A CN116616817A (en) 2023-08-22
CN116616817B true CN116616817B (en) 2023-10-03

Family

ID=87602922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310900911.9A Active CN116616817B (en) 2023-07-21 2023-07-21 Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116616817B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8989349B2 (en) * 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
US11304681B2 (en) * 2016-03-03 2022-04-19 Canon Medical Systems Corporation Ultrasonic diagnostic apparatus and image processing method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101238391A (en) * 2005-08-04 2008-08-06 皇家飞利浦电子股份有限公司 Motion compensation in functional imaging
WO2018060502A1 (en) * 2016-09-30 2018-04-05 Koninklijke Philips N.V. Ultrasound thermometry system with motion compensation and method of operation thereof
WO2019013217A1 (en) * 2017-07-12 2019-01-17 シャープ株式会社 Correction device, imaging device, control method for correction device, and control program
EP3492945A1 (en) * 2017-12-01 2019-06-05 Origin Wireless, Inc. Method, apparatus, and system for periodic motion detection and monitoring
EP3492944A1 (en) * 2017-12-01 2019-06-05 Origin Wireless, Inc. Apparatus, systems and methods for event detection and recognition based on a wireless signal
CN111374706A (en) * 2018-12-28 2020-07-07 深圳迈瑞生物医疗电子股份有限公司 Fetal heart rate display method, ultrasonic imaging device and storage medium
CN110555796A (en) * 2019-07-24 2019-12-10 广州视源电子科技股份有限公司 image adjusting method, device, storage medium and equipment
CN113288215A (en) * 2020-12-31 2021-08-24 深圳北芯生命科技股份有限公司 System and method for measuring cardiac cycle by using ultrasonic image
WO2022235162A1 (en) * 2021-05-07 2022-11-10 Medis Associated B.V. Method of determining a motion of a heart wall


Similar Documents

Publication Publication Date Title
JP5721311B2 (en) Contrast-enhanced ultrasound diagnostic imaging method, computer-readable storage medium, and contrast-enhanced ultrasound diagnostic imaging apparatus
US20070276245A1 (en) System And Method For Automated Boundary Detection Of Body Structures
JP5753798B2 (en) Ultrasonic diagnostic apparatus and its operating method
US7632231B2 (en) Ultrasonic strain imaging device and method providing parallel displacement processing
US20080009735A1 (en) Ultrasound diagnosis apparatus
JP4763588B2 (en) Ultrasonic diagnostic equipment
EP3255607B1 (en) Method and system for estimating motion between images, particularly in ultrasound spatial compounding
CN113240587B (en) Super-resolution scan conversion method, device, ultrasonic apparatus and storage medium
JPWO2006123729A1 (en) Ultrasonic diagnostic apparatus and image processing method thereof
WO2020010351A1 (en) Systems and methods for generating and estimating unknown and unacquired ultrasound data
CN112998750B (en) Ultrasonic image synthesis method and device, ultrasonic equipment and storage medium
US9384568B2 (en) Method and system for enhanced frame rate upconversion in ultrasound imaging
CN116616817B (en) Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium
CN112826533B (en) Ultrasonic imaging space compounding method and device, ultrasonic diagnostic apparatus and storage medium
CN115082320B (en) Ultrasonic image super-resolution processing method, ultrasonic device and storage medium
EP3639752A1 (en) Analyzing apparatus and analyzing program
JP4709937B2 (en) Ultrasonic diagnostic apparatus and image processing apparatus
CN112704517B (en) Method, system, equipment and storage medium for processing endometrium peristalsis ultrasonic image
CN111434311A (en) Ultrasonic imaging apparatus and image processing method
CN115330681A (en) Ultrasonic image measuring method, system, intelligent terminal and storage medium
CN111388012A (en) Method, device and system for detecting tissue hardness
US11141138B2 (en) Kalman filtering for flash artifact suppression in ultrasound imaging
CN116687445B (en) Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart
JP5190248B2 (en) Ultrasonic diagnostic equipment
CN114926468B (en) Ultrasonic image quality control method, ultrasonic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant