CN114240815A - Multithreading strain imaging method and device based on living body ultrasonic image - Google Patents

Multithreading strain imaging method and device based on living body ultrasonic image

Info

Publication number
CN114240815A
Authority
CN
China
Prior art keywords
strain
image
exh
period
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210003911.4A
Other languages
Chinese (zh)
Inventor
郭霞生
尹楚豪
章东
屠娟
Current Assignee
Nanjing University
Original Assignee
Nanjing University
Priority date
Filing date
Publication date
Application filed by Nanjing University
Priority to CN202210003911.4A
Publication of CN114240815A
Legal status: Pending

Classifications

    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction (G Physics; G06 Computing; G06T Image data processing or generation)
    • G06F 9/5061: Allocation of resources, e.g. of the central processing unit [CPU]; partitioning or combining of resources (G06F Electric digital data processing; G06F 9/46 Multiprogramming arrangements)
    • G06T 5/70: Denoising; smoothing (G06T 5/00 Image enhancement or restoration)
    • G06T 7/38: Registration of image sequences (G06T 7/00 Image analysis; G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration)
    • G06T 2207/10016: Video; image sequence (G06T 2207/10 Image acquisition modality)
    • G06T 2207/10132: Ultrasound image (G06T 2207/10 Image acquisition modality)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses a multithreading strain imaging method and device based on living body ultrasound images, belonging to the field of medical image processing. A living-body B-mode ultrasound image sequence is collected; the correlation features of the images are analyzed to determine the respiratory frequency, and the images are divided into different respiratory cycles and into phase periods synchronized across cycles. A computation thread is established for each phase period; after displacement correction and image registration, the threads compute strain in parallel and denoise according to the correlation among threads. Finally, the multithread strain results are fused to obtain accurate, high-time-density strain output. The method overcomes the artifacts and errors that physiological motion introduces into strain imaging in the prior art, obtains strain distributions over multiple motion-phase periods, and, after denoising and fusion that further exploit the spatial correlation of the strain distributions, yields an accurate, highly robust, high-time-density in-vivo strain distribution image.

Description

Multithreading strain imaging method and device based on living body ultrasonic image
Technical Field
The invention relates to the field of medical image processing, in particular to a multithreading strain imaging method and a multithreading strain imaging device based on a living body ultrasonic image.
Background
Ultrasound imaging is one of the four major image-based diagnostic technologies in clinical medicine; its safety, non-invasiveness, low burden on patients, and good real-time signal processing give it high clinical value. Ultrasound imaging has developed from one-dimensional to three-dimensional, from static to dynamic, and from structural to functional imaging; the capability of ultrasound diagnostic equipment has steadily grown, and the information it provides has become increasingly rich.
With the development of ultrasound imaging technology and equipment, a variety of diagnostic and analytical imaging algorithms based on ultrasound signals have appeared in succession, such as Doppler flow imaging, elastography, and thermal strain imaging based on B-mode signals. These derived algorithms quantify information such as the blood flow, stiffness, and temperature of target tissues, greatly enhancing the monitoring capability of B-mode imaging and enriching the basis for clinical diagnosis.
The accuracy and signal-to-noise ratio of strain imaging based on B-mode ultrasound signals depend heavily on the correlation between the pre- and post-compression images. Living-body motion such as breathing, heartbeat, and pulse causes severe artifacts in strain imaging and can even overwhelm the effective strain. Overcoming tissue motion and achieving image registration is therefore an urgent problem in the field of strain imaging.
In the prior art, some technical solutions have been proposed to overcome the physiological motion of a living body during imaging. For example, the invention titled "An ultrasound image generation system for generating intravascular ultrasound images" (filing date: 13 November 2018; application number: 201880087709.6) discloses intravascular ultrasound imaging keyed to a fixed motion state of the vessel wall, using intravascular pressure as the selection criterion: image signals whose pressure values differ from a reference pressure by less than a predefined deviation threshold are combined, reducing pulse-induced artifacts in the combined image. Its disadvantage is that additional blood-pressure measurement equipment is required, and synchronization between the manometer and the ultrasound imaging equipment, as well as noise in the manometer signal, may cause gating failures. As another example, the invention titled "An ultrasonic method for measuring temperature change of biological tissues based on a thermal expansion and gating algorithm" (filing date: 25 September 2017; application number: 201710876349) discloses an ultrasonic method for measuring tissue temperature change: matched images are selected from the B-mode time-sequence images, and an adaptive filter is trained on the noise signal to remove motion noise.
That scheme lacks a mechanism for judging the physiological motion state and motion cycle, so accumulated motion artifacts arise from improper image-matching intervals; image matching can only be performed in the single motion state of respiration, which is easily affected by noise in that single state, and the time density of the output strain results is low, which is unfavorable for the real-time requirements of clinical monitoring. A further invention, "A respiration-separated strain imaging method based on a living body ultrasound image" (filing date: 27 July 2020; application number: 202010731441.4), is a patent filed earlier by the present applicant; it solves the problems of judging the respiration cycle and performing image registration separately in the expiration and inspiration stages. However, the time intervals of its output results are not uniform enough, image registration and displacement correction cannot be performed separately in different phase periods, and the spatial correlation between results in different states is not exploited for strain denoising, so a certain distance remains to the requirements of clinical application.
In summary, outputting strain imaging that is highly accurate and meets clinical real-time requirements, without the assistance of additional equipment, while effectively suppressing the influence of motion on the image, is an urgent need in the art.
Disclosure of Invention
1. Problems to be solved
In the prior art, motion suppression is realized through gating assisted by additional equipment, the gating depends on a single motion phase, and the output results have low time density and poor real-time performance, often accompanied by artifacts and large errors. To address these defects, the invention provides a multithreading strain imaging method and device based on a living body ultrasound image: strain distributions over multiple motion-phase periods can be obtained, and after denoising and fusion that further exploit the spatial correlation of the strain distributions, an accurate, highly robust, high-time-density living-body strain distribution image is obtained.
2. Technical scheme
In order to overcome the problems, the technical scheme provided by the invention is as follows:
a multithreading strain imaging method based on a living body ultrasonic image comprises the following steps:
collecting an ultrasonic image of a living object to obtain a digital ultrasonic image sequence I;
a characteristic extraction step: extracting the texture features of the digital ultrasonic image sequence I;
a cycle dividing step: determining characteristic images of the expiration and inspiration states through the texture characteristics of the digital ultrasonic image sequence I, and dividing the digital ultrasonic image sequence I into a plurality of image sequences of expiration periods and a plurality of image sequences of inspiration periods;
a threshold-setting step: setting an expiration texture-feature threshold; if the texture feature of an image in an expiration period is greater than the expiration texture-feature threshold, placing the image into the expiration image sequence I_exh, and dividing the image sequence of each expiratory cycle into U1 phase periods by motion-phase division, U1 being the number of divided phase periods, to obtain the multithread subsets I_exh,u1 (u1 = 1, 2, …, U1); setting an inspiration texture-feature threshold; if the texture feature of an image in an inspiration period is greater than the inspiration texture-feature threshold, placing the image into the inspiration image sequence I_inh, and dividing the image sequence of each inspiration cycle into U2 phase periods, U2 being the number of divided phase periods, to obtain the multithread subsets I_inh,u2 (u2 = 1, 2, …, U2);
a registration step: determining a region of interest centered on the position of the target region; performing displacement calibration and image registration on the regions of interest of I_exh,u1 and I_inh,u2 to obtain the registered subsets Q_exh,u1 and Q_inh,u2; Q_exh,u1 and Q_inh,u2 are collectively denoted Q_u (u = 1, 2, …, U), U being the number of threads of Q_u, with U = U1 + U2;
a fusion step: computing in parallel the accumulated-strain image sequence of each thread of Q_u, performing time-domain interpolation on it, denoising based on strain correlation, and fusing the processed accumulated strains of the threads to obtain the final strain imaging result.
Furthermore, the texture features in the feature extraction step are the correlations between images, and the specific extraction method is as follows:
Taking the n = 1, 2, …, M-th frames of the N-frame digital ultrasound image sequence I in turn as the template frame I_ref, where N is the total frame number of the image sequence I and M is a positive integer such that the M frames cover at least one complete respiration period, and taking the remaining N - 1 frames as detection frames I_tar; calculating the cross-correlation coefficient between I_ref and I_tar yields M cross-correlation coefficient-time curves p_n(t).
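The construction of the correlation curves p_n(t) can be sketched as follows; the normalized cross-correlation coefficient used here is one admissible similarity measure among those the method allows, and the oscillating test sequence is purely synthetic:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-size images."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def correlation_curves(frames, m):
    """For each of the first m frames (templates), correlate against every frame.

    Returns an (m, N) array: row n is the curve p_n(t) of cross-correlation
    coefficients of template n against the whole sequence."""
    n_frames = len(frames)
    p = np.empty((m, n_frames))
    for n in range(m):
        for t in range(n_frames):
            p[n, t] = ncc(frames[n], frames[t])
    return p

# synthetic sequence: a bright blob oscillating vertically (breathing-like motion)
rng = np.random.default_rng(0)
N, H, W = 40, 32, 32
frames = []
for t in range(N):
    img = rng.normal(0, 0.05, (H, W))
    row = int(16 + 6 * np.sin(2 * np.pi * t / 10))  # 10-frame "breathing" period
    img[row - 3:row + 3, 12:20] += 1.0
    frames.append(img)
p = correlation_curves(frames, m=12)
```

For a real acquisition, `frames` would hold the B-mode images and `m` would be chosen so the templates span at least one breathing cycle; frames one full period apart then correlate strongly with the template, as the curve for template 0 shows.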
Furthermore, the specific implementation method of the cycle dividing step is as follows:
Within the interval of possible physiological motion frequencies, the characteristic images I_RE and I_RI of the expiration and inspiration states and the respiratory frequency f_res are determined from the spectrum P_n(f) of p_n(t). The calculation method is: denote the maximum spectral amplitude of P_n(f) as A_n; the template frame corresponding to the maximum of A_n (n = 1, 2, …, M) is I_RE, and the corresponding p_n(t) is denoted p_exh(t). After low-pass filtering p_exh(t), the detection frame corresponding to the minimum among the first M points on the curve is I_RI, and the p_n(t) obtained with I_RI as the template frame is denoted p_inh(t). The frequency corresponding to the maximum spectral amplitude of p_exh(t) is f_res. The correlation coefficient-time curve p_exh(t) corresponding to I_RE is filtered to obtain the smooth curve p'_exh(t); the times corresponding to all its peaks are taken as inspiration-period boundaries and the times corresponding to its valleys as expiration-period boundaries, thereby dividing the digital ultrasound image sequence I into image sequences of a plurality of expiration periods and a plurality of inspiration periods.
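A minimal sketch of the spectral selection of I_RE and f_res; the physiological band of 0.1 to 0.5 Hz used below is an assumed interval for illustration, not a value from the source:

```python
import numpy as np

def breathing_frequency(p, fs, band=(0.1, 0.5)):
    """Pick the template whose correlation curve has the strongest spectral peak
    inside the physiological band, and return (template_index, f_res).

    p  : (m, N) array of correlation-vs-time curves p_n(t)
    fs : frame rate in Hz; `band` is the assumed breathing-frequency interval."""
    m, n = p.shape
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # amplitude spectra of the mean-removed curves
    spectra = np.abs(np.fft.rfft(p - p.mean(axis=1, keepdims=True), axis=1))
    amps = spectra[:, mask].max(axis=1)            # A_n for each template
    n_re = int(np.argmax(amps))                    # index of exhalation template I_RE
    f_res = float(freqs[mask][np.argmax(spectra[n_re, mask])])
    return n_re, f_res

# synthetic curves: 0.25 Hz oscillation, last template has the largest amplitude
fs, n = 20.0, 400
t = np.arange(n) / fs
p = np.stack([(0.5 + 0.1 * k) * np.cos(2 * np.pi * 0.25 * t) for k in range(5)])
n_re, f_res = breathing_frequency(p, fs)
```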
Further, the method of selecting the peaks and valleys of p'_exh(t) is: the exhalation characteristic image I_RE is set as an anchor point; from the breathing frequency f_res, a search step Δt is designed with 1/f_res > Δt > 0.5/f_res; starting from the anchor point, in the directions of both increasing and decreasing time, valleys and peaks are searched for alternately on p'_exh(t).
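The anchor-based alternating extremum search can be sketched as below; the step 0.75/f_res is one choice inside the stated window 0.5/f_res < Δt < 1/f_res, and only the direction of increasing time is shown for brevity:

```python
import numpy as np

def alternating_extrema(curve, fs, anchor, f_res):
    """From an anchor sample, step through a smoothed correlation curve and
    alternately locate valleys and peaks, one per breathing half-cycle.

    Each extremum is refined inside a window of +/- half a step around the
    nominal landing point, so small irregularities do not derail the search."""
    step = int(round(0.75 / f_res * fs))   # search step inside (0.5/f_res, 1/f_res)
    half = step // 2
    extrema, idx, want_valley = [], anchor, True
    while idx + step < len(curve):
        lo, hi = idx + step - half, min(idx + step + half, len(curve) - 1)
        window = curve[lo:hi + 1]
        local = lo + (np.argmin(window) if want_valley else np.argmax(window))
        extrema.append((int(local), 'valley' if want_valley else 'peak'))
        idx, want_valley = local, not want_valley
    return extrema

fs, f_res = 20.0, 0.25
t = np.arange(0, 20, 1 / fs)
curve = np.cos(2 * np.pi * f_res * t)   # peaks at t = 0, 4, 8, ...; valleys at 2, 6, ...
found = alternating_extrema(curve, fs, anchor=0, f_res=f_res)
```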
Furthermore, in the threshold-setting step, the motion-phase division of the image sequence of each expiratory cycle is performed either at equal time intervals or by dividing the amplitude range of the correlation coefficient into set intervals.
Furthermore, the specific implementation method of the threshold-setting step is as follows:
At p_exh(t), within each expiratory cycle, the upper quartile of all cross-correlation coefficients is set as the expiration texture-feature threshold; for the points of p_exh(t) above this threshold, the corresponding images are placed into the expiration image sequence I_exh;
the image sequence of each expiration period is divided into U1 phase periods by motion-phase division, U1 being the number of divided phase periods, obtaining the multithread subsets I_exh,u1 (u1 = 1, 2, …, U1);
at p_inh(t), within each inspiration period, the upper quartile of all cross-correlation coefficients is set as the inspiration texture-feature threshold; for the points of p_inh(t) above this threshold, the corresponding images are placed into the inspiration image sequence I_inh;
the image sequence of each inspiration cycle is divided into U2 phase periods by motion-phase division, U2 being the number of divided phase periods, obtaining the multithread subsets I_inh,u2 (u2 = 1, 2, …, U2).
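The upper-quartile thresholding and phase-period division for one cycle can be sketched as follows, using the equal-time-interval variant of phase division; the frame numbering and curve shape are illustrative only:

```python
import numpy as np

def phase_subsets(p_cycle, frame_ids, n_phases):
    """Within one breathing cycle: keep the frames whose correlation coefficient
    is above the upper quartile, then split the kept frames into n_phases
    equal-time phase periods. Returns one list of frame ids per phase."""
    p_cycle = np.asarray(p_cycle)
    thresh = np.percentile(p_cycle, 75)          # upper-quartile texture threshold
    keep = np.where(p_cycle > thresh)[0]
    subsets = [[] for _ in range(n_phases)]
    for k in keep:
        # equal-time-interval phase assignment within the cycle
        phase = min(int(k * n_phases / len(p_cycle)), n_phases - 1)
        subsets[phase].append(frame_ids[k])
    return subsets

# one synthetic expiratory cycle of 20 frames, correlation rising then falling
p_cycle = np.concatenate([np.linspace(0.5, 1.0, 10), np.linspace(0.98, 0.5, 10)])
subsets = phase_subsets(p_cycle, frame_ids=list(range(100, 120)), n_phases=2)
```

Only the best-correlated frames near the quiescent part of the cycle survive the quartile gate, and they land in the phase subset matching their position in the cycle.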
Further, the specific implementation manner of the registration step is as follows:
The region of interest is determined with the position of the target region as its center, with size A × L and upper-left corner coordinate (a0, l0), where A is the axial length and L is the transverse length;
during each phase period u1 (u1 = 1, 2, …, U1) of the first expiratory cycle, the ROI of the image at the intermediate time is taken as the template frame Q_ref and placed into the subset Q_exh,u1. In the same phase period of the next cycle, for each frame, the region of size A × L with its upper-left corner at (a0 + δa, l0 + δl) is the detection frame Q_tar; δa and δl, the axial and transverse offsets, are varied over the ranges -A/3 to A/3 and -L/3 to L/3 respectively, and the δa and δl that maximize the two-dimensional cross-correlation coefficient between Q_tar and Q_ref are the displacements Δa and Δl of Q_tar. Q_tar is shifted by Δa in the axial direction and Δl in the transverse direction to obtain the displacement-corrected image Q'_tar. All the maximum cross-correlation coefficients are compared, and the Q'_tar corresponding to the largest of them is appended to the end of the subset Q_exh,u1 as the new Q_ref. With the new Q_ref, the same operation is performed on the next expiratory cycle, looping until all expiratory cycles are calibrated and matched;
the inspiration periods are treated in the same way to obtain the subsets Q_inh,u2 (u2 = 1, 2, …, U2).
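The displacement search of the registration step can be sketched as an exhaustive block search over the ±A/3, ±L/3 offset ranges; the scene and the applied shift below are synthetic:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d > 0 else 0.0

def register_roi(frame, q_ref, a0, l0):
    """Exhaustively search the ROI offset (da, dl) within +/- size/3 that
    maximizes the 2D cross-correlation with the template ROI q_ref.
    Returns (best_ncc, da, dl, corrected_roi)."""
    A, L = q_ref.shape
    best = (-2.0, 0, 0, None)
    for da in range(-A // 3, A // 3 + 1):
        for dl in range(-L // 3, L // 3 + 1):
            r0, c0 = a0 + da, l0 + dl
            if r0 < 0 or c0 < 0 or r0 + A > frame.shape[0] or c0 + L > frame.shape[1]:
                continue  # candidate ROI falls outside the frame
            roi = frame[r0:r0 + A, c0:c0 + L]
            c = ncc(roi, q_ref)
            if c > best[0]:
                best = (c, da, dl, roi)
    return best

# template: a textured patch; test frame: the same scene shifted by (2, -1)
rng = np.random.default_rng(1)
scene = rng.normal(0, 1, (64, 64))
a0, l0, A, L = 20, 20, 12, 12
q_ref = scene[a0:a0 + A, l0:l0 + L].copy()
shifted = np.roll(np.roll(scene, 2, axis=0), -1, axis=1)  # tissue moved down 2, left 1
c, da, dl, roi = register_roi(shifted, q_ref, a0, l0)
```

The recovered (da, dl) is the displacement (Δa, Δl) of the detection frame; the corrected ROI then serves as the new template for the next cycle.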
Further, the specific implementation manner of the fusion step is as follows:
For the image subsets Q_u, thread by thread, the instantaneous displacement distribution between temporally adjacent images is calculated with a block-matching algorithm; the instantaneous displacement distributions at different moments are accumulated to obtain the accumulated displacement D_u of each thread; the accumulated strain S_u is obtained from D_u by differentiation along the axial direction;
the set of time coordinates of each S_u is denoted t_u (u = 1, 2, …, U), and the union of all t_u is denoted t_all; time-domain cubic-spline interpolation is applied to the S_u obtained by each thread so that the strain result of every thread covers the time-coordinate set t_all;
a denoising function is designed based on the inter-thread correlation of the accumulated strain, and the strain distributions are denoised;
at each time point in t_all, the denoised accumulated strains S'_u of the threads are averaged to obtain the final strain imaging result S_all.
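A minimal sketch of the interpolate-and-average fusion. The method specifies time-domain cubic-spline interpolation; plain linear interpolation (np.interp) is used here to keep the sketch dependency-free, and a single scalar strain value per time instant stands in for the per-pixel strain image:

```python
import numpy as np

def fuse_threads(times, strains, t_all):
    """Resample each thread's accumulated-strain time series onto the common
    time grid t_all and average across threads. Note np.interp clamps outside
    each thread's own time span instead of extrapolating."""
    resampled = [np.interp(t_all, t_u, s_u) for t_u, s_u in zip(times, strains)]
    return np.mean(resampled, axis=0)

# two threads sampling the same underlying strain ramp at interleaved instants
t1, t2 = np.array([0.0, 2.0, 4.0]), np.array([1.0, 3.0, 5.0])
s1, s2 = 0.01 * t1, 0.01 * t2          # both threads see strain = 0.01 * t
t_all = np.union1d(t1, t2)             # union of thread time coordinates
s_all = fuse_threads([t1, t2], [s1, s2], t_all)
```

The fused output covers every instant any thread sampled, which is how the method raises the time density of the final strain result.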
Furthermore, the denoising function is designed based on the inter-thread correlation of the accumulated strain; the specific implementation of strain-distribution denoising is as follows:
At any moment, the inter-thread cross-correlation coefficients of S_u are calculated; if the cross-correlation coefficient between a given thread and the other threads is below a threshold γ, the strain output by that thread at the current moment is discarded; γ takes values in the range 0.4 to 0.9.
At that moment, a rectangular window R_u of size A0 × L0 is taken centered on pixel (i, j) of the strain image S_u, where A0 and L0 are the axial and transverse sizes of the rectangular window in pixels; for the strains S_v of the other threads (v ≠ u, v = 1, 2, …, U), the windows R_v at the corresponding positions are taken, and the average cross-correlation coefficient is calculated as

p̄_u(i, j) = (1 / (U - 1)) · Σ_{v ≠ u} COV(R_u, R_v) / (σ_Ru · σ_Rv)

where COV(R_u, R_v) denotes the covariance of R_u and R_v, and σ_Ru and σ_Rv are the standard deviations of R_u and R_v.
The denoising function q_u(i, j) is defined as:
[equation not legible in the source: q_u(i, j) is expressed in terms of the average cross-correlation coefficient p̄_u(i, j), the attenuation coefficient α, and the correlation-coefficient threshold p_th]
where α is the attenuation coefficient, whose value may range from 10^-2 to 10^2, and p_th is the correlation-coefficient threshold, whose value may range from 0.4 to 0.9.
Multiplying the denoising function q_u(i, j) with S_u yields the denoised strain image S'_u.
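Since the exact expression of q_u(i, j) is not legible in the source, the sketch below assumes one plausible form: full weight where the windowed inter-thread correlation reaches p_th, and an exponential falloff governed by α below it:

```python
import numpy as np

def denoise_by_thread_correlation(s_all_threads, u, win=5, alpha=1.0, p_th=0.6):
    """Weight each pixel of thread u's strain image by its agreement with the
    other threads, measured as the mean windowed correlation coefficient.
    The falloff exp(-alpha * (p_th - p_bar)) below p_th is an assumption, not
    the patent's exact function; border pixels are left unweighted for brevity."""
    S = s_all_threads            # (U, H, W) stack of accumulated strain images
    U, H, W = S.shape
    h = win // 2
    q = np.ones((H, W))
    for i in range(h, H - h):
        for j in range(h, W - h):
            Ru = S[u, i - h:i + h + 1, j - h:j + h + 1].ravel()
            cs = []
            for v in range(U):
                if v == u:
                    continue
                Rv = S[v, i - h:i + h + 1, j - h:j + h + 1].ravel()
                c = np.corrcoef(Ru, Rv)[0, 1]
                cs.append(0.0 if np.isnan(c) else c)
            p_bar = np.mean(cs)
            if p_bar < p_th:
                q[i, j] = np.exp(-alpha * (p_th - p_bar))  # assumed falloff
    return S[u] * q

# three threads agree on a smooth strain field; thread 0 has a noisy corner
rng = np.random.default_rng(2)
base = np.outer(np.linspace(0, 0.05, 20), np.ones(20))
threads = np.stack([base + rng.normal(0, 1e-4, base.shape) for _ in range(3)])
threads[0, :8, :8] += rng.normal(0, 0.05, (8, 8))     # uncorrelated noise patch
s0 = denoise_by_thread_correlation(threads, u=0)
```

Pixels where thread 0 disagrees with the other threads are attenuated, while pixels where all threads agree pass through unchanged.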
The invention also provides a multithreading strain imaging device based on a living body ultrasound image, comprising a computer, a B-mode ultrasound instrument, a linear-array probe, and a strain generating device;
the B-mode instrument is connected with the linear-array probe and is used for acquiring ultrasound images of the living object;
the strain generating device is used for creating displacement and strain distributions in the tissue;
the computer is connected to the B-mode instrument and the strain generating device respectively, controls both, and is configured to implement the multithreading strain imaging method based on a living body ultrasound image described above.
3. Advantageous effects
Compared with the prior art, the invention has the beneficial effects that:
In the multithreading strain imaging method based on a living body ultrasound image, the ultrasound image sequence is divided into a plurality of independent threads corresponding to different motion phases, and image registration, displacement compensation, and strain imaging are performed in parallel across the threads; strain distributions under multiple motion phases, with motion suppressed, can thus be obtained, avoiding strain imaging that depends on a single motion phase. Denoising the strain on the basis of the spatial correlation of the multithread strain distributions further improves the signal-to-noise ratio of the strain distribution and the robustness of the algorithm. The computations of the threads are mutually independent and parallel, giving an efficient data-processing and computation structure. After multithread strain denoising and fusion, the finally output strain has markedly improved precision and time density, which is of great value for clinical real-time monitoring.
Drawings
FIG. 1 is a schematic flow chart of a B-mode ultrasound image-based in-vivo multithread strain imaging method;
FIG. 2 is a schematic view of the apparatus of example 2;
FIG. 3 shows, for p_inh(t) in example 2, the threshold setting, motion-phase division, and per-phase image selection in the second inspiration period;
FIG. 4 is the distribution after registration of 4 threads in inhalation and exhalation states in example 2;
FIG. 5 is a cumulative thermal strain profile of a sequence of 4 thread images for the inspiratory phase and the expiratory phase of example 2;
FIG. 6 is a schematic diagram showing the spatial dependence of the thermal strain on each thread in example 2;
FIG. 7 is a diagram illustrating the cumulative thermal strain distribution of each thread after the denoising function is applied in example 2;
FIG. 8 is a strain-time curve of the maximum strain point after strain fusion for each thread in example 2;
reference numerals: 01. a computer; 02. a portable B ultrasonic instrument; 03. b ultrasonic imaging probe; 04. a microwave heater; 05. a microwave ablation needle.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments; moreover, the embodiments are not relatively independent, and can be combined with each other according to needs, so that a better effect is achieved. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
For a further understanding of the invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings and examples.
Example 1
With reference to fig. 1, a multithreading strain imaging method based on a living body ultrasound image of the present invention includes the following steps:
an acquisition step, namely acquiring an ultrasonic image of a living object to obtain a digital ultrasonic image sequence I:
B-mode ultrasound imaging is performed on the biological tissue of the target area to be examined, with the living object in its normal physiological motion state, and data are acquired at equal time intervals to obtain the digital image sequence I;
specifically, the B-mode instrument works in real-time imaging mode; the position, angle, and imaging parameters of the ultrasound probe are adjusted, and the probe and system parameters are fixed once the target-area tissue is clearly visible in the image. A displacement and strain distribution is formed in the tissue by applying external force, heat, or similar means. The B-mode digital images are acquired continuously at equal time intervals, with an acquisition duration of no less than 2 breathing cycles.
The ultrasound instrument used for real-time imaging and B-mode digital image acquisition may be a cart-type or a portable B-mode machine. Depending on the diagnostic site, the imaging probe may be an abdominal, craniocerebral, intracavitary, or cardiac probe; by geometry, a rectangular, arc, or circular probe; by beam-steering mode, a linear-scan, phased-array, or mechanical sector-scan probe. The imaging-system parameters include imaging depth, time-gain compensation, focus depth, transmit frequency, line density, and so on. The beam-transmission logic and the corresponding beamforming algorithm of the B-mode instrument are chosen according to the strain imaging algorithm, the motion rate of the scatterers, and the combined requirements of lateral resolution and frame rate, and include focused-beam imaging, plane-wave imaging, multi-beam imaging, synthetic-aperture imaging, and the like.
The sources of external force applied to the target-area tissue include quasi-static pressure applied externally, elastic waves formed in the tissue by low-frequency externally driven mechanical force, acoustic radiation force formed by a focused ultrasound field acting on the tissue, and shear waves generated by acoustic pulses and propagating laterally along the beam; heating means include radio-frequency ablation, microwave ablation, laser ablation, focused ultrasound ablation, and the like.
The living body is kept in a stable state while the images are collected; the breathing mode may be spontaneous breathing or ventilator-supported breathing, and the heartbeat may be a spontaneous heartbeat or one supported by a cardiac pacemaker. The acquisition covers image sequences of at least 2 physiological motion cycles, which facilitates effective spectrum analysis. The data form of the B-mode digital image may be the ultrasonic radio-frequency (RF) signal, the analytic signal of the RF signal after quadrature demodulation or Hilbert transform, the gray-scale signal obtained by envelope detection and compression of the signal, and so on.
A characteristic extraction step: extracting the texture features of the digital ultrasonic image sequence I:
In this embodiment, correlation calculation between images is used as the means of analyzing image texture information; texture-feature extraction is not necessarily limited to correlation calculation and may also be based on information such as the energy, contrast, and entropy of the images.
The first acquired images covering one respiratory cycle in the digital ultrasound image sequence I are taken in turn as the template frame I_ref,n (n being the image serial number), and the two-dimensional cross-correlation coefficient is computed against the remaining images, giving a family of curves p_n(t) of the cross-correlation coefficient over time. Specifically, the n = 1, 2, …, M-th of the N frames are taken in turn as the template frame I_ref (N being the total frame number of image sequence I, and M a positive integer such that the M frames cover at least one complete respiration cycle), and the remaining N - 1 frames serve as detection frames I_tar; computing the cross-correlation coefficient between I_ref and I_tar yields M cross-correlation coefficient-time curves p_n(t).
The specific method of two-dimensional correlation calculation is not limited to the normalized cross-correlation algorithm; known alternatives include the mean absolute difference, sum of absolute differences, sum of squared errors, and mean sum of squared errors algorithms, the sequential similarity detection algorithm, non-normalized cross-correlation, hybrid-sign correlation, polarity correlation, and the Meyr-Spies method. Taking normalized cross-correlation as an example, let the template-frame pixel matrix be I_ref and the detection-frame pixel matrix be I_tar; the cross-correlation coefficient p is then calculated as

p = COV(I_ref, I_tar) / (σ_Iref · σ_Itar)

where COV(I_ref, I_tar) denotes the covariance of I_ref and I_tar, and σ_Iref and σ_Itar denote their standard deviations.
A cycle dividing step: determining a characteristic image of an expiration state and an inspiration state through the texture characteristics of the digital ultrasonic image sequence I, and dividing the digital ultrasonic image sequence I into an image sequence of a plurality of expiration periods and an image sequence of a plurality of inspiration periods:
Within the interval of possible physiological motion frequencies, the characteristic images I_RE and I_RI of the expiration and inspiration states and the respiratory frequency f_res are determined from the spectrum P_n(f) of p_n(t). Specifically, denote the maximum spectral amplitude of P_n(f) as A_n; the template frame corresponding to the maximum of A_n (n = 1, 2, …, M) is I_RE, and the corresponding p_n(t) is denoted p_exh(t). After low-pass filtering p_exh(t), the detection frame corresponding to the minimum among the first M points on the curve is I_RI, and the p_n(t) obtained with I_RI as the template frame is denoted p_inh(t). The frequency corresponding to the maximum spectral amplitude of p_exh(t) is f_res.
The time-domain signal p_n(t) is transformed to obtain the spectrum P_n(f). Determining the breathing frequency requires high spectral resolution; spectrum-refinement methods that may be employed include the zoom fast Fourier transform (ZoomFFT), the chirp z-transform, and the Yip-Zoom transform, among others.
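The need for refined spectral resolution can be illustrated with a direct evaluation of the DTFT on a fine grid over the breathing band; this brute-force evaluation is a stand-in for the zoom FFT or chirp z-transform mentioned above, not an implementation of them:

```python
import numpy as np

def zoom_spectrum(x, fs, f_lo, f_hi, n_bins=256):
    """Evaluate the spectrum of x on a fine frequency grid restricted to
    [f_lo, f_hi]. Direct matrix evaluation of the DTFT: same values on the
    band as a refined transform would give, at O(N * n_bins) cost."""
    n = len(x)
    t = np.arange(n) / fs
    freqs = np.linspace(f_lo, f_hi, n_bins)
    # one DTFT sample per requested frequency
    spectrum = np.abs(np.exp(-2j * np.pi * freqs[:, None] * t[None, :]) @ x)
    return freqs, spectrum

# a 0.27 Hz "breathing" line: a plain FFT of 200 samples at 20 Hz resolves
# only 0.1 Hz, while the zoomed grid below has 0.002 Hz spacing
fs, n = 20.0, 200
t = np.arange(n) / fs
x = np.cos(2 * np.pi * 0.27 * t)
freqs, spec = zoom_spectrum(x - x.mean(), fs, 0.1, 0.5, n_bins=201)
f_res = freqs[np.argmax(spec)]
```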
The correlation coefficient-time curve p_exh(t) corresponding to I_RE is low-pass filtered; the times corresponding to all peaks of the filtered curve are taken as inspiration-period boundaries and the times corresponding to the valleys as expiration-period boundaries, and all images are divided accordingly into different periods of the expiration and inspiration states. Specifically, an FIR low-pass filter is designed to filter p_exh(t), yielding the smoothed curve p'_exh(t); the filter cut-off frequency is selected between f_res and 3·f_res. On p'_exh(t), an inspiration period lies between every two adjacent peaks and an expiration period between every two adjacent valleys. Additionally, there is an incomplete period from the starting time to the first extremum: if that extremum is a peak (valley), it is an incomplete inspiration (expiration) period; likewise, the span from the last extremum to the end is an incomplete period, an inspiration (expiration) period if that extremum is a peak (valley).
The smoothing of p_exh(t) into p'_exh(t) is not limited to FIR low-pass filters; other smoothing methods may be used, including mean filtering, median filtering, local regression, Savitzky-Golay filtering, and the like.
If the filtered curve p'_exh(t) is insufficiently smooth, peaks and valleys may be selected erroneously, which impairs accurate period division. To avoid this, the expiration characteristic image I_RE can be set as an anchor point and a search step Δt designed from the respiratory frequency f_res, e.g. 1/f_res > Δt > 0.5/f_res; starting from the anchor point and moving in the directions of increasing and decreasing time, valleys and peaks are searched for alternately on p'_exh(t).
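A minimal sketch of the period division in Python; a moving average stands in for the FIR low-pass filter, the anchor-based alternate search is reduced to plain local-extremum detection, and all numbers are illustrative:

```python
import numpy as np

def divide_periods(p_exh, fs, f_res):
    # Smooth p_exh(t) (moving average standing in for the FIR low-pass
    # filter with cutoff between f_res and 3*f_res), then take local maxima
    # as inspiration-period boundaries and local minima as expiration-period
    # boundaries.
    win = max(3, int(fs / (2.0 * f_res)) | 1)        # odd window length
    padded = np.pad(p_exh, win // 2, mode="edge")
    smooth = np.convolve(padded, np.ones(win) / win, mode="valid")
    mid = smooth[1:-1]
    peaks = np.where((mid > smooth[:-2]) & (mid > smooth[2:]))[0] + 1
    valleys = np.where((mid < smooth[:-2]) & (mid < smooth[2:]))[0] + 1
    return smooth, peaks, valleys

# Toy correlation curve: 10 s at 50 Hz with a 0.5 Hz respiratory ripple.
fs, f_res = 50.0, 0.5
t = np.arange(0.0, 10.0, 1.0 / fs)
p = 0.8 + 0.1 * np.cos(2.0 * np.pi * f_res * t)
_, peaks, valleys = divide_periods(p, fs, f_res)
```

On this toy curve the detected peaks fall near t = 2, 4, 6, 8 s (inspiration boundaries) and the valleys near t = 1, 3, 5, 7, 9 s (expiration boundaries).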
Threshold-setting step: an expiration texture-feature threshold is set; if the texture feature of an image in an expiration period is greater than the expiration texture-feature threshold, the image is placed into the expiration image sequence I_exh, and the image sequence of each expiration period is divided by motion phase into U_1 phase periods, U_1 being the number of phase periods, giving the multithread subsets I_exh,u1 (u1 = 1, 2, …, U_1). Similarly, an inspiration texture-feature threshold is set; if the texture feature of an image in an inspiration period is greater than the inspiration texture-feature threshold, the image is placed into the inspiration image sequence I_inh, and the image sequence of each inspiration period is divided into U_2 phase periods, U_2 being the number of phase periods, giving the multithread subsets I_inh,u2 (u2 = 1, 2, …, U_2). The specific process is:
In each expiration period of p_exh(t), the upper quartile of all cross-correlation coefficients is set as the expiration texture-feature threshold; for the points of p_exh(t) above this threshold, the corresponding images are placed into the expiration image sequence I_exh.
The image sequence of each expiration period is divided into U_1 phase periods, U_1 being the number of phase periods, giving the multithread subsets I_exh,u1 (u1 = 1, 2, …, U_1).
In each inspiration period of p_inh(t), the upper quartile of all cross-correlation coefficients is set as the inspiration texture-feature threshold; for the points of p_inh(t) above this threshold, the corresponding images are placed into the inspiration image sequence I_inh.
The image sequence of each inspiration period is divided into U_2 phase periods, U_2 being the number of phase periods, giving the multithread subsets I_inh,u2 (u2 = 1, 2, …, U_2).
The motion phases of the image sequence of each expiration period may be divided at equal time intervals or by setting amplitude intervals of the correlation coefficient. In general, the durations of a living body's inspiration and expiration states are unequal, so the phase counts U_1 and U_2 of the two states need not be equal either.
Here the texture-feature threshold is chosen as the upper quartile of the cross-correlation coefficients, sorted from small to large, within the period; the choice of threshold depends on the overall correlation level of the image sequence and is not necessarily limited to the upper quartile.
The thresholds are set per period of p_exh(t) and p_inh(t). For incomplete periods, when the length of an incomplete expiration or inspiration period is less than 0.9/f_res, the threshold of that period is taken to be the threshold of the adjacent complete period.
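The upper-quartile thresholding and equal-time phase division for one cycle can be sketched as follows; `frame_ids` and the toy ramp of correlation values are illustrative:

```python
import numpy as np

def threshold_and_phase(p_cycle, frame_ids, n_phases):
    # The upper quartile of the cycle's cross-correlation coefficients is the
    # texture-feature threshold; frames above it are kept and split into
    # n_phases equal-duration phase periods.
    thr = np.percentile(p_cycle, 75)
    edges = np.linspace(0, len(p_cycle), n_phases + 1)
    phases = [[] for _ in range(n_phases)]
    for i in np.where(p_cycle > thr)[0]:
        k = min(int(np.searchsorted(edges, i, side="right")) - 1, n_phases - 1)
        phases[k].append(frame_ids[i])
    return thr, phases

# Toy cycle: correlation rises linearly, so only the last quarter survives.
p_cycle = np.linspace(0.0, 1.0, 100)
thr, phases = threshold_and_phase(p_cycle, list(range(100)), 4)
```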
Registration step: a region of interest is determined centered on the target-region position; displacement calibration and image registration are performed on the region of interest of I_exh,u1 and I_inh,u2 to obtain the registered subsets Q_exh,u1 and Q_inh,u2. The union of Q_exh,u1 and Q_inh,u2 is denoted Q_u (u = 1, 2, …, U), where U = U_1 + U_2 is the number of threads of Q_u.
The specific process is: centered on the target-region position, a region of interest (ROI) of size A (axial) x L (lateral) is determined, with upper-left corner at coordinates (a_0, l_0).
In each phase period (u1 = 1, 2, …, U_1) of the first expiration period, the ROI of the image at the middle instant is taken as the template frame Q_ref and placed into the subset Q_exh,u1. In the same phase period of the next period, the region of size A x L with upper-left corner at (a_0 + δa, l_0 + δl) in each frame is the detection frame Q_tar. δa and δl, the axial and lateral displacements, are varied over -A/3 to A/3 and -L/3 to L/3 respectively; the δa and δl that maximize the two-dimensional cross-correlation coefficient of Q_tar and Q_ref are the displacements Δa and Δl of Q_tar. Q_tar is shifted by Δa axially and Δl laterally to obtain the displacement-corrected image Q'_tar. All maximum cross-correlation coefficients are compared, and the Q'_tar corresponding to the largest of them is taken as the new Q_ref and appended to the end of the subset Q_exh,u1.
With the new Q_ref the same operation is performed on the next expiration period, and so on until all expiration periods are calibrated and matched. The inspiration periods are treated in the same way, giving the subsets Q_inh,u2 (u2 = 1, 2, …, U_2). The collection of all Q_exh,u1 and Q_inh,u2 (U threads in total) is denoted Q_u.
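The displacement search at the heart of the registration step can be sketched as an exhaustive block match; here a fixed `search` radius in pixels stands in for the ±A/3, ±L/3 ranges of the text, and the test image is synthetic:

```python
import numpy as np

def ncc2d(a, b):
    # 2-D normalized cross-correlation of two equally sized patches.
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d > 0 else 0.0

def find_shift(frame, q_ref, a0, l0, search):
    # Exhaustive block matching: slide a window of q_ref's size within
    # +/- search pixels of (a0, l0) and keep the shift maximizing the NCC.
    A, L = q_ref.shape
    best = (-2.0, 0, 0)
    for da in range(-search, search + 1):
        for dl in range(-search, search + 1):
            a, l = a0 + da, l0 + dl
            if a < 0 or l < 0 or a + A > frame.shape[0] or l + L > frame.shape[1]:
                continue
            c = ncc2d(frame[a:a + A, l:l + L], q_ref)
            if c > best[0]:
                best = (c, da, dl)
    return best  # (max NCC, delta_a, delta_l)

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))
q_ref = img[12:22, 13:21]          # the true patch sits at (12, 13)
c, da, dl = find_shift(img, q_ref, 10, 10, 5)
```

Searching around a nominal position (10, 10) recovers the true offset (2, 3) with NCC ≈ 1 on this synthetic frame.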
Fusion step: the cumulative strain image sequence of each thread of Q_u is computed in parallel, time-domain interpolation is performed on each thread's cumulative strain image sequence, denoising is performed based on strain correlation, and the processed cumulative strains of the threads are fused to obtain the final strain imaging result:
For the image subsets Q_u, thread by thread, the instantaneous displacement distribution between temporally adjacent images is computed with a block-matching algorithm; the instantaneous displacement distributions at the different instants are accumulated to obtain each thread's spatial cumulative displacement D_u; the cumulative strain S_u is obtained by differentiating D_u along the axial direction (the direction of the ultrasound beam).
Time-domain interpolation is performed on each thread's cumulative strain image sequence so that all threads share the same time coordinates. Specifically, let the set of time coordinates of each S_u be t_u (u = 1, 2, …, U), and denote the union of all t_u by t_all; time-domain cubic-spline interpolation is applied to each thread's S_u so that every thread's strain result covers the time-coordinate set t_all.
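The interpolation onto the shared time axis and the final averaging fusion can be sketched together; linear interpolation stands in for the cubic splines of the text, and the two toy threads are illustrative:

```python
import numpy as np

def fuse_threads(strains, times):
    # Put every thread's cumulative-strain series on the shared time axis
    # t_all (union of all thread time sets) and average them per instant.
    # Linear interpolation stands in here for the cubic splines in the text.
    t_all = np.unique(np.concatenate(times))
    on_grid = np.vstack([np.interp(t_all, t, s) for s, t in zip(strains, times)])
    return t_all, on_grid.mean(axis=0)

# Two toy threads whose cumulative strain grows linearly with time.
t1, t2 = np.array([0.0, 1.0, 2.0, 3.0, 4.0]), np.array([0.0, 2.0, 4.0])
s1, s2 = 0.05 * t1, 0.05 * t2
t_all, s_all = fuse_threads([s1, s2], [t1, t2])
```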
A denoising function is designed based on the inter-thread correlation of the cumulative strain; the specific denoising process for the strain distribution is:
At any instant, the cross-correlation coefficients of S_u between threads are computed; if the cross-correlation coefficient of some thread with the other threads is below a threshold γ, the output strain of that thread at the current instant is discarded. The value range of γ is 0.4 to 0.9.
At that instant, a rectangular window R_u of size A_0 x L_0 is centered at pixel (i, j) of the image S_u, A_0 and L_0 being the axial and lateral sizes of the window in pixels. Against the corresponding windows R_v of the strains S_v of the other threads (v ≠ u; v = 1, 2, …, U), the average cross-correlation coefficient is computed as

p̄_u(i, j) = (1/(U − 1)) Σ_{v≠u} COV(R_u, R_v) / (σ_{R_u} σ_{R_v}),

where COV(R_u, R_v) denotes the covariance of R_u and R_v, and σ_{R_u} and σ_{R_v} are the standard deviations of R_u and R_v. The denoising function q_u(i, j) is defined as:
q_u(i, j) = 1 if p̄_u(i, j) ≥ p_th, and q_u(i, j) = exp(−α(p_th − p̄_u(i, j))) if p̄_u(i, j) < p_th,

where α is the attenuation coefficient, with value range 10^−2 to 10^2, and p_th is the correlation-coefficient threshold, with value range 0.4 to 0.9.
The denoising function q_u(i, j) is multiplied with S_u to obtain the denoised strain image S'_u.
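A sketch of the denoising weight for a single pixel; since the original formulas are given only as equation images, the exponential attenuation below p_th is an assumption consistent with the stated roles of α and p_th, not necessarily the patent's exact form:

```python
import numpy as np

def corr(a, b):
    # Normalized cross-correlation of two equally sized windows.
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d > 0 else 0.0

def denoise_weight(S, u, i, j, half, alpha, p_th):
    # Average NCC between thread u's window R_u around (i, j) and the
    # corresponding windows R_v of all other threads; keep the pixel
    # (weight 1) when the average reaches p_th, otherwise attenuate it
    # exponentially (assumed form).
    wins = [Sv[i - half:i + half + 1, j - half:j + half + 1] for Sv in S]
    p_bar = np.mean([corr(wins[u], w) for v, w in enumerate(wins) if v != u])
    return 1.0 if p_bar >= p_th else float(np.exp(-alpha * (p_th - p_bar)))

rng = np.random.default_rng(1)
base = rng.standard_normal((11, 11))
agree = denoise_weight([base, base, base], 0, 5, 5, 2, alpha=1.0, p_th=0.5)
clash = denoise_weight([base, base, -base], 0, 5, 5, 2, alpha=1.0, p_th=0.5)
```

When all threads agree the weight is 1; when one thread is anti-correlated the average falls below p_th and the pixel is attenuated.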
The denoised cumulative strains S'_u of the threads are fused to obtain the final strain imaging result S_all. Specifically, at each time point in t_all, the cumulative strain distributions obtained by the threads are averaged.
It is worth noting that with this multithread strain imaging method based on living-body ultrasound images, the more phases/threads are selected, the higher the accuracy and temporal density of the final strain output and the better its real-time performance; combined with the spatial-correlation denoising of the thread results, the strain imaging algorithm is also more robust. In theory the algorithm can cover all phases of respiratory motion, but in practice it is limited by the finite ultrasound frame rate, and physiological motion is not perfectly periodic, so the motion phases are divided within intervals where the respiratory motion is relatively stable and the number of selectable motion phases is not unlimited. Moreover, since the method essentially processes temporally ordered live images, it is equally applicable to two-dimensional live image sequences with similar characteristics obtained by physical means other than B-mode ultrasound.
It should further be noted that the invention is an image-data-driven, gating-like motion suppression method: without relying on external equipment, it achieves physiological-motion state recognition, physiological-motion frequency identification, and physiological-motion period division, greatly simplifying the monitoring equipment. Because the image sequences of the threads are independent in the computationally intensive stages such as image registration, displacement correction, and strain calculation, a parallel programming structure can markedly improve the computational efficiency of the strain imaging algorithm. After each thread completes its independent strain calculation, fusion yields a temporally dense strain output, which is of high value for real-time monitoring in clinical treatment.
Example 2
In this embodiment, following the method of Embodiment 1, a microwave ablation needle is used to heat the visceral fat of a live pig while B-mode digital ultrasound image signals are acquired, and the thermal strain of the live pig is computed by the B-mode-image-based multithread strain imaging method. The steps are as follows:
a multithread strain imaging device based on living body ultrasonic images is shown in figure 2, a computer 01 controls digital ultrasonic image acquisition of a portable B ultrasonic instrument 02, the digital sampling rate of the B ultrasonic instrument is 40MHz, an adopted imaging probe is a 128-array-element linear array probe 03, and the center frequency of the imaging probe is 10.5 MHz. Under the guidance of a real-time B ultrasonic image presented by a B ultrasonic instrument, adjusting the imaging position and angle of an ultrasonic probe and the parameters of an imaging system to clearly present target area tissues in the center of the B ultrasonic image; the strain generating device is used to form displacement and strain distribution in the tissue, and the strain generating device forms strain by external force action, heating, or the like, and the specific mode is not limited. In the embodiment, displacement and strain distribution are formed by adopting a heating mode, a microwave heater 04 is connected with a microwave ablation needle 05 to serve as a heating source, and the microwave ablation needle is inserted into adipose tissues of a target region in a direction normal to an ultrasonic imaging plane along a wound of the epidermis of a live pig under the guidance of a B ultrasonic image, so that the heating region is in the imaging plane; the imaging depth of the B-mode ultrasound machine was set to 4cm, and a digital ultrasound image sequence of 2048 (axial) × 128 (transverse) pixels per frame was acquired at a frame rate of 50Hz for a total of 20 seconds from the start of heating.
The heating process lasted 20 seconds and N = 1000 frames were collected; M = 250 was chosen to ensure coverage of a complete respiration period. Normalized cross-correlation was used to compute, with frames 1 to M of the B-mode images as template frames I_ref and the remaining N − 1 frames as detection frames I_tar, the cross-correlation-coefficient curves p_n(t).
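The computation of the correlation curves p_n(t) can be sketched as follows; the frame shapes and the three-frame toy sequence are illustrative:

```python
import numpy as np

def correlation_curves(images, M):
    # Row n is p_n(t): the normalized cross-correlation of template frame n
    # (one of the first M frames) against every frame of the sequence.
    N = len(images)
    flat = images.reshape(N, -1).astype(float)
    flat -= flat.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(flat, axis=1)
    return (flat[:M] @ flat.T) / np.outer(norms[:M], norms)

rng = np.random.default_rng(2)
f0 = rng.standard_normal((8, 8))
images = np.stack([f0, f0, -f0])   # toy 3-frame "sequence"
P = correlation_curves(images, M=1)
```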
Within the target frequency band f_1 = 0.1 Hz to f_2 = 5 Hz, the refined spectrum P_n(f) of p_n(t) was obtained; from the amplitude relations of P_n(f), the expiration characteristic image I_RE was determined to be frame 92, the inspiration characteristic image I_RI frame 396, and the respiratory frequency f_res = 0.3325 Hz.
The correlation-coefficient-time curve p_exh(t) corresponding to I_RE was low-pass filtered with an FIR low-pass filter of normalized cutoff frequency 0.02 and order 81, with group-delay compensation, to obtain p'_exh(t). From the peaks and valleys of p'_exh(t), the expiration-state period boundaries t_exh = 0, 2.46, 5.28, 7.94, 10.98, 13.7, 16.58, 19.46, 20 s and the inspiration-state period boundaries t_inh = 0, 1.04, 3.86, 6.6, 9.46, 12.38, 15.18, 18.14, 20 s were obtained, dividing the data into 8 expiration periods and 8 inspiration periods.
In each expiration period of p_exh(t), the upper quartile of all cross-correlation coefficients was set as the threshold, and the template frames corresponding to points above it were placed into I_exh; similarly, in each inspiration period of p_inh(t), the upper quartile of all cross-correlation coefficients was set as the threshold and the template frames above it were placed into I_inh. Each expiration and inspiration period was divided equally into U_1 = U_2 = 4 phase periods, giving the multithread subsets I_exh,u1 (u1 = 1, 2, …, U_1) and I_inh,u2 (u2 = 1, 2, …, U_2). Fig. 3 shows, for the second inspiration period of p_inh(t), the threshold setting, the motion-phase division, and the image selection of each phase.
Centered on the target-region position of the B-mode image, a region of interest of size 561 x 35 pixels was determined. Displacement calibration and image registration were performed on the region of interest of I_exh,u1 and I_inh,u2 to obtain the registered subsets Q_exh,u1 and Q_inh,u2. The registered subset of each of the U = U_1 + U_2 threads is named Q_u (u = 1, 2, …, U_1 + U_2). As shown in Fig. 4, Figs. 4(a)-4(d) show the distribution over p_inh(t) of the 4-thread registered subsets Q_inh,u2 in the inspiration state, and Figs. 4(e)-4(h) the distribution over p_exh(t) of the 4-thread registered subsets Q_exh,u1 in the expiration state.
For the image subsets Q_u, the displacements D_u and strains S_u were computed in parallel. The cumulative thermal strain at t = 20 s is shown in Fig. 5: Figs. 5(a)-5(d) show the cumulative thermal strain distributions of the threads of the inspiration periods and Figs. 5(e)-5(h) those of the threads of the expiration periods. Each thread develops a cumulative thermal strain of about 0.2 with a similar thermal strain distribution, while strain noise appears in the corners of the regions shown by the individual threads.
After time-domain interpolation of each thread's thermal strain, the spatial correlation between the strains was computed to obtain the spatial correlation distribution p̄_u(i, j). As shown in Fig. 6, regions of high gray value indicate high spatial correlation and regions of low gray value low correlation. Based on this distribution the denoising function q_u(i, j) was designed, with p_th = 0.95 and α = 5, and multiplied with the strain distribution S_u of each thread. As shown in Fig. 7, Figs. 7(a)-7(d) are the denoised cumulative thermal strain distributions of the threads of the inspiration periods and Figs. 7(e)-7(h) those of the threads of the expiration periods; it can be seen that the strain noise away from the heat source is effectively suppressed.
Finally, the denoised cumulative strains S'_u of the threads were fused to obtain the final strain imaging result S_all. Fig. 8 shows the thermal strain-time curve at the position of the maximum of the fused thermal strain S_all; the scatter points are the output strain points, with an average output interval of 0.35 s and a maximum output interval of 0.76 s, a very high temporal density.
The invention has been described in detail hereinabove with reference to specific exemplary embodiments thereof. It will, however, be understood that various modifications and changes may be made without departing from the scope of the invention as defined in the appended claims. The detailed description and drawings are to be regarded as illustrative rather than restrictive, and any such modifications and variations are intended to be included within the scope of the present invention as described herein. Furthermore, the background is intended to be illustrative of the state of the art as developed and the meaning of the present technology and is not intended to limit the scope of the invention or the application and field of application of the invention.

Claims (10)

1. A multithreading strain imaging method based on a living body ultrasonic image is characterized by comprising the following steps:
collecting an ultrasonic image of a living object to obtain a digital ultrasonic image sequence I;
a characteristic extraction step: extracting the texture features of the digital ultrasonic image sequence I;
a cycle dividing step: determining characteristic images of the expiration and inspiration states through the texture characteristics of the digital ultrasonic image sequence I, and dividing the digital ultrasonic image sequence I into a plurality of image sequences of expiration periods and a plurality of image sequences of inspiration periods;
a threshold-setting step: setting an expiration texture-feature threshold;
if the texture feature of an image in an expiration period is greater than the expiration texture-feature threshold, placing the image into the expiration image sequence I_exh, and dividing the image sequence of each expiration period by motion phase into U_1 phase periods, U_1 being the number of phase periods, to obtain the multithread subsets I_exh,u1 (u1 = 1, 2, …, U_1); setting an inspiration texture-feature threshold;
if the texture feature of an image in an inspiration period is greater than the inspiration texture-feature threshold, placing the image into the inspiration image sequence I_inh, and dividing the image sequence of each inspiration period into U_2 phase periods, U_2 being the number of phase periods, to obtain the multithread subsets I_inh,u2 (u2 = 1, 2, …, U_2);
a registration step: determining a region of interest centered on the target-region position, and performing displacement calibration and image registration on the region of interest of I_exh,u1 and I_inh,u2 to obtain the registered subsets Q_exh,u1 and Q_inh,u2; the union of Q_exh,u1 and Q_inh,u2 is denoted Q_u (u = 1, 2, …, U), U being the number of threads of Q_u, U = U_1 + U_2;
a fusion step: computing in parallel the cumulative strain image sequence of each thread of Q_u, performing time-domain interpolation on each thread's cumulative strain image sequence, denoising based on strain correlation, and fusing the processed cumulative strains of the threads to obtain the final strain imaging result.
2. The method of claim 1, wherein the texture features in the feature extraction step are correlations between images, and the specific extraction method comprises:
taking the n = 1, 2, …, M-th frames of the N-frame digital ultrasound image sequence I as template frames I_ref, N being the total frame number of the image sequence I and M a positive integer such that the M frames cover at least one complete respiration period, and taking the remaining N − 1 frames as detection frames I_tar; calculating the cross-correlation of I_ref and I_tar to obtain M cross-correlation-coefficient-time curves p_n(t).
3. The method for multi-thread strain imaging based on living body ultrasound images as claimed in claim 1 or 2, wherein the step of dividing the cycle is implemented by:
determining, within the interval of possible physiological motion frequencies, the characteristic images I_RE and I_RI of the expiration and inspiration states and the respiratory frequency f_res from the spectrum P_n(f) of p_n(t), the calculation method being: denoting the maximum spectral amplitude of P_n(f) as A_n; the template frame corresponding to the largest A_n (n = 1, 2, …, M) is I_RE, whose corresponding p_n(t) is denoted p_exh(t); after low-pass filtering p_exh(t), the detection frame corresponding to the minimum among the first M points on the curve is I_RI, and the p_n(t) obtained with I_RI as template frame is denoted p_inh(t); the frequency at which the spectral amplitude of p_exh(t) is maximal is f_res; filtering the correlation-coefficient-time curve p_exh(t) corresponding to I_RE to obtain the smoothed curve p'_exh(t), and taking the instants of all its peaks as inspiration-period boundaries and the instants of its valleys as expiration-period boundaries, thereby dividing the digital ultrasound image sequence I into a plurality of expiration-period image sequences and a plurality of inspiration-period image sequences.
4. The live-ultrasound-image-based multithread strain imaging method of claim 3, wherein the method of selecting the peaks and valleys of p'_exh(t) is: setting the expiration characteristic image I_RE as an anchor point, designing a search step Δt from the respiratory frequency f_res with 1/f_res > Δt > 0.5/f_res, and, moving from the anchor point in the directions of increasing and decreasing time, alternately searching p'_exh(t) for valleys and peaks.
5. The method of claim 1, wherein in the threshold-setting step the motion phases of the image sequence of each expiration period are divided either at equal time intervals or by setting amplitude intervals of the correlation coefficient.
6. The method of claim 1, wherein the threshold setting step is implemented by:
in each expiration period of p_exh(t), setting the upper quartile of all cross-correlation coefficients as the expiration texture-feature threshold, and for the points of p_exh(t) above this threshold placing the corresponding images into the expiration image sequence I_exh;
dividing the image sequence of each expiration period into U_1 phase periods, U_1 being the number of phase periods, to obtain the multithread subsets I_exh,u1 (u1 = 1, 2, …, U_1);
in each inspiration period of p_inh(t), setting the upper quartile of all cross-correlation coefficients as the inspiration texture-feature threshold, and for the points of p_inh(t) above this threshold placing the corresponding images into the inspiration image sequence I_inh;
dividing the image sequence of each inspiration period into U_2 phase periods, U_2 being the number of phase periods, to obtain the multithread subsets I_inh,u2 (u2 = 1, 2, …, U_2).
7. The method of claim 1, wherein the registering step is implemented as follows:
determining, centered on the target-region position, a region of interest of size A x L with upper-left corner at coordinates (a_0, l_0), A being the axial length and L the lateral length;
in each phase period (u1 = 1, 2, …, U_1) of the first expiration period, taking the ROI of the image at the middle instant as the template frame Q_ref and placing it into the subset Q_exh,u1; in the same phase period of the next period, the region of size A x L with upper-left corner at (a_0 + δa, l_0 + δl) in each frame is the detection frame Q_tar; varying δa and δl, the axial and lateral displacements, over −A/3 to A/3 and −L/3 to L/3 respectively, the δa and δl maximizing the two-dimensional cross-correlation coefficient of Q_tar and Q_ref are the displacements Δa and Δl of Q_tar; shifting Q_tar by Δa axially and Δl laterally gives the displacement-corrected image Q'_tar; comparing all maximum cross-correlation coefficients, the Q'_tar corresponding to the largest of them is taken as the new Q_ref and appended to the end of the subset Q_exh,u1; with the new Q_ref the same operation is performed on the next expiration period, and so on until all expiration periods are calibrated and matched;
in each phase period (u2 = 1, 2, …, U_2) of the first inspiration period, taking the ROI of the image at the middle instant as the template frame Q_ref and placing it into the subset Q_inh,u2; in the same phase period of the next period, the region of size A x L with upper-left corner at (a_0 + δa, l_0 + δl) in each frame is the detection frame Q_tar; varying δa and δl over −A/3 to A/3 and −L/3 to L/3 respectively, the δa and δl maximizing the two-dimensional cross-correlation coefficient of Q_tar and Q_ref are the displacements Δa and Δl of Q_tar; shifting Q_tar by Δa axially and Δl laterally gives the displacement-corrected image Q'_tar; comparing all maximum cross-correlation coefficients, the Q'_tar corresponding to the largest of them is taken as the new Q_ref and appended to the end of the subset Q_inh,u2; with the new Q_ref the same operation is performed on the next inspiration period, and so on until all inspiration periods are calibrated and matched.
8. The method of claim 1 or 7, wherein the fusion step is implemented as follows:
computing, for the image subsets Q_u, thread by thread, the instantaneous displacement distribution between temporally adjacent images with a block-matching algorithm; accumulating the instantaneous displacement distributions at the different instants to obtain each thread's spatial cumulative displacement D_u; obtaining the cumulative strain S_u by differentiating D_u along the axial direction;
letting the set of time coordinates of each S_u be t_u (u = 1, 2, …, U) and denoting the union of all t_u by t_all, applying time-domain cubic-spline interpolation to each thread's S_u so that every thread's strain result covers the time-coordinate set t_all;
designing a denoising function based on the inter-thread correlation of the cumulative strain and denoising the strain distribution;
at each time point in t_all, averaging the denoised cumulative strains S'_u of the threads to obtain the final strain imaging result S_all.
9. The live-ultrasound-image-based multithread strain imaging method of claim 8, wherein the specific denoising process is:
at any instant, computing the cross-correlation coefficients of S_u between threads; if the cross-correlation coefficient of some thread with the other threads is below a threshold γ, the output strain of that thread at the current instant is discarded, γ ranging from 0.4 to 0.9;
at said instant, a rectangular window R_u of size A_0 x L_0 is centered at pixel (i, j) of the image S_u, A_0 and L_0 being the axial and lateral sizes of the window in pixels; against the corresponding windows R_v of the strains S_v of the other threads (v ≠ u; v = 1, 2, …, U), the average cross-correlation coefficient is computed as
p̄_u(i, j) = (1/(U − 1)) Σ_{v≠u} COV(R_u, R_v) / (σ_{R_u} σ_{R_v}),
where COV(R_u, R_v) denotes the covariance of R_u and R_v, and σ_{R_u} and σ_{R_v} are the standard deviations of R_u and R_v;
the denoising function q_u(i, j) is defined as
q_u(i, j) = 1 if p̄_u(i, j) ≥ p_th, and q_u(i, j) = exp(−α(p_th − p̄_u(i, j))) if p̄_u(i, j) < p_th,
where α is the attenuation coefficient, with value range 10^−2 to 10^2, and p_th is the correlation-coefficient threshold, with value range 0.4 to 0.9;
multiplying the denoising function q_u(i, j) with S_u to obtain the denoised strain image S'_u.
10. A multithread strain imaging apparatus based on ultrasound images of a living body, comprising: a computer, a B-ultrasonic instrument, a linear-array probe, and a strain generating device;
the B ultrasonic instrument is connected with the linear array probe and is used for acquiring ultrasonic images of the living body object;
strain generating means for creating a displacement and strain distribution in the tissue;
a computer connected to the B-ultrasonic apparatus and the strain generating device, respectively, for controlling the B-ultrasonic apparatus and the strain generating device, configured to implement a method of multi-threaded strain imaging based on live ultrasound images as claimed in any one of claims 1 to 9.
CN202210003911.4A 2022-01-04 2022-01-04 Multithreading strain imaging method and device based on living body ultrasonic image Pending CN114240815A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210003911.4A CN114240815A (en) 2022-01-04 2022-01-04 Multithreading strain imaging method and device based on living body ultrasonic image


Publications (1)

Publication Number Publication Date
CN114240815A true CN114240815A (en) 2022-03-25


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106056589A (en) * 2016-05-24 2016-10-26 西安交通大学 Ultrasound contrast perfusion parameter imaging method based on respiratory motion compensation
CN109615677A (en) * 2019-02-13 2019-04-12 南京广慈医疗科技有限公司 A method of thermal strain distribution is calculated based on low sampling rate B ultrasound image
EP3508132A1 (en) * 2018-01-04 2019-07-10 Koninklijke Philips N.V. Ultrasound system and method for correcting motion-induced misalignment in image fusion



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination