CN104739442B - Compression elastography displacement detection method and device, and ultrasonic imaging apparatus - Google Patents


Info

Publication number
CN104739442B
CN104739442B (granted from application CN201310726398.2A)
Authority
CN
China
Prior art keywords: point, node, Nth row, search, row
Prior art date
Legal status: Active
Application number
CN201310726398.2A
Other languages
Chinese (zh)
Other versions
CN104739442A (en)
Inventor
袁宇辰
樊睿
李双双
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to CN201310726398.2A
Priority to PCT/CN2014/077327 (WO2015096353A1)
Publication of CN104739442A
Application granted
Publication of CN104739442B


Classifications

    • A61B 8/485 — Diagnosis using ultrasonic waves: diagnostic techniques involving measuring strain or elastic properties
    • G01S 7/52042 — Details of receivers using analysis of echo signal for target characterisation, determining elastic properties of the propagation medium or of the reflective target
    • G06T 7/0012 — Image analysis: biomedical image inspection
    • G06T 7/32 — Determination of transform parameters for image registration using correlation-based methods
    • G06T 2207/10132 — Image acquisition modality: ultrasound image
    • G06T 2207/20021 — Dividing image into blocks, subimages or windows
    • G06T 2207/20072 — Graph-based image processing
    • G06T 2207/30004 — Biomedical image processing

Abstract

This application discloses a displacement detection method for elastography. When detecting the displacement of a node in the target image data, the node with the highest matching degree in the previous row is used as the guide point for that row's nodes: the guide point's offset determines the node's initial offset, and the match point is searched for within a region determined from the guide point, thereby reducing the cumulative effect of computation errors. A displacement detection device for compression elastography suitable for this method, and an ultrasonic imaging apparatus, are also disclosed.

Description

Compression elastography displacement detection method and device, and ultrasonic imaging apparatus
Technical field
The present application relates to medical devices, and in particular to a compression elastography displacement detection method and device, and to an ultrasonic imaging apparatus.
Background technology
Medical ultrasound elastography refers to a family of imaging and signal processing techniques aimed at displaying differences in tissue elasticity. The main existing categories include compression elastography, acoustic radiation force imaging (Acoustic Radiation Force Imaging, ARFI), and shear wave elastography (Shear Wave Elastography, SWE). Among these, compression elastography has developed for the longest time and its technology is the most mature. As an important supplement to B-mode ultrasound in cancer detection, especially in differentiating benign from malignant breast lesions, compression elastography has quickly entered clinical use.
In compression elastography, pressure is applied to the target tissue with a hand-held ultrasound probe, and two frames of ultrasonic echo information are acquired from the target tissue before and after compression. A specific algorithm then computes the displacement that occurs at corresponding positions between the two frames, i.e. the change in spatial position of the target tissue between the two instants. Taking the axial gradient of the displacement yields the strain value at each point of the target tissue region. Under the same external compression, a larger strain indicates softer tissue and a smaller strain indicates harder tissue. Displaying the strain values of the target tissue region as an image intuitively reflects the differences in stiffness, or elasticity, between different tissues.
In the above processing, whether the displacement detection is accurate and fast jointly affects the contrast-to-noise ratio (CNR) of the final strain image, the real-time performance of imaging, the clinical frame rate, and so on. Chinese patent application No. 201110159110.9, "Displacement detection method, apparatus and system in elastography", proposed the Guided Phase Zero Estimation (GPZE) algorithm. On the one hand, this GPZE displacement detection algorithm uses the best preset displacement result to guide the displacement calculation of the next row, reducing the search volume; on the other hand, it computes displacement by phase estimation, which places low demands on the raw data sampling rate and greatly reduces the amount of computation. However, the GPZE algorithm still has the following shortcoming:
When computing one frame of the elastic image, the existing GPZE algorithm guides the calculation row by row with an overly fixed guide position. If a computation error at a point in the previous row produces a bad point, the guidance causes the error to propagate to the same position in every following row, which appears as a longitudinal line artifact in the image.
Summary of the invention
The present application provides a compression elastography displacement detection method and device, and an ultrasonic imaging apparatus, which reduce the probability of a guide point propagating errors downward.
According to a first aspect, the application provides a compression elastography displacement detection method, including:
obtaining two frames of image data, used respectively as the target image data before compression and the matched image data after compression;
taking the nodes of the first row of the target image data as target points, and finding the corresponding match point of each in the matched image data;
calculating the displacement result of each node of the first row from the first-row target points and their match points;
calculating the displacement results of the nodes of the Nth row of the target image data, where N runs over the integers from 2 to n and n is the number of rows into which a frame is divided, including:
finding the node with the highest matching degree among the matching results of the nodes of row N−1 and using it as the row initial guide point;
searching for the match point of each node of the Nth row in the matched image data based on the row initial guide point;
calculating the displacement result of each node of the Nth row from the Nth-row nodes and their match points.
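The row-by-row guided procedure above can be sketched as a short driver loop. This is a hypothetical sketch, not the patent's implementation: the `match_node` callback (standing in for the actual block search) and the array shapes are assumptions.

```python
import numpy as np

def guided_displacement_detection(match_node, n_rows, n_cols):
    """Sketch of the claimed method. `match_node(r, c, center)` is an assumed
    callback that searches the matched frame around `center` and returns
    ((dy, dx), matching_degree) for node (r, c) of the target frame."""
    offsets = np.zeros((n_rows, n_cols, 2))  # per-node (dy, dx) displacement
    quality = np.zeros((n_rows, n_cols))     # per-node matching degree

    # First row: match every node with no guidance.
    for c in range(n_cols):
        offsets[0, c], quality[0, c] = match_node(0, c, center=(0.0, 0.0))

    # Rows 2..n: guide each row by the BEST-matched node of the previous row
    # (rather than by a fixed guide position, the stated improvement over GPZE).
    for r in range(1, n_rows):
        guide_c = int(np.argmax(quality[r - 1]))       # row initial guide point
        guide_offset = tuple(offsets[r - 1, guide_c])  # its displacement offset
        for c in range(n_cols):
            offsets[r, c], quality[r, c] = match_node(r, c, center=guide_offset)
    return offsets
```

In the full method the guide point is further re-selected per node once the first target point of the row is matched; the loop above applies only the row initial guide point named in the steps.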
According to a second aspect, the application provides a displacement detection device for compression elastography, including:
an image acquisition module for obtaining two frames of image data, used respectively as the target image data before compression and the matched image data after compression;
a first matching module for taking the nodes of the first row of the target image data as target points and finding the corresponding match point of each in the matched image data;
a search module for finding the node with the highest matching degree among the matching results of the nodes of row N−1 and using it as the row initial guide point, where N is the current row whose match points are to be found, N runs over the integers from 2 to n, and n is the number of rows into which a frame is divided;
a second matching module for searching, based on the row initial guide point, for the match point of each node of the Nth row in the matched image data;
a displacement calculation module for calculating the displacement result of each node of the first row from the first-row target points and their match points, and the displacement result of each node of the Nth row from the Nth-row nodes and their match points.
According to a third aspect, the application provides an ultrasonic imaging apparatus, including:
a probe for transmitting ultrasonic waves toward a scan target and receiving ultrasonic echoes;
a signal processor for processing the ultrasonic echoes and generating ultrasound image data;
an image processor for processing the ultrasound image data and generating an elastic image, the image processor including the displacement detection device described above and an elastic image generation device that generates the elastic image based on the displacement results of the nodes detected by the displacement detection device.
The beneficial effect of the application is that the guidance scheme is improved: the existing fixed guide position is replaced by using the node with the highest matching degree as the guide point. This reduces the probability of the guide point propagating errors downward, improves the reliability of the guidance, and avoids the cumulative effect of computation errors.
Brief description of the drawings
Fig. 1 is a structural diagram of the ultrasonic imaging apparatus of an embodiment of the application;
Fig. 2 is a structural diagram of the displacement detection device of an embodiment;
Fig. 3 is a structural diagram of the second matching module of an embodiment;
Fig. 4 is a flowchart of displacement detection for each node in an embodiment;
Fig. 5 is a schematic diagram of image data block division in an embodiment;
Fig. 6 is a flowchart of searching for the match point of each node in an embodiment;
Fig. 7 is a schematic diagram of one strategy for determining the search region in an embodiment.
Specific embodiment
The present invention is described in further detail below through specific embodiments in combination with the accompanying drawings.
Medical elastography refers to a family of imaging and signal processing techniques aimed at displaying differences in tissue elasticity. Taking medical ultrasound imaging as an example, refer to Fig. 1, which shows the structure of an ultrasonic imaging apparatus, including a probe 1, a signal processor 2, an image processor 3 and a display 4. Specifically:
The probe 1 transmits ultrasonic waves toward the scan target and receives ultrasonic echoes. An ultrasonic output circuit 11 generates waveform data and drives the array elements of the probe 1 through a transmission channel 12 to transmit ultrasonic waves into the examined tissue. The ultrasonic waves are reflected and absorbed by the tissue, forming ultrasonic echoes; the probe 1 receives the echoes and outputs them to the signal processor 2 through a receiving channel 13.
The signal processor 2 processes the ultrasonic echoes and generates ultrasound image data. It first passes the ultrasonic echoes received by the receiving channel 13 through a beamforming stage to obtain radio frequency (RF) signals, and then obtains baseband signals through quadrature demodulation. During processing, the RF signal after beamforming may also be upsampled to increase its sampling rate, and then downsampled after quadrature demodulation. Upsampling can increase the precision of displacement detection; the upsampling rate is preset by the system. The processed ultrasound image data is output to the image processor.
The image processor 3 processes the ultrasound image data and generates the elastic image. It includes a displacement detection device and an elastic image generation device: the displacement detection device processes the ultrasound image data output by the signal processor 2 to obtain the displacement result of each node, and the elastic image generation device then generates the elastic image based on those displacement results.
The display 4 displays the elastic image generated by the image processor 3.
The improvement of the application lies in the displacement detection device in the image processor 3. Fig. 2 shows its structure, which includes an image acquisition module 301, a first matching module 302, a search module 303, a second matching module 304 and a displacement calculation module 305. The image acquisition module 301 obtains two frames of image data, used respectively as the target image data before compression and the matched image data after compression. The first matching module 302 takes the nodes of the first row of the target image data as target points and finds the corresponding match point of each in the matched image data. The search module 303 finds the node with the highest matching degree among the matching results of the nodes of row N−1 and uses it as the row initial guide point, where N is the current row whose match points are to be found, N runs over the integers from 2 to n, and n is the number of rows into which a frame is divided. The second matching module 304 searches, based on the row initial guide point, for the match point of each node of the Nth row in the matched image data. The displacement calculation module 305 calculates the displacement result of each node of the first row from the first-row target points and their match points, and the displacement result of each node of the Nth row from the Nth-row nodes and their match points. In one specific example, the displacement calculation module 305 includes a cross-correlation phase calculation unit and a displacement calculation unit: the cross-correlation phase calculation unit calculates the cross-correlation phase of a node based on the ultrasonic echo radio frequency signals of the node and its match point, and the displacement calculation unit calculates the final displacement result of the node based on that cross-correlation phase.
In one embodiment, the structure of the second matching module 304 is as shown in Fig. 3, including a first target point determining unit 341, a first offset acquiring unit 342, a first search region determining unit 343, a first search unit 344, a second target point determining unit 345, a guide point determining unit 346, a second offset acquiring unit 347, a second search region determining unit 348 and a second search unit 349. The first target point determining unit 341 determines the first target point of the Nth row of the target image data based on the node with the highest matching degree in row N−1. The first offset acquiring unit 342 obtains the displacement offset of the initial guide point. The first search region determining unit 343 uses the displacement offset of the row initial guide point as the initial offset of the Nth-row first target point; in the matched image data, it takes the position of the first target point shifted by the initial offset as the core search position and determines the search region based on that position. The first search unit 344 searches the search region for the match point of the Nth-row first target point of the target image data. After the match point of the first target point is found, the second target point determining unit 345 successively takes the nodes on both sides of the first target point in the same row as target points. The guide point determining unit 346 selects, among the nodes around a target point whose displacement offsets have already been calculated, the node with the highest matching degree as the guide point. The second offset acquiring unit 347 obtains the displacement offset of the guide point. The second search region determining unit 348 uses the displacement offset of the guide point as the initial offset of the target point; in the matched image data, it takes the position of the target point shifted by the initial offset as the core search position and determines the search region based on that position. The second search unit 349 matches the target point of the target image data within the search region and obtains the match point of the Nth-row target point.
This embodiment also discloses a displacement detection method for compression elastography, applied to the ultrasonic imaging apparatus described above, in particular to the displacement detection device in the image processor 3. The idea of the method is: during displacement detection, when the displacement results are computed from the two frames of image data before and after compression with the Guided Phase Zero Estimation (GPZE) algorithm, the node with the highest matching degree is used as the initial guide point. For the GPZE algorithm, refer to Chinese patent application No. 201110159110.9, "Displacement detection method, apparatus and system in elastography".
The main idea of the GPZE algorithm is to use guided search to detect the phase of the cross-correlation function between the two frames of signals before and after compression more quickly and accurately, and to derive the correspondence between that phase and the longitudinal displacement, so that the longitudinal displacement between the two frames can be calculated. This guarantees the quality of the displacement estimation while greatly reducing the amount of computation. In addition, the range of displacement detection is expanded: the GPZE algorithm is applicable not only to small displacements but also to large displacements.
Assume that the target image before compression and the matched image signal after compression are expressed respectively as:

f_u(t, x) = A(t, x)·cos(ω_c·t + θ)

f_c(t, x) = A(t − u_y, x − u_x)·cos(ω_c·(t − u_y) + θ)

where ω_c is the signal center angular frequency, θ is the signal initial phase, and T_c is the signal period, corresponding to the center frequency ω_c. u_x is the lateral spatial offset caused by compression and u_y is the longitudinal offset caused by compression. u_y can always be expressed in the form u_y = τ + n·T_c/2, where n is an integer and τ always lies in the range −T_c/2 to T_c/2.
After quadrature demodulation the signals become baseband signals, whose complex form can be expressed as:

f_ub(t, x) = A_u(t, x)·e^(i·φ_u(t, x))

or,

S_u = I_u + i·Q_u
S_c = I_c + i·Q_c
Then, the cross-correlation function between the baseband signals can be expressed as:

R_b(u_0, x_0) = Σ_(t,x) S_u(t, x)·S_c*(t + u_0, x + x_0)

where u_0 and x_0 represent the longitudinal and lateral offsets between the two frames of target data when the cross-correlation function is computed, and R_eb is the cross-correlation function between the envelope signals of the two frames of data.
It suffices to find the u_0 and x_0 that maximize the envelope cross-correlation function R_eb(u_0, x_0): these are the longitudinal and lateral offsets between the two frames of envelope data. Using the longitudinal and lateral offsets, the match point in the compressed image of a target point in the pre-compression target image can be found; the cross-correlation phase can then be computed from the target point and its match point, and the final displacement result can be further calculated.
The displacement detection process for each node is shown in Fig. 4 and includes the following steps:
Step 410. Obtain image data
The displacement detection of this embodiment is computed on the acquired data. Two frames of image data are obtained, used respectively as the target image data before compression and the matched image data after compression.
Specifically, a pair of I/Q baseband signal frame data are obtained:
Su=Iu+iQu
Sc=Ic+iQc
where S_u is the target image data before compression and S_c is the matched image data after compression; I_u and Q_u, and I_c and Q_c, are respectively the signal components of the target image data and the matched image data.
The calculation of each frame's displacement result requires two frames of I/Q baseband signal data, and the resulting displacement refers to the relative spatial displacement between the two frames of signals. The two frames of baseband signal data may be two consecutive frames, or two frames separated by a number of frames; the frame interval is preset by the system or determined according to the user's selection. Using two frames with a certain interval effectively adjusts the amount of displacement between the two frames used for calculation, so that the quality of the final strain image is better.
Step 420. Find the match points of the nodes of the target image data
The nodes of the first row of the target image data obtained in step 410 are taken as target points, and the corresponding match point of each is found in the matched image data. For the nodes of every other row of the target image data, the node with the highest matching degree among the nodes of the previous row is taken as the initial guide point, and the match points of the nodes of the current row are then found in the matched image data based on that initial guide point.
Those skilled in the art will appreciate that even after downsampling, the sampling rate of the baseband signal is still high, while displacements are typically small, so the displacement difference between neighbouring sample points is very small. To reduce the amount of computation, the positions of the displacement estimation points (nodes) are divided in advance: for example, the image can be divided at equal intervals into several contiguous and non-overlapping blocks, with each block serving as a node for matching and displacement detection, which effectively avoids or reduces redundant computation. Fig. 5 is a schematic diagram of the gridding of the image data in this embodiment, divided into n rows in total. The black dots (i.e. the grid nodes) are the positions where displacement estimation may be performed, and the black lines represent baseband signal data or envelope data; the grid division of the two frames of signals is based on the positions of the data sampling points in the target image frame.
Longitudinally, starting from the data at the shallowest depth, a point is taken every certain number of data sampling points (or every certain depth); these positions are the nodes, or target points, at which displacement estimation is to be performed, and the longitudinal interval is preset by the system.
Laterally, starting from the scan line at the probe center, a point is taken every certain number of sampling lines (or every certain width); these positions are the nodes, or target points, at which displacement estimation is to be performed, and the lateral interval is preset by the system.
After the grid is divided, the amount of computation required to obtain the strain image is greatly reduced, especially when the lateral and longitudinal intervals are large. However, an interval that is too large has some impact on image quality, affecting the displacement estimation results and the spatial resolution of the final image.
The match point of a node of the target image data in the matched image data is the node most correlated with it: a correlation algorithm can be used to compute the most correlated node in the post-compression image, which is then used as the match point of the target point.
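As one illustration of "finding the most correlated node by a correlation algorithm", the sketch below does exhaustive block matching with a normalized cross-correlation score over a small neighbourhood; the window size and search radius are assumptions, not values from the patent.

```python
import numpy as np

def find_match_point(target_block, matched_frame, center_rc, search=4):
    """Slide `target_block` over a (2*search+1)^2 neighbourhood of `center_rc`
    in the matched frame; return the offset with the highest normalized
    correlation together with that score (a stand-in for 'matching degree')."""
    h, w = target_block.shape
    cr, cc = center_rc
    energy_t = np.sum(target_block ** 2)
    best_score, best_offset = -np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r0, c0 = cr + dr, cc + dc
            if r0 < 0 or c0 < 0:
                continue  # candidate window would fall outside the frame
            cand = matched_frame[r0:r0 + h, c0:c0 + w]
            if cand.shape != (h, w):
                continue
            score = np.sum(target_block * cand) / (
                np.sqrt(energy_t * np.sum(cand ** 2)) + 1e-12)
            if score > best_score:
                best_score, best_offset = score, (dr, dc)
    return best_offset, best_score
```

The returned offset is the (longitudinal, lateral) displacement of the node, and the score plays the role the text assigns to the matching degree.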
Step 430. Calculate the cross-correlation phase
After the match point of a node of the target image data is found, the cross-correlation phase of the node and its match point can be calculated. The phase calculation uses the I and Q data.
The parameters I and Q at the corresponding positions of the target image and the matched image are taken out, and the phase is obtained as:

φ = arctan( Σ_(i,j) [Q_u(i, j)·I_c(i, j) − I_u(i, j)·Q_c(i, j)] / Σ_(i,j) [I_u(i, j)·I_c(i, j) + Q_u(i, j)·Q_c(i, j)] )
where (i, j) denotes the relative coordinate, or position, of a data point; the same (i, j) denotes the same relative position in the two frames of data. For example, if (0, 0) denotes the lower-left corner point of the kernel data in the target image, then (0, 0) also denotes the lower-left corner point of the kernel data in the matched image.
In the above formula, the computation of n is related to the final displacement estimation result of the previous-row node. Assuming the final longitudinal displacement result of the previous-row node is u_y, then:

n = round(2·u_y / T_c)

where round denotes rounding to the nearest integer (rounding up or rounding down may also be used).
Further, since the range of the results of the arctan function lies in (−π/2, π/2), the results must also be mapped into the range −π to π according to the signs of the numerator and denominator in the above formula. By the phase distribution rule of trigonometric functions, the sign of the numerator corresponds to the sign of sin φ, and the sign of the denominator corresponds to the sign of cos φ.
Step 440. Calculate the displacement result
After the above calculations, the displacement result of the current node (or displacement estimation point) is:

u_y = φ·T_c / (2π) + n·T_c/2

where T_c is the signal period, corresponding to the signal center angular frequency ω_c. The introduction of the n·T_c/2 term compensates for the aliasing of the phase calculation, so that the algorithm is applicable not only to small displacements but also to large displacements.
The above displacement result is expressed in units of sampling time; it can also be converted into physical length units, the two being in one-to-one correspondence.
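A minimal sketch of steps 430–440 for a single node, assuming the phase is taken as the angle of the summed conjugate product of the two I/Q kernels (the function name and the sign convention here are assumptions):

```python
import numpy as np

def node_displacement(Iu, Qu, Ic, Qc, Tc, n):
    """Phase-based displacement for one node's kernel data. Returns a time-unit
    displacement: the phase of sum(S_u * conj(S_c)) mapped to (-pi, pi],
    converted to a time shift, then de-aliased by n*Tc/2."""
    Su = Iu + 1j * Qu
    Sc = Ic + 1j * Qc
    corr = np.sum(Su * np.conj(Sc))
    # atan2 performs the quadrant correction the text describes: the signs of
    # the numerator (imaginary part) and denominator (real part) pick the
    # quadrant, extending arctan's (-pi/2, pi/2) range to (-pi, pi].
    phi = np.arctan2(corr.imag, corr.real)
    tau = phi * Tc / (2 * np.pi)  # time shift within one half period
    return tau + n * Tc / 2       # n*Tc/2 compensates phase aliasing
```

For a pure tone delayed by 0.1 period the function recovers 0.1·T_c with n = 0; larger displacements rely on the integer n supplied from the previous row's estimate.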
In this embodiment, in step 420, when the match points of the nodes of the target image are searched for in the matched image, the method of finding the match points of the current row's nodes based on an initial guide point is used. Referring to Fig. 6, it includes the following steps:
Step 421. Search for the match points of the first-row target points
The nodes of the first row of the target image data are taken as target points, and the corresponding match point of each is found in the matched image data. Each frame is divided into a grid of m columns and n rows, each cell being a node, and block matching is used when matching a target point. The match point is the node most correlated with the target point; at the same time, the displacement offset of the target point and its match point, i.e. the change in position coordinates between them, is also obtained.
After the match points of the first-row target points are determined, on the one hand step 430 is performed to calculate the cross-correlation phase from the target points and their match points; on the other hand, the following steps are performed to search for the match points of the nodes of the second and subsequent rows of the target image data.
Step 422. Determine the row initial guide point of the Nth row
From the second row onward, the match points of the target points are searched for using the guide point method. Suppose the match points of the Nth-row nodes are to be found, where N runs over the integers from 2 to n and n is the number of rows into which a frame is divided. At this point every node of row N−1 has already found its match point, so the node with the highest matching degree is found among the matching results of the nodes of row N−1 and used as the row initial guide point of the Nth row, and the match points of the Nth-row nodes are searched for in the matched image data based on the row initial guide point. In one specific example, the quality factor (QF) of a node can be used to judge its matching degree: the higher the quality factor, the higher the matching degree.
In one embodiment, the quality factor can be computed from S_u and S_c using a statistical correlation coefficient; sum of absolute differences (Sum-Absolute Difference, SAD), sum of squared differences (Sum-Square Difference, SSD) and the like can also be used. One preferred calculation is:

QF = |Σ S_u·S_c*| / sqrt( (Σ |S_u|²)·(Σ |S_c|²) )

where * denotes complex conjugation. The value of the quality factor QF lies in [0, 1]; in practice it can be quantized to another numerical range for judgment according to convention, for example multiplied by 100 and rounded so that the quality factor becomes an integer between 0 and 100.
In other specific examples, the matching degree of a node can also be judged in other ways, for example by the size of the average strain or by the spatial position of the node.
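Reading the quality factor as the magnitude of the normalized complex correlation coefficient between the two kernels (one plausible interpretation of the preferred calculation; the exact expression is an assumption), a sketch:

```python
import numpy as np

def quality_factor(Su, Sc):
    """|normalized complex correlation| of two I/Q kernels, in [0, 1]."""
    num = np.abs(np.sum(Su * np.conj(Sc)))
    den = np.sqrt(np.sum(np.abs(Su) ** 2) * np.sum(np.abs(Sc) ** 2))
    return num / den if den > 0 else 0.0

def quality_factor_percent(Su, Sc):
    """The quantization mentioned in the text: scale to an integer in 0..100."""
    return int(round(100 * quality_factor(Su, Sc)))
```

A perfect match yields QF = 1, and uncorrelated kernels yield values near 0, so the row initial guide point is simply the previous-row node with the largest QF.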
The node with the highest matching degree in row N−1 is used as the initial guide point, so as to determine the first target point of the Nth row of the target image data. The so-called first target point is, among all the nodes of the Nth row, the first node to be used as a target point to search for its match point in the matched image data.
Step 423. determines the first object point of Nth row
After the node with the highest matching degree in the (N-1)th row is determined as the initial guide point, a point close to the initial guide point is generally chosen in the Nth row as the first target point. The preferred approach is to choose the node of the Nth row in the same column as the initial guide point, because adjacent points in the same column influence each other most strongly; for example, if the initial guide point has coordinates (t-1, x), then (t, x) may be selected as the first target point. The first target point may also be a node of the Nth row in a column adjacent to the initial guide point, but the drawback of this approach is that when the initial guide point lies in the first or last column, the Nth row may be unable to select a first target point: for example, if the node in the column to the left of the initial guide point is chosen as the first target point, no first target point can be selected in the Nth row when the initial guide point lies in the first column; likewise, if the node in the column to the right is chosen, none can be selected when the initial guide point lies in the last column.
Step 424. Determine the search region of the first target point
The guiding idea of this application is to determine the search region of the current row's first target point from the displacement offset of the node with the highest matching degree in the previous row. Therefore, the displacement offset of the row initial guide point is obtained first. In the present embodiment, the longitudinal and transverse offsets of the row initial guide point are obtained separately; in other embodiments, only the longitudinal offset of the initial guide point may be obtained. Suppose the coordinates of the first target point are (t, x), and the longitudinal and transverse offsets of the initial guide point are u0 and x0 respectively. The initial offset of the first target point is then set to u0 and x0 (longitudinal and transverse), i.e. the core search position of the first target point's match point in the matched image data is (t+u0, x+x0).
Once the initial displacement offset of the first target point yields its core search position, the search region can be determined. In the matched image frame data, a block of data around (or centered on) the core search position is taken out for the calculation.
As shown in Figure 7, suppose the core search position of the first target point in the matched image frame data is (t+u0, x+x0). The size of the data block is preset by the system, for example x1 to the left and x2 to the right of the core search position in the transverse direction, and u1 upward and u2 downward in the longitudinal direction. Once set, the search region of the first target point in the matched image frame data is determined by a longitudinal interval and a transverse interval, where the longitudinal interval is [t+u0-u1, t+u0+u2] and the transverse interval is [x+x0-x1, x+x0+x2], e.g. the region framed by the black lines in Figure 7.
According to clinical experience, the deeper the target tissue, the smaller its elastic coefficient, i.e. the softer it is, and the larger the longitudinal displacement offset it exhibits. The present embodiment therefore also discloses another scheme for determining the search region: the transverse interval is the same as in the scheme above, but in the longitudinal direction no upward search range is set. That is, x1 is set to the left and x2 to the right of the core search position in the transverse direction, and u3 downward in the longitudinal direction. Once set, the search region of the first target point in the matched image frame data is: longitudinal interval [t+u0, t+u0+u3], transverse interval [x+x0-x1, x+x0+x2].
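The two search-region schemes above can be sketched as follows, assuming integer grid coordinates; the function name and parameter names (`u1`, `u2`, `u3`, mirroring the text) are illustrative:

```python
def search_region(t, x, u0, x0, x1, x2, u1=None, u2=None, u3=None):
    """Search-region intervals around the core search position (t+u0, x+x0),
    following the two schemes described in the text (illustrative sketch).
    If u3 is given, the downward-only longitudinal scheme is used."""
    cy, cx = t + u0, x + x0          # core search position
    horiz = (cx - x1, cx + x2)       # transverse interval
    if u3 is not None:
        vert = (cy, cy + u3)         # no upward search range
    else:
        vert = (cy - u1, cy + u2)    # symmetric scheme with u1 up, u2 down
    return vert, horiz

vert, horiz = search_region(t=10, x=5, u0=2, x0=1, x1=2, x2=2, u1=1, u2=3)
print(vert, horiz)  # -> (11, 15) (4, 8)
```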
The smaller the search region is set, the smaller the amount of calculation.
Step 425. Search for the match point of the first target point
The point of maximum correlation within the search region is taken as the match point. Based on the idea of block matching, the position in the matched image data with maximum correlation to the first target point is searched. The criterion for maximum correlation in the search can be the SAD method, the NCC method, etc.: the position with the minimum SAD, or the maximum NCC, is the position of maximum correlation. Other similar criteria can also be used.
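A minimal exhaustive block-matching search over such a region, using the SAD criterion, might look like the following sketch; the data layout and all names are assumptions, not the patented implementation:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized 2D blocks."""
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
               for a, b in zip(ra, rb))

def best_match(target_block, image, vert, horiz):
    """Slide target_block over `image`, keeping its top-left corner inside
    the given (row, col) intervals, and return the position with minimum
    SAD together with that score (illustrative sketch)."""
    h, w = len(target_block), len(target_block[0])
    best, best_pos = float("inf"), None
    for r in range(vert[0], vert[1] + 1):
        for c in range(horiz[0], horiz[1] + 1):
            window = [row[c:c + w] for row in image[r:r + h]]
            score = sad(target_block, window)
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos, best

image = [[r * 13 + c for c in range(8)] for r in range(8)]
block = [row[2:4] for row in image[3:5]]      # block taken at (3, 2)
pos, score = best_match(block, image, (2, 4), (1, 3))
print(pos, score)  # -> (3, 2) 0
```

Swapping in an NCC score and maximizing instead of minimizing gives the other criterion mentioned in the text.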
After the match point of the first target point is found, steps 430 and 440 are performed to calculate the cross-correlation phase and the displacement result.
Step 426. Search for the guide points of the other nodes of the Nth row
After the match point of the first target point of the Nth row is found, the nodes on both sides of the first target point in the same row are taken in turn as target points, and their match points are searched in the matched image data. For a selected target point, the specific method of searching for its match point in the matched image data is as follows:
The node with the highest matching degree is selected as the guide point from among the nodes around the selected target point whose displacement offsets have already been calculated. "Around the target point" refers to other nodes near the target point, which may be nodes of the previous row or of the current row. For example, if the points to the left of the first target point are matched in turn as target points, the node with the highest matching degree is sought among the nodes to the right of, above, and to the upper right of the target point and used as the guide point. The condition a node must satisfy to be selected as a guide point is that its displacement offset has been calculated. In other embodiments, priorities may also be set, e.g. a matching-degree threshold: when a node's matching degree does not meet the requirement, that node may be discarded directly even if it is the closest to the target point, and a node farther from the target point but with a high matching degree is selected as the guide point instead.
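The guide-point selection rule, including the optional matching-degree threshold, can be illustrated as follows; the candidate representation and all names are assumptions for the sketch:

```python
def pick_guide_point(candidates, qf_threshold=None):
    """Choose a guide point among neighbouring nodes whose displacement
    offsets are already computed. Each candidate is (node_id, qf, distance).
    Highest matching degree wins, ties broken by smaller distance; the
    optional threshold models the priority scheme mentioned in the text."""
    if qf_threshold is not None:
        good = [c for c in candidates if c[1] >= qf_threshold]
        if good:                      # prefer high-QF nodes even if farther
            candidates = good
    return max(candidates, key=lambda c: (c[1], -c[2]))[0]

neighbours = [("right", 0.62, 1), ("above", 0.91, 1), ("upper-right", 0.88, 2)]
print(pick_guide_point(neighbours))  # -> above
```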
Step 427. Determine the search regions of the other target points of the Nth row and perform matching
After the guide point is determined, its displacement offset is obtained and used as the initial offset of the target point. In the matched image data, the position obtained by offsetting the target point's position by the initial offset is taken as the core search position, the search region is determined based on the core search position, the target point of the target image data is matched within the search region, and the match point of the Nth-row target point of the target image data is obtained. The specific method can follow the scheme above and is not repeated here.
After the match point of the selected node is found, steps 430 and 440 are performed to calculate the cross-correlation phase and the displacement result.
Step 428. Judge whether all nodes of the Nth row have been matched; if so, perform step 429, otherwise continue with step 426.
Step 429. When the displacement results of all nodes of the Nth row have been calculated, judge whether the Nth row is the last row; if so, end; otherwise proceed to the calculation of the next row.
In the present embodiment, when displacement detection is performed on the nodes of the second and subsequent rows, each match selects the point with the highest matching degree near the node as the guide point. This prevents an erroneous fixed guide point from propagating its error to subsequent nodes, reducing the accumulation of computational errors.
After the displacement results of all nodes (or displacement estimation points) of the target image data are obtained, taking the gradient of the displacement result data along the longitudinal direction yields the strain result, i.e. the strain values.
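Taking the longitudinal gradient of a displacement field to obtain strain can be sketched as below; a simple central-difference scheme with one-sided differences at the ends is assumed, which is only one possible discrete gradient:

```python
def strain_from_displacement(disp):
    """Longitudinal strain as the along-depth gradient of the displacement
    field (rows = depth, unit row spacing). Central differences inside,
    one-sided differences at the first and last row; a generic sketch."""
    n = len(disp)
    strain = []
    for i in range(n):
        if i == 0:
            strain.append([b - a for a, b in zip(disp[0], disp[1])])
        elif i == n - 1:
            strain.append([b - a for a, b in zip(disp[-2], disp[-1])])
        else:
            strain.append([(b - a) / 2 for a, b in zip(disp[i - 1], disp[i + 1])])
    return strain

disp = [[0.0, 0.0], [1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]
print(strain_from_displacement(disp))
# -> [[1.0, 2.0], [1.0, 2.0], [1.0, 2.0], [1.0, 2.0]]
```

A linearly increasing displacement therefore maps to a constant strain, as expected for uniform compression.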
In strain post-processing, certain error corrections can be applied to the resulting strain: abnormal jump points or obvious error points therein can be detected and corrected; spatial smoothing can be applied to improve the displayed image; or different gray-scale or color maps can be used for mapping to enhance image contrast. Other operations that improve image quality can also be performed.
Finally, the strain result of the desired region is output and displayed as a strain image, reflecting the elastic differences of the tissue in that region.
After the displacement data of the whole frame has been calculated, the QF values of all nodes of the frame are also obtained. With this QF information, quality information can be provided to the user when the final image is displayed: first, an image of the whole frame's QF can be drawn, making the quality score of each node clearly visible; second, the average QF of the whole frame can be calculated, making it easy to assess the overall quality of an elastic image.
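The two uses of frame-wide QF information described above, a per-node QF image and a frame average, can be sketched as:

```python
def frame_qf_summary(qf_map):
    """Frame-level quality information from per-node QF values in [0, 1]:
    an integer 0-100 image for display plus the frame average
    (an illustrative sketch of the two uses described in the text)."""
    qf_image = [[round(q * 100) for q in row] for row in qf_map]
    flat = [q for row in qf_map for q in row]
    return qf_image, sum(flat) / len(flat)

qf_map = [[0.9, 0.8], [0.7, 0.6]]
image, avg = frame_qf_summary(qf_map)
print(image, round(avg, 2))  # -> [[90, 80], [70, 60]] 0.75
```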
Those skilled in the art will understand that all or part of the steps of the various methods in the above embodiments can be completed by a program instructing related hardware, and the program can be stored in a computer-readable storage medium, which may include: read-only memory, random access memory, magnetic disk, optical disc, etc.
The above is a further detailed description of the present invention in combination with specific embodiments, and the specific implementation of the invention shall not be considered limited to these descriptions. For ordinary technicians in the technical field of the invention, several simple deductions or substitutions can also be made without departing from the inventive concept.

Claims (32)

1. A displacement detection method in compressive resilience imaging, characterised by comprising:
obtaining two frames of image data, respectively as the target image data before compression and the matched image data after compression;
taking the nodes of the first row of the target image data as target points and finding the corresponding match points in the matched image data;
calculating the displacement result of each node of the first row according to the target points of the first row and their match points; and
calculating the displacement result of each node of the Nth row of the target image data, where N is an integer running from 2 to n and n is the number of rows into which a frame of image is divided, including:
finding the node with the highest matching degree among the matching results of the nodes of the (N-1)th row as the row initial guide point;
searching, based on the row initial guide point, for the match point of each node of the Nth row in the matched image data; and
calculating the displacement result of each node of the Nth row according to each node of the Nth row and its match point.
2. The method of claim 1, characterised in that each frame of image is divided into a grid of m columns and n rows, each cell being one node, and block matching is used when matching the target points.
3. The method of claim 1, characterised in that said image data is ultrasound image data obtained by processing received ultrasonic echo data.
4. the method as any one of claim 1-3, it is characterised in that the match point is maximally related with impact point Node.
5. method as claimed in claim 4, it is characterised in that the matching degree of the node is judged using the quality factor of node, Quality factor is higher, and matching degree is higher.
6. method as claimed in claim 5, it is characterised in that the quality factor of node is the correlation of the matched point of the node Coefficient.
7. The method of claim 4, characterised in that the calculation of each node's displacement result includes: calculating the cross-correlation phase of the node based on the ultrasonic radio-frequency complex signals of the node and its match point, and calculating the final displacement result of the node based on the cross-correlation phase.
8. the method for claim 1, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
9. method as claimed in claim 2, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
10. method as claimed in claim 3, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
11. methods as claimed in claim 4, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
12. methods as claimed in claim 5, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
13. methods as claimed in claim 6, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
14. methods as claimed in claim 7, it is characterised in that each node of Nth row is searched in quilt based on row initial guide point Match point in matching view data includes:
The first object point of Nth row in destination image data is determined based on matching degree highest node in N-1 rows;
Obtain the shift offset of initial guide point;
Using the shift offset of row initial guide point as Nth row first object point initial offset;
In view data is matched, searched by core of the position after the position skew initial offset of Nth row first object point Rope position, region of search is determined based on core searching position, to the mesh of Nth row first in destination image data in region of search Punctuate is matched, and obtains the match point of the Nth row first object point in destination image data;
After having calculated Nth row first object point, successively with Nth row first object point colleague both sides node as impact point, in quilt The match point of impact point is searched in matching view data.
15. method as any one of claim 8-14, it is characterised in that Nth row first object point is and N-1 rows The close node of matching degree highest node.
16. methods as claimed in claim 15, it is characterised in that Nth row first object point is with the matching degree of N-1 rows most The node of node same column high.
17. method as any one of claim 8-14, it is characterised in that the node with both sides is as impact point successively, Include the step of the match point of impact point is searched in being matched view data:
Matching degree highest node is selected in the node for having calculated shift offset around impact point as pilot point;
Obtain the shift offset of pilot point;
Using the shift offset of pilot point as impact point initial offset;
It is core searching position, base with the position after the position skew initial offset of impact point in view data is matched Determine region of search in core searching position, in region of search to destination image data in impact point match, and obtain The match point of the Nth row impact point in destination image data.
18. The method of any one of claims 8-14, characterised in that the search region is the region formed by offsetting longitudinally downward from the core search position by a set value, or the search region is the region formed by offsetting from the core search position toward the periphery by set values.
19. A displacement detection device in compressive resilience imaging, characterised by comprising:
an image acquisition module for obtaining two frames of image data, respectively as the target image data before compression and the matched image data after compression;
First matching module, for using the node of the first row in destination image data as impact point, being matched image respectively Corresponding match point is found in data;
Searching modul, it is first as row for finding out matching degree highest node in the matching result of each node of N-1 rows Beginning pilot point, wherein N are the current line for requiring to look up match point, and N is integer successively from 2 to n, and n is what a two field picture was divided Line number;
Second matching module, for based on row initial guide point search Nth row each node in view data is matched With point;
Displacement computing module, the displacement result of each node of the first row is calculated for the impact point according to the first row and its match point, And the displacement result of each node of Nth row is calculated according to each node and its match point of Nth row.
20. devices as claimed in claim 19, it is characterised in that described image data are the ultrasonic echo data for receiving through place Ultrasound image data after reason.
21. devices as claimed in claim 19, it is characterised in that the matching of the node is judged using the quality factor of node Degree, quality factor is higher, and matching degree is higher.
22. device as any one of claim 19-21, it is characterised in that the match point is and impact point most phase The node of pass.
23. devices as claimed in claim 22, it is characterised in that displacement computing module is calculated to be included:
Cross-correlation phase calculation unit, the mutual of the node is calculated for the ultrasonic radio frequency complex signal based on the node and its match point Dependent phase;
Displacement computing unit, for the final mean annual increment movement result based on the cross-correlation phase calculation node.
24. devices as claimed in claim 19, it is characterised in that the second matching module includes:
First object point determining unit (341), for determining destination image data based on matching degree highest node in N-1 rows The first object point of middle Nth row;
First side-play amount acquiring unit (342), the shift offset for obtaining initial guide point;
First region of search determining unit (343), for using the shift offset of row initial guide point as Nth row first object The initial offset of point, in view data is matched, with the position after the position skew initial offset of Nth row first object point Core searching position is set to, region of search is determined based on core searching position;
First search unit (344), for the Nth row first object point searched in destination image data in the region of search With point;
Second impact point determining unit (345), for after Nth row first object point has been calculated, successively with Nth row first object The node of point colleague both sides is impact point;
Pilot point determining unit (346), for selection in the node for having calculated shift offset around impact point With degree highest node as pilot point;
Second side-play amount acquiring unit (347), the shift offset for obtaining pilot point;
Second region of search determining unit (348), for using the shift offset of pilot point as impact point initial offset, It is core searching position with the position after the position skew initial offset of impact point, based on core in view data is matched Heart searching position determines region of search;
Second search unit (349), in region of search to destination image data in impact point match, and obtain mesh The match point of the Nth row impact point in logo image data.
25. devices as claimed in claim 20, it is characterised in that the second matching module includes:
First object point determining unit (341), for determining destination image data based on matching degree highest node in N-1 rows The first object point of middle Nth row;
First side-play amount acquiring unit (342), the shift offset for obtaining initial guide point;
First region of search determining unit (343), for using the shift offset of row initial guide point as Nth row first object The initial offset of point, in view data is matched, with the position after the position skew initial offset of Nth row first object point Core searching position is set to, region of search is determined based on core searching position;
First search unit (344), for the Nth row first object point searched in destination image data in the region of search With point;
Second impact point determining unit (345), for after Nth row first object point has been calculated, successively with Nth row first object The node of point colleague both sides is impact point;
Pilot point determining unit (346), for selection in the node for having calculated shift offset around impact point With degree highest node as pilot point;
Second side-play amount acquiring unit (347), the shift offset for obtaining pilot point;
Second region of search determining unit (348), for using the shift offset of pilot point as impact point initial offset, It is core searching position with the position after the position skew initial offset of impact point, based on core in view data is matched Heart searching position determines region of search;
Second search unit (349), in region of search to destination image data in impact point match, and obtain mesh The match point of the Nth row impact point in logo image data.
26. devices as claimed in claim 21, it is characterised in that the second matching module includes:
First object point determining unit (341), for determining destination image data based on matching degree highest node in N-1 rows The first object point of middle Nth row;
First side-play amount acquiring unit (342), the shift offset for obtaining initial guide point;
First region of search determining unit (343), for using the shift offset of row initial guide point as Nth row first object The initial offset of point, in view data is matched, with the position after the position skew initial offset of Nth row first object point Core searching position is set to, region of search is determined based on core searching position;
First search unit (344), for the Nth row first object point searched in destination image data in the region of search With point;
Second impact point determining unit (345), for after Nth row first object point has been calculated, successively with Nth row first object The node of point colleague both sides is impact point;
Pilot point determining unit (346), for selection in the node for having calculated shift offset around impact point With degree highest node as pilot point;
Second side-play amount acquiring unit (347), the shift offset for obtaining pilot point;
Second region of search determining unit (348), for using the shift offset of pilot point as impact point initial offset, It is core searching position with the position after the position skew initial offset of impact point, based on core in view data is matched Heart searching position determines region of search;
Second search unit (349), in region of search to destination image data in impact point match, and obtain mesh The match point of the Nth row impact point in logo image data.
27. devices as claimed in claim 22, it is characterised in that the second matching module includes:
First object point determining unit (341), for determining destination image data based on matching degree highest node in N-1 rows The first object point of middle Nth row;
First side-play amount acquiring unit (342), the shift offset for obtaining initial guide point;
First region of search determining unit (343), for using the shift offset of row initial guide point as Nth row first object The initial offset of point, in view data is matched, with the position after the position skew initial offset of Nth row first object point Core searching position is set to, region of search is determined based on core searching position;
First search unit (344), for the Nth row first object point searched in destination image data in the region of search With point;
Second impact point determining unit (345), for after Nth row first object point has been calculated, successively with Nth row first object The node of point colleague both sides is impact point;
Pilot point determining unit (346), for selection in the node for having calculated shift offset around impact point With degree highest node as pilot point;
Second side-play amount acquiring unit (347), the shift offset for obtaining pilot point;
Second region of search determining unit (348), for using the shift offset of pilot point as impact point initial offset, It is core searching position with the position after the position skew initial offset of impact point, based on core in view data is matched Heart searching position determines region of search;
Second search unit (349), in region of search to destination image data in impact point match, and obtain mesh The match point of the Nth row impact point in logo image data.
28. devices as claimed in claim 23, it is characterised in that the second matching module includes:
First object point determining unit (341), for determining destination image data based on matching degree highest node in N-1 rows The first object point of middle Nth row;
First side-play amount acquiring unit (342), the shift offset for obtaining initial guide point;
First region of search determining unit (343), for using the shift offset of row initial guide point as Nth row first object The initial offset of point, in view data is matched, with the position after the position skew initial offset of Nth row first object point Core searching position is set to, region of search is determined based on core searching position;
First search unit (344), for the Nth row first object point searched in destination image data in the region of search With point;
Second impact point determining unit (345), for after Nth row first object point has been calculated, successively with Nth row first object The node of point colleague both sides is impact point;
Pilot point determining unit (346), for selection in the node for having calculated shift offset around impact point With degree highest node as pilot point;
Second side-play amount acquiring unit (347), the shift offset for obtaining pilot point;
Second region of search determining unit (348), for using the shift offset of pilot point as impact point initial offset, It is core searching position with the position after the position skew initial offset of impact point, based on core in view data is matched Heart searching position determines region of search;
Second search unit (349), in region of search to destination image data in impact point match, and obtain mesh The match point of the Nth row impact point in logo image data.
29. The device according to any one of claims 25-28, characterised in that the first target point of the Nth row is a node adjacent to the node with the highest matching degree in the (N-1)th row.
30. The device according to claim 29, characterised in that the first target point of the Nth row is the node in the same column as the node with the highest matching degree in the (N-1)th row.
31. The device according to any one of claims 25-28, characterised in that the search region is the region formed by extending downward from the core search position by a set longitudinal value, or the region formed by extending outward by a set value in all directions with the core search position as the centre.
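The two region shapes of claim 31 can be written out directly. Under compression, tissue moves mainly axially (downward along the beam), which motivates the one-sided variant; the other variant is a region centred on the core search position. A small sketch with hypothetical names:

```python
def search_region(core, down=None, radius=None):
    """Return ((y_min, y_max), (x_min, x_max)) bounds, inclusive.
    down:   extend only downward (axially) from the core search position
            by `down` samples -- the one-sided variant of claim 31;
    radius: extend by `radius` in every direction, core at the centre."""
    y, x = core
    if down is not None:
        return (y, y + down), (x, x)
    return (y - radius, y + radius), (x - radius, x + radius)
```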
32. An ultrasonic imaging apparatus, characterised by comprising:
a probe, configured to transmit ultrasonic waves to a scanning target and to receive ultrasonic echoes;
a signal processor, configured to process the ultrasonic echoes and generate ultrasound image data; and
an image processor, configured to process the ultrasound image data and generate an elasticity image, the image processor comprising:
the displacement detection device according to any one of claims 19-31; and
an elasticity image generating device configured to generate the elasticity image based on the displacement result of each node detected by the displacement detection device.
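The apparatus of claim 32 chains three stages: probe echoes, ultrasound image data, per-node displacements, elasticity image. A minimal sketch of the image-processor stage follows; the class and method names are hypothetical, and strain is approximated as the axial gradient of the displacement field, a common simplification in elastography rather than the patent's own method.

```python
import numpy as np

class ElasticityImageGenerator:
    """Sketch of the image processor of claim 32: runs a displacement
    detector on consecutive frame pairs and derives a strain image."""

    def __init__(self, displacement_detector):
        # displacement_detector(target_frame, matching_frame) -> 2-D
        # array of per-node axial displacements.
        self.detect = displacement_detector

    def process(self, frame_pairs):
        images = []
        for target, matching in frame_pairs:
            disp = self.detect(target, matching)
            # Axial strain ~ derivative of displacement along depth.
            images.append(np.gradient(disp, axis=0))
        return images
```

With a displacement field that grows linearly with depth (uniform compression), this yields a constant strain image, as expected.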
CN201310726398.2A 2013-12-25 2013-12-25 Compressive resilience imaging displacement detection method, device and supersonic imaging apparatus Active CN104739442B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310726398.2A CN104739442B (en) 2013-12-25 2013-12-25 Compressive resilience imaging displacement detection method, device and supersonic imaging apparatus
PCT/CN2014/077327 WO2015096353A1 (en) 2013-12-25 2014-05-13 Displacement detection method and device in pressure elasticity imaging, and ultrasonic imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310726398.2A CN104739442B (en) 2013-12-25 2013-12-25 Compressive resilience imaging displacement detection method, device and supersonic imaging apparatus

Publications (2)

Publication Number Publication Date
CN104739442A CN104739442A (en) 2015-07-01
CN104739442B true CN104739442B (en) 2017-06-16

Family

ID=53477442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310726398.2A Active CN104739442B (en) 2013-12-25 2013-12-25 Compressive resilience imaging displacement detection method, device and supersonic imaging apparatus

Country Status (2)

Country Link
CN (1) CN104739442B (en)
WO (1) WO2015096353A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102035993B1 (en) * 2015-09-03 2019-10-25 지멘스 메디컬 솔루션즈 유에스에이, 인크. Ultrasound system and method for generating elastic image
CN106651868A (en) * 2016-08-31 2017-05-10 沈阳东软医疗系统有限公司 Displacement measurement method and displacement measurement device
CN108053362A (en) * 2017-12-22 2018-05-18 飞依诺科技(苏州)有限公司 The data processing method and its system of ultrasonoscopy
CN109745073B (en) * 2019-01-10 2021-08-06 武汉中旗生物医疗电子有限公司 Two-dimensional matching method and equipment for elastography displacement
CN113476075A (en) * 2020-03-16 2021-10-08 深圳市理邦精密仪器股份有限公司 Ultrasonic elastography method, and image data screening method and device
CN111528912A (en) * 2020-05-25 2020-08-14 武汉中旗生物医疗电子有限公司 Ultrasonic elastography method, device and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102824194A (en) * 2011-06-14 2012-12-19 深圳迈瑞生物医疗电子股份有限公司 Displacement detecting method and device thereof in elasticity imaging
CN102824193A (en) * 2011-06-14 2012-12-19 深圳迈瑞生物医疗电子股份有限公司 Displacement detecting method, device and system in elastic imaging
US8403850B2 (en) * 2008-03-25 2013-03-26 Wisconsin Alumni Research Foundation Rapid two/three-dimensional sector strain imaging

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070167772A1 (en) * 2005-12-09 2007-07-19 Aloka Co., Ltd. Apparatus and method for optimized search for displacement estimation in elasticity imaging
US7632231B2 (en) * 2006-03-22 2009-12-15 Wisconsin Alumni Research Foundation Ultrasonic strain imaging device and method providing parallel displacement processing
US9375195B2 (en) * 2012-05-31 2016-06-28 Siemens Medical Solutions Usa, Inc. System and method for real-time ultrasound guided prostate needle biopsy based on biomechanical model of the prostate from magnetic resonance imaging data


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lujie Chen et al., "A quality-guided displacement tracking algorithm for ultrasonic elasticity imaging," Medical Image Analysis, Vol. 13, No. 2, Nov. 8, 2008, pp. 286-296 *

Also Published As

Publication number Publication date
CN104739442A (en) 2015-07-01
WO2015096353A1 (en) 2015-07-02

Similar Documents

Publication Publication Date Title
CN104739442B (en) Compressive resilience imaging displacement detection method, device and supersonic imaging apparatus
US11786210B2 (en) Ultrasound diagnostic system, ultrasound image generation apparatus, and ultrasound image generation method
US9607405B2 (en) Method and device for detecting displacement in elastography
US8265358B2 (en) Ultrasonic image processing apparatus and method for processing ultrasonic image
JP4831565B2 (en) Method and system for motion correction of ultrasonic volumetric data sets
JP6490809B2 (en) Ultrasonic diagnostic apparatus and image processing method
US20080077011A1 (en) Ultrasonic apparatus
US11622743B2 (en) Rib blockage delineation in anatomically intelligent echocardiography
JP6705134B2 (en) Ultrasonic image diagnostic apparatus, ultrasonic image processing method, and ultrasonic image processing program
KR101656127B1 (en) Measuring apparatus and program for controlling the same
CN102824193B (en) Displacement detecting method in a kind of elastogram, Apparatus and system
US11526991B2 (en) Medical image processing apparatus, and medical imaging apparatus
JP6381972B2 (en) Medical image processing apparatus and medical image diagnostic apparatus
CN104739451A (en) Elastic image imaging method and device and ultrasonic imaging device
KR101629541B1 (en) Ultrasonic diagnostic apparatus and control program thereof
WO2013063465A1 (en) Method for obtaining a three-dimensional velocity measurement of a tissue
Rivaz et al. Tracked regularized ultrasound elastography for targeting breast radiotherapy
CN105326529B (en) Elastograph imaging method and system
JP7078571B2 (en) Ultrasound diagnostic equipment, tracing methods and programs
CN110811674B (en) Ultrasonic diagnostic apparatus and storage medium
JP2021083699A (en) Ultrasonic diagnostic device and display method
Jiang et al. Modified phase zero method for ultrasound freehand strain imaging
JP6253640B2 (en) Medical image processing device
Cui et al. A robust phase zero estimator for ultrasonic elastography using quality-guided seeding strategy

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20150701

Assignee: Shenzhen Mindray Animal Medical Technology Co.,Ltd.

Assignor: SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS Co.,Ltd.

Contract record no.: X2022440020009

Denomination of invention: Pressure elastography displacement detection method, device and ultrasonic imaging equipment

Granted publication date: 20170616

License type: Common License

Record date: 20220804
