CN105708492A - Method and system for fusing B ultrasonic imaging and microwave imaging - Google Patents

Method and system for fusing B ultrasonic imaging and microwave imaging

Info

Publication number
CN105708492A
CN105708492A (application CN201511032447.8A)
Authority
CN
China
Prior art keywords
image
ultrasonic
microwave
imaging
imagery
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201511032447.8A
Other languages
Chinese (zh)
Inventor
李慈航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HANGZHOU ET MEDICAL TECHNOLOGY Co Ltd
Original Assignee
HANGZHOU ET MEDICAL TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HANGZHOU ET MEDICAL TECHNOLOGY Co Ltd filed Critical HANGZHOU ET MEDICAL TECHNOLOGY Co Ltd
Priority to CN201511032447.8A priority Critical patent/CN105708492A/en
Publication of CN105708492A publication Critical patent/CN105708492A/en
Pending legal-status Critical Current

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/0507 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves, using microwaves or terahertz waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method and a system for fusing B ultrasonic (B-mode ultrasound) imaging and microwave imaging. The pixel points of either the B-mode ultrasound image or the microwave image are mapped onto the other image, so that corresponding pixel points of the two images coincide in spatial position and the fusion of the two images is completed. Different imaging modes can thus complement each other when diagnosing patients with early-stage cancer. The potential of image fusion lies in the comprehensive use of the information obtained by the imaging devices: the spatial position, size and geometry of a lesion, as well as its spatial relationship with the surrounding biological tissue, can be determined accurately, so that diseases can be diagnosed promptly and efficiently. The method and system can also be applied to surgical planning, tracking of pathological changes, assessment of treatment effect, and similar tasks.

Description

Method and system for fusing B-mode ultrasound imaging and microwave imaging
Technical field
The present invention relates to an image fusion method and system, and in particular to a method and system for fusing a B-mode ultrasound image with a microwave image.
Background technology
Microwave imaging determines the spatial distribution of the dielectric constant and conductivity of living tissue: the measurement data are processed to locate tissues whose dielectric constant differs from the normal value, and its reliability has been verified in various studies. Existing microwave imaging work concentrates mainly on imaging algorithms and antenna design. Current microwave imaging algorithms fall mainly into ultra-wideband radar imaging and microwave tomography. Ultra-wideband radar imaging reconstructs the spatial distribution of scattering intensity in the subject; the signal can be processed over a bandwidth of several GHz, which effectively improves the spatial resolution of the imaging system. Microwave tomography instead reconstructs the spatial distribution of the dielectric properties of the measured body and diagnoses pathological tissue from the differences in dielectric properties between tissues. In this method the measured body, immersed in a coupling medium, is probed with electromagnetic waves; at the receiving end a rotating antenna array collects and monitors the scattered waves from all directions, and an iterative algorithm running on a computer finally reconstructs the dielectric properties of the measured body.
B-mode ultrasound imaging systems commonly use the ultrasonic pulse-echo technique: an ultrasound beam is transmitted into the human body, where it is reflected and scattered, and the echoes carrying this information are received and processed to obtain a grayscale image of the human tissue structure.
Different medical images provide different information about internal organs. The potential of image fusion lies in the comprehensive use of the information obtained by these imaging devices: the spatial position, size and geometry of a lesion, as well as its spatial relationship with the surrounding biological tissue, can be determined accurately, so that diseases can be diagnosed promptly and efficiently. Fusion can also be applied to surgical planning, tracking of pathological changes and assessment of treatment effect. The questions that medical imaging must answer in diagnosis are typically: is there a lesion; is the detected lesion cancerous; is a cancerous lesion localized or diffuse; how should it be treated; is the treatment effective; is further treatment needed. For these questions, ultrasound imaging offers high resolution, high specificity and freedom from ionizing radiation, but its image contrast is poor and its ability to discriminate pathological tissue is limited. In the microwave band, on the other hand, the electrical parameters of normal tissue and malignant tumor tissue differ markedly, with dielectric constant and conductivity differing by more than a factor of five, so microwave imaging can clearly show the difference between normal and malignant tissue with very high image contrast. Each imaging method therefore has its own strengths, but they are difficult to unify, and the two kinds of images cannot currently be used together in a complementary way.
Summary of the invention
The technical problem solved by the present invention is to build a method and system for fusing B-mode ultrasound imaging and microwave imaging, overcoming the limitation of prior-art single-detection devices, which cannot use the two kinds of images together in a complementary way.
A method for fusing B-mode ultrasound imaging and microwave imaging is provided, comprising the following steps:
Separate imaging: a B-mode ultrasound signal is transmitted toward the region to be measured, and the echo of the ultrasound signal is received to perform B-mode ultrasound imaging; a microwave signal is transmitted toward the region to be measured, and the echo of the microwave signal is received to perform microwave imaging.
Image fusion: the pixel points of one of the B-mode ultrasound image and the microwave image are mapped onto the other image, so that corresponding pixel points of the two images coincide in spatial position, thereby completing the fusion of the two images.
In a further technical solution of the present invention, feature points of the images are marked in the B-mode ultrasound image and the microwave image, and the image markers of the two images are overlaid to fuse the B-mode ultrasound image and the microwave image.
In a further technical solution of the present invention, during image fusion the image markers of the B-mode ultrasound image and the microwave image are identified, and the identified images are then fused using the markers as landmarks.
In a further technical solution of the present invention, the B-mode ultrasound image and the microwave image are additionally preprocessed to make the images clearer.
In a further technical solution of the present invention, in the image fusion step, image fusion is completed by taking a weighted average of the pixels at corresponding positions of the B-mode ultrasound image and the microwave image.
The technical solution of the present invention further builds a system for fusing B-mode ultrasound imaging and microwave imaging, comprising a B-mode ultrasound imaging unit, a microwave imaging unit and a fusion processing unit. The B-mode ultrasound imaging unit generates a B-mode ultrasound image from the received ultrasound echo signal, the microwave imaging unit generates a microwave image from the received microwave echo signal, and the fusion processing unit registers the B-mode ultrasound image and the microwave image, mapping the pixel points of one image onto the other so that corresponding pixel points of the two images coincide in spatial position, thereby completing the fusion of the two images.
In a further technical solution of the present invention, the fusion processing unit further includes an image marking module, which marks feature points of the images in the B-mode ultrasound image and the microwave image; the image markers of the two images are overlaid to fuse the B-mode ultrasound image and the microwave image.
In a further technical solution of the present invention, the fusion processing unit further includes an image marker identification module, which identifies the image markers of the B-mode ultrasound image and the microwave image.
In a further technical solution of the present invention, the system further includes an image preprocessing module, which preprocesses the B-mode ultrasound image and the microwave image.
The beneficial effect of the present invention is that a method and system for fusing B-mode ultrasound imaging and microwave imaging are built: by mapping the pixel points of one of the B-mode ultrasound image and the microwave image onto the other image, corresponding pixel points of the two images coincide in spatial position and the fusion of the two images is completed. Different imaging modes can thus complement each other when diagnosing patients with early-stage cancer. The potential of image fusion lies in the comprehensive use of the information obtained by the imaging devices: the spatial position, size and geometry of a lesion, as well as its spatial relationship with the surrounding biological tissue, can be determined accurately, so that diseases can be diagnosed promptly and efficiently. The method can also be applied to surgical planning, tracking of pathological changes and assessment of treatment effect, addressing the questions that medical imaging must typically answer in diagnosis: whether there is a lesion, whether the detected lesion is cancerous, whether a cancerous lesion is localized or diffuse, how it should be treated, whether treatment is effective, and whether further treatment is needed.
Brief description of the drawings
Fig. 1 is a schematic diagram of the present invention.
Fig. 2 is a structural diagram of the present invention.
Fig. 3 is the fusion flowchart of the present invention.
Detailed description of the embodiments
The technical solution of the present invention is further described below with reference to specific embodiments.
As shown in Fig. 1, a specific embodiment of the present invention provides a method for fusing B-mode ultrasound imaging and microwave imaging, comprising the following steps:
Separate imaging: a B-mode ultrasound signal is transmitted toward the region to be measured, and the echo of the ultrasound signal is received to perform B-mode ultrasound imaging; a microwave signal is transmitted toward the region to be measured, and the echo of the microwave signal is received to perform microwave imaging.
The specific implementation is as follows: a B-mode ultrasound image is generated from the received ultrasound echo signal, and a microwave image is generated from the received microwave echo signal.
Image fusion: the B-mode ultrasound image generated by ultrasound imaging and the microwave image generated by microwave imaging are registered, and the pixel points of one of the two images are mapped onto the other image so that corresponding pixel points of the two images coincide in spatial position, thereby completing the fusion of the two images.
The specific implementation is as follows. Image fusion can be performed in several ways. One way is the marker method: feature points of the images are marked in the B-mode ultrasound image and the microwave image, and the image markers of the two images are overlaid to fuse them. During fusion, the image markers of the B-mode ultrasound image and the microwave image are identified, and the identified images are then fused using the markers as landmarks; recognizing the marker features makes the fusion more accurate. In a specific embodiment, one of the images is marked and then overlaid on the other image as a template or in a transparent mode. The image marking software is designed to recognize the feature points of the images and synthesize them into image landmarks. According to the markers of identified tissue, the markers of identified lymph nodes and the characteristic markers of the B-mode ultrasound image and the microwave image, coincident points are formed on the coordinate axes to complete the fusion of the two images; the method is equally applicable to two-dimensional or three-dimensional breast images produced by combining other modalities. The method further comprises displaying the multi-modal microwave breast image, displaying two- and three-dimensional information of the breast, and marking the lesion area. Another way is the pixel weighted-average method: image fusion is completed by taking a weighted average of the pixels at corresponding positions of the B-mode ultrasound image and the microwave image.
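To make the pixel weighted-average fusion concrete, a minimal sketch is given below. It assumes the two images have already been registered onto the same pixel grid; the weight w and the function and variable names are illustrative assumptions, not specified in the patent.

```python
import numpy as np

def fuse_weighted_average(ultrasound_img, microwave_img, w=0.5):
    """Fuse two co-registered grayscale images by per-pixel weighted averaging.

    ultrasound_img, microwave_img: 2-D arrays of identical shape (already registered).
    w: weight given to the ultrasound image; (1 - w) is given to the microwave image.
    """
    us = ultrasound_img.astype(np.float64)
    mw = microwave_img.astype(np.float64)
    if us.shape != mw.shape:
        raise ValueError("images must be registered to the same size before fusion")
    fused = w * us + (1.0 - w) * mw
    return np.clip(fused, 0, 255).astype(np.uint8)
```

Equal weights (w = 0.5) give a plain average; the weight can be shifted toward whichever modality carries more diagnostic detail for a given region.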
Image registration
Image registration is the process of using matching, superposition and similar operations to keep the same target at the same position in multiple images, so that the images share the same spatial coordinate system.
For the registration of the microwave image and the B-mode ultrasound image, the relatively stable microwave image is taken as reference image I and the B-mode ultrasound image as floating image II, and pixel-based registration using the maximum mutual information method is carried out; the flowchart is shown in Fig. 3.
The rigid-body transformation includes a scale transformation. In the two-dimensional image II, a point (x1, y1) is mapped by the rigid transformation to a point (x2, y2) according to:

$$
\begin{bmatrix} x_2 \\ y_2 \end{bmatrix}
= K \begin{bmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{bmatrix}
\begin{bmatrix} x_1 \\ y_1 \end{bmatrix}
\qquad (1)
$$

where α is the rotation angle and K is the scale parameter.
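As a worked illustration of equation (1) (not part of the patent itself), the sketch below applies the rotation-plus-scaling transform to pixel coordinates; the resampling of the floating image that a full registration would also require is omitted, and the names are illustrative.

```python
import numpy as np

def rigid_scale_transform(points, alpha, K):
    """Apply equation (1): rotate by angle alpha (radians) and scale by K.

    points: array of shape (N, 2) holding (x1, y1) coordinates.
    Returns the transformed (x2, y2) coordinates, shape (N, 2).
    """
    R = np.array([[np.cos(alpha), -np.sin(alpha)],
                  [np.sin(alpha),  np.cos(alpha)]])
    return K * points @ R.T

# Example: rotate a single point by 90 degrees with scale 0.5
print(rigid_scale_transform(np.array([[1.0, 0.0]]), np.pi / 2, 0.5))  # approx [[0, 0.5]]
```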
Mutual information correlation evaluation. The gray values of the two images to be registered are treated as two random variables A and B with values in the range 0 to 255, with marginal probability distributions P_A(a) and P_B(b) and joint probability distribution P_AB(a, b). The marginal entropies and the joint entropy of A and B are then H(A), H(B) and H(A, B):

$$
\begin{aligned}
H(A) &= -\sum_a P_A(a)\,\log P_A(a) \\
H(B) &= -\sum_b P_B(b)\,\log P_B(b) \\
H(A,B) &= -\sum_a \sum_b P_{AB}(a,b)\,\log P_{AB}(a,b), \qquad a, b \in [0, 255]
\end{aligned}
\qquad (2)
$$
The normalized mutual information evaluation function I(A, B) of the random variables A and B is:

$$
I(A,B) = \frac{H(A) + H(B)}{H(A,B)} \qquad (3)
$$
When the two images, which share a common anatomical structure, reach optimal registration, the gray-level mutual information I(A, B) of their corresponding pixels should reach its maximum.
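A minimal sketch of how equations (2) and (3) can be computed from a joint gray-level histogram is given below. The 256-bin histogram and the log base are implementation choices (the normalized ratio is base-independent), and all names are illustrative.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=256):
    """Compute I(A,B) = (H(A) + H(B)) / H(A,B) from equations (2) and (3).

    img_a, img_b: grayscale images of the same shape with values in [0, 255].
    """
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    p_ab = joint / joint.sum()        # joint probability P_AB(a, b)
    p_a = p_ab.sum(axis=1)            # marginal P_A(a)
    p_b = p_ab.sum(axis=0)            # marginal P_B(b)

    def entropy(p):
        p = p[p > 0]                  # ignore zero-probability bins
        return -np.sum(p * np.log2(p))

    h_a, h_b, h_ab = entropy(p_a), entropy(p_b), entropy(p_ab)
    return (h_a + h_b) / h_ab
```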
Registration optimization. After the rigid-body transformation, a similarity measure is needed to quantify how similar the two images are, and the transformation parameters α and K must be adjusted continuously until this measure reaches its optimum; the scale parameter K varies from 0 to 1 and the rotation angle from 0 to 180 degrees. The search proceeds as follows (a sketch of this direction-set search is given after the steps):
1. Set the search directions over the transformation range of α and K to the coordinate-axis unit vectors: c_i = e_i (i = 1, 2, ..., N);
2. Record the initial position vector P_0 = (α_0, K_0);
3. For i = 1, 2, ..., N, move P_{i-1} to the position that maximizes the objective function I(A, B) along direction c_i, and record this point as P_i;
4. Shift the direction set: assign c_{i+1} to c_i (i = 1, 2, ..., N-1) and set c_N = P_N - P_0;
5. Move P_N to the maximum of the objective function I(A, B) along direction c_N, and record this point as the new P_0;
6. Repeat steps 2 to 5 until the objective value I(A, B) no longer increases.
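The steps above describe a Powell-type direction-set search over the two parameters α and K. The sketch below follows that scheme; the one-dimensional line maximization by uniform sampling, the step range, and the absence of explicit bounds on α and K are simplifying assumptions made for illustration, as are the names.

```python
import numpy as np

def direction_set_search(similarity, p0, n_samples=51, max_iter=20):
    """Maximize similarity(p) over p = (alpha, K) with a Powell-style direction-set search.

    similarity: callable taking a parameter vector (alpha, K) and returning I(A, B).
    p0: initial parameter vector, e.g. np.array([alpha0, K0]).
    Note: bounds (K in [0, 1], alpha in [0, pi]) are not enforced in this sketch.
    """
    n = len(p0)
    directions = [np.eye(n)[i] for i in range(n)]       # step 1: coordinate axes
    p = np.asarray(p0, dtype=float)
    best = similarity(p)

    def line_maximize(point, direction):
        # crude 1-D search: 51 samples include step 0, so we never move to a worse point
        steps = np.linspace(-1.0, 1.0, n_samples)
        candidates = [point + s * direction for s in steps]
        values = [similarity(c) for c in candidates]
        k = int(np.argmax(values))
        return candidates[k], values[k]

    for _ in range(max_iter):                            # step 6: iterate
        p_start = p.copy()
        for i in range(n):                               # step 3: search each direction
            p, _ = line_maximize(p, directions[i])
        directions = directions[1:] + [p - p_start]      # step 4: new direction c_N
        p, value = line_maximize(p, directions[-1])      # step 5: search along c_N
        if value <= best + 1e-9:                         # stop when I(A, B) stops increasing
            break
        best = value
    return p, best
```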
In a preferred embodiment of the present invention, image preprocessing is also carried out before image fusion.
The microwave image is preprocessed as follows. Because microwave imaging is strongly affected by external interference, point operations are used in preprocessing to stretch the contrast, making the image clearer and its features more distinct. Assume the gray range of the original image f(x, y) is [a, b] and the gray range of the transformed image g(x, y) is stretched linearly to [c, d]; the linear gray transform is:

$$
g(x,y) = \frac{d-c}{b-a}\,[f(x,y)-a] + c
$$
When the gray levels of most pixels in the image lie in the interval [a, b] and only a very small fraction exceed it (with f_max the maximum gray level of the original image), the following piecewise form is used to improve the enhancement:

$$
g(x,y) =
\begin{cases}
c, & 0 \le f(x,y) \le a \\
\dfrac{d-c}{b-a}\,[f(x,y)-a] + c, & a \le f(x,y) \le b \\
d, & b \le f(x,y) \le f_{\max}
\end{cases}
$$
Linear stretching of the image effectively improves its contrast.
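A minimal sketch of this piecewise linear contrast stretch is shown below; the function name and the default output range [0, 255] are assumptions for illustration.

```python
import numpy as np

def linear_gray_stretch(img, a, b, c=0, d=255):
    """Piecewise linear contrast stretch: map the gray range [a, b] to [c, d].

    Values at or below a are set to c, values at or above b are set to d, and
    values in [a, b] are mapped linearly, as in the preprocessing formula above.
    """
    f = img.astype(np.float64)
    g = (d - c) / (b - a) * (f - a) + c
    g[f <= a] = c
    g[f >= b] = d
    return g.astype(img.dtype)
```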
The B-mode ultrasound image is preprocessed as follows. In addition to the speckle noise inherent in B-mode ultrasound imaging, randomly appearing bright-spot high-frequency noise is also present, so a low-pass recursive filter is used for preprocessing.
Let x_n(i, j) denote the gray value of each pixel in the n-th ultrasound frame and α the correlation coefficient; the processed image y_n(i, j) is then

$$
y_n(i,j) = \alpha\, y_{n-1}(i,j) + (1-\alpha)\, x_n(i,j) \qquad (4)
$$
Equation (4) shows that the current value of each pixel depends only on that pixel's input and its previous output and is independent of the other pixels, so the frequency characteristics can be analyzed per pixel with a one-dimensional transform:

$$
y(n) = \alpha\, y(n-1) + (1-\alpha)\, x(n) \qquad (5)
$$

whose transfer function is

$$
H(z) = \frac{Y(z)}{X(z)} = \frac{1-\alpha}{1-\alpha z^{-1}} \qquad (6)
$$
In simulation experiments, the amplitude-frequency response was examined for α = 0.2, 0.6 and 0.8; the larger α is, the more strongly the high-frequency components are suppressed and the more clearly the speckle noise is attenuated.
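The per-pixel recursive filter of equation (4) can be applied frame by frame. A minimal sketch follows, assuming a sequence of same-size frames and treating the first frame as the initial output; the names are illustrative.

```python
import numpy as np

def recursive_lowpass(frames, alpha=0.6):
    """Temporal low-pass recursive filter over a sequence of ultrasound frames.

    Implements y_n = alpha * y_{n-1} + (1 - alpha) * x_n per pixel (equation 4).
    frames: iterable of 2-D arrays of identical shape (successive frames).
    Returns the list of filtered frames.
    """
    filtered = []
    y_prev = None
    for x in frames:
        x = x.astype(np.float64)
        y = x if y_prev is None else alpha * y_prev + (1.0 - alpha) * x
        filtered.append(y)
        y_prev = y
    return filtered
```

Larger values of alpha suppress high-frequency components (and thus speckle) more strongly, at the cost of slower response to frame-to-frame changes.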
As shown in Figs. 1 and 2, a specific embodiment of the present invention is a system for fusing B-mode ultrasound imaging and microwave imaging, comprising a B-mode ultrasound imaging unit 1, a microwave imaging unit 2 and a fusion processing unit 3. The B-mode ultrasound imaging unit 1 generates a B-mode ultrasound image from the received ultrasound echo signal, the microwave imaging unit 2 generates a microwave image from the received microwave echo signal, and the fusion processing unit 3 registers the B-mode ultrasound image generated by unit 1 with the microwave image generated by unit 2, mapping the pixel points of one image onto the other so that corresponding pixel points of the two images coincide in spatial position, thereby completing the fusion of the two images.
The specific implementation is the same as described above for the method embodiment: the two images are registered and then fused either by the marker method or by the pixel weighted-average method.
The fusion processing unit 3 further includes an image marking module 31, which marks feature points of the images in the B-mode ultrasound image and the microwave image; the image markers of the two images are overlaid to fuse the B-mode ultrasound image and the microwave image. The fusion processing unit 3 also includes an image marker identification module 32, which identifies the image markers of the B-mode ultrasound image and the microwave image.
The image registration used by the system (rigid-body transformation with scaling, normalized mutual information evaluation and direction-set optimization) and the image preprocessing are the same as described above for the method embodiment.
In a preferred embodiment of the present invention, the microwave antennas form an antenna array. The array as a whole is hemispherical, with 32 transmit/receive antenna units oriented toward the center of the hemisphere. The array is divided into transmitting and receiving units arranged alternately at equal intervals, with two receiving units corresponding to each transmitting unit and transmission and reception interleaved. The control unit drives the antenna array to transmit microwave radar probing signals continuously toward the measured target, using single-pole multi-throw switching. The microwave control unit supplies a continuous-frequency wave to the microwave switch antenna array, with an operating frequency range of 1 to 30 GHz.
In a preferred embodiment of the present invention, the fusion processing unit 3 further includes an image preprocessing module 34, which preprocesses the B-mode ultrasound image and the microwave image. The preprocessing performed by the image preprocessing module 34 (linear gray-level stretching of the microwave image and low-pass recursive filtering of the B-mode ultrasound image) is the same as described above for the method embodiment.
The beneficial effect of the present invention is that a method and system for fusing B-mode ultrasound imaging and microwave imaging are built: by mapping the pixel points of one of the B-mode ultrasound image and the microwave image onto the other image, corresponding pixel points of the two images coincide in spatial position and the fusion of the two images is completed, so that different imaging modes complement each other when diagnosing patients with early-stage cancer. Microwave radar imaging uses ultra-wideband microwave signals to obtain high range resolution of the target scattering centers and then uses Doppler information to obtain high resolution in the cross-range direction; combining the two yields two- or three-dimensional resolution of the target, so that high multi-dimensional resolution is achieved. Microwave tomography directs low-power microwaves at the measured object; under this excitation the object produces a scattered field that is related to the complex permittivity distribution inside it, and by measuring this scattering and performing the corresponding information processing, the distributions of relative permittivity and conductivity of the object, and hence a microwave image of its internal targets, are obtained. By successfully fusing these two techniques, the present invention forms a multi-mode microwave breast imaging system and achieves complementary imaging quality.
The above content is a further detailed description of the present invention in combination with specific preferred embodiments, and the specific implementation of the invention is not to be regarded as limited to these descriptions. For a person of ordinary skill in the technical field of the invention, several simple deductions or substitutions may be made without departing from the inventive concept, and all of them should be regarded as falling within the protection scope of the present invention.

Claims (9)

1. A method for fusing B-mode ultrasound imaging and microwave imaging, comprising the steps of:
Separate imaging: transmitting a B-mode ultrasound signal toward a region to be measured and receiving the echo signal of the ultrasound signal to perform B-mode ultrasound imaging; transmitting a microwave signal toward the region to be measured and receiving the echo signal of the microwave signal to perform microwave imaging;
Image fusion: mapping the pixel points of one of the B-mode ultrasound image generated by the ultrasound imaging and the microwave image generated by the microwave imaging onto the other image, so that corresponding pixel points of the two images coincide in spatial position, thereby completing the fusion of the two images.
2. The method for fusing B-mode ultrasound imaging and microwave imaging according to claim 1, characterized in that feature points of the images are marked in the B-mode ultrasound image and the microwave image, and the image markers of the two images are overlaid to fuse the B-mode ultrasound image and the microwave image.
3. The method for fusing B-mode ultrasound imaging and microwave imaging according to claim 2, characterized in that, during image fusion, the image markers of the B-mode ultrasound image and the microwave image are identified, and the identified images are then fused using the markers as landmarks.
4. The method for fusing B-mode ultrasound imaging and microwave imaging according to claim 1, characterized by further comprising preprocessing the B-mode ultrasound image and the microwave image to make the images clearer.
5. The method for fusing B-mode ultrasound imaging and microwave imaging according to claim 1, characterized in that, in the image fusion step, image fusion is completed by taking a weighted average of the pixels at corresponding positions of the B-mode ultrasound image and the microwave image.
6. A system for fusing B-mode ultrasound imaging and microwave imaging, characterized by comprising a B-mode ultrasound imaging unit, a microwave imaging unit and a fusion processing unit, wherein the B-mode ultrasound imaging unit generates a B-mode ultrasound image from the received ultrasound echo signal, the microwave imaging unit generates a microwave image from the received microwave echo signal, and the fusion processing unit maps the pixel points of one of the B-mode ultrasound image and the microwave image onto the other image, so that corresponding pixel points of the two images coincide in spatial position, thereby completing the fusion of the two images.
7. The system for fusing B-mode ultrasound imaging and microwave imaging according to claim 6, characterized in that the fusion processing unit further includes an image marking module, which marks feature points of the images in the B-mode ultrasound image and the microwave image; the image markers of the two images are overlaid to fuse the B-mode ultrasound image and the microwave image.
8. The system for fusing B-mode ultrasound imaging and microwave imaging according to claim 6, characterized in that the fusion processing unit further includes an image marker identification module, which identifies the image markers of the B-mode ultrasound image and the microwave image.
9. The system for fusing B-mode ultrasound imaging and microwave imaging according to claim 6, characterized by further comprising an image preprocessing module, which preprocesses the B-mode ultrasound image and the microwave image.
CN201511032447.8A 2015-12-31 2015-12-31 Method and system for fusing B ultrasonic imaging and microwave imaging Pending CN105708492A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201511032447.8A CN105708492A (en) 2015-12-31 2015-12-31 Method and system for fusing B ultrasonic imaging and microwave imaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201511032447.8A CN105708492A (en) 2015-12-31 2015-12-31 Method and system for fusing B ultrasonic imaging and microwave imaging

Publications (1)

Publication Number Publication Date
CN105708492A true CN105708492A (en) 2016-06-29

Family

ID=56147009

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201511032447.8A Pending CN105708492A (en) 2015-12-31 2015-12-31 Method and system for fusing B ultrasonic imaging and microwave imaging

Country Status (1)

Country Link
CN (1) CN105708492A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3315075A1 (en) * 2016-10-27 2018-05-02 Micrima Limited System and method for combined microwave and ultrasound imaging
CN108542422A (en) * 2018-03-06 2018-09-18 武汉轻工大学 B ultrasound image optimization method, device and computer readable storage medium
CN109199381A (en) * 2018-09-11 2019-01-15 合肥工业大学 A kind of holography microwave elastogram system and its imaging method
CN109528306A (en) * 2019-01-08 2019-03-29 华北电力大学(保定) A kind of electromagnetism/resistance bimodal imaging device guiding hip replacement revision
CN111493931A (en) * 2019-08-01 2020-08-07 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging method and device and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1681015A1 (en) * 2005-01-17 2006-07-19 Imasys SA Temperature mapping on structural data
US20100036240A1 (en) * 2008-08-07 2010-02-11 Ismail Aly M Multi-modality system for imaging in dense compressive media and method of use thereof
CN101959456A (en) * 2007-12-31 2011-01-26 真实成像有限公司 System and method for registration of imaging data
CN104574329A (en) * 2013-10-09 2015-04-29 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic fusion imaging method and ultrasonic fusion imaging navigation system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1681015A1 (en) * 2005-01-17 2006-07-19 Imasys SA Temperature mapping on structural data
CN101959456A (en) * 2007-12-31 2011-01-26 真实成像有限公司 System and method for registration of imaging data
US20100036240A1 (en) * 2008-08-07 2010-02-11 Ismail Aly M Multi-modality system for imaging in dense compressive media and method of use thereof
CN104574329A (en) * 2013-10-09 2015-04-29 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic fusion imaging method and ultrasonic fusion imaging navigation system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
朱圣权 (Zhu Shengquan): "Research on medical image registration methods based on mutual information" (基于互信息的医学图像配准方法研究), China Excellent Master's Theses Full-text Database, Information Science and Technology Series *
章新友 (Zhang Xinyou): "Medical Graphics and Image Processing" (医学图形图像处理), China Traditional Chinese Medicine Press, 30 April 2015 *
马东 (Ma Dong) et al.: "Fusion and registration techniques for multimodal medical images" (多模式医学图像的融合和配准技术), Journal of Biomedical Engineering (生物医学工程学杂志) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3315075A1 (en) * 2016-10-27 2018-05-02 Micrima Limited System and method for combined microwave and ultrasound imaging
WO2018078315A1 (en) * 2016-10-27 2018-05-03 Micrima Limited System and method for combined microwave and ultrasound imaging
CN108542422A (en) * 2018-03-06 2018-09-18 武汉轻工大学 B ultrasound image optimization method, device and computer readable storage medium
CN109199381A (en) * 2018-09-11 2019-01-15 合肥工业大学 A kind of holography microwave elastogram system and its imaging method
CN109199381B (en) * 2018-09-11 2021-11-02 合肥工业大学 Holographic microwave elastography system and imaging method thereof
CN109528306A (en) * 2019-01-08 2019-03-29 华北电力大学(保定) A kind of electromagnetism/resistance bimodal imaging device guiding hip replacement revision
CN111493931A (en) * 2019-08-01 2020-08-07 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging method and device and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN105708492A (en) Method and system for fusing B ultrasonic imaging and microwave imaging
CN109199381B (en) Holographic microwave elastography system and imaging method thereof
CA3111578A1 (en) Apparatus and process for medical imaging
Akbari et al. 3D ultrasound image segmentation using wavelet support vector machines
CN103799982A (en) Rapid imaging method for ultra-wide band microwave detection based on Hilbert-huang transformation
US20210251610A1 (en) Automated diagnostics in 3d ultrasound system and method
Roohi et al. Machine learning approaches for automated stroke detection, segmentation, and classification in microwave brain imaging systems
CN105528773A (en) Multi-modal microwave imaging method and system based on labeling method
Zamani et al. Frequency domain method for early stage detection of congestive heart failure
Karam et al. A novel sophisticated form of DMAS beamformer: Application to breast cancer detection
CN205729316U (en) A kind of multi-modal microwave breast imaging device
CN205729400U (en) A kind of device merging B ultrasonic imaging and radar imagery
Girish et al. Breast cancer detection using deep learning
CN105678726A (en) Multi-modal microwave imaging method and system based on labeling method
Ojaroudi et al. A novel machine learning approach of hemorrhage stroke detection in differential microwave head imaging system
CN205729361U (en) A kind of multi-modal Microwave Scanning imaging device
CN105662408A (en) Multi-mode microwave imaging method and system
CN105976347A (en) Mark-method-based B ultrasound imaging and microwave imaging fused method and system
CN206151437U (en) Multimode microwave imaging device
Biçer et al. Deep learning-based classification of breast tumors using raw microwave imaging data
CN105725965A (en) Multi-mode microwave scanning and breast imaging method and system
CN205758511U (en) A kind of Microwave Scanning breast imaging device
Amdaouch et al. Confocal microwave imaging algorithm for breast cancer detection based on a high directive corrugated vivaldi antenna pulses
CN105708414A (en) Microwave scanning breast imaging method and system
CN105534479A (en) Multi-modal microwave breast imaging method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160629

RJ01 Rejection of invention patent application after publication