CN102068281A - Processing method for space-occupying lesion ultrasonic images - Google Patents
Abstract
The invention discloses a processing method for ultrasound images of space-occupying lesions. The method preprocesses the acquired ultrasound image, including removal of the text information around the image, filtering, edge enhancement, and determination of the effective information region; automatically locates the lesion region; determines a rough contour of the space-occupying lesion; and extracts the precise contour of the lesion region by taking the rough contour as the initial contour for an active contour model algorithm. The method automatically segments the space-occupying lesion ultrasound image and automatically extracts the region of interest so that space-occupying lesions can be diagnosed automatically, which improves the objectivity and accuracy of clinical diagnosis and thus effectively assists the diagnosis of space-occupying lesions.
Description
Technical Field
The invention relates to the field of ultrasonic imaging, and in particular to a technique for processing ultrasound images of space-occupying lesions.
Background
Cancer is a general term for a large group of malignant tumors. Early-stage cancer often presents few symptoms; a series of symptoms and signs appears only after a period of development, by which time treatment comes late. Early diagnosis and early treatment of cancer are therefore particularly critical. The main current methods for early cancer diagnosis include alpha-fetoprotein assay, ferritin detection, ultrasound examination, CT examination, magnetic resonance imaging, selective contrast examination, and the like. Among these, ultrasound examination is non-invasive, radiation-free, inexpensive, real-time, repeatable and simple to operate, and has become one of the main means of early cancer diagnosis.
However, in clinical practice, images acquired by ultrasound examination are mainly interpreted empirically by a clinician observing them with the naked eye. The diagnosis is therefore highly subjective, and misdiagnosis and missed diagnosis occur easily; meanwhile, the outline of the lesion region is mostly drawn manually by the doctor, which is time-consuming, labor-intensive, and poorly reproducible. At present, research on processing methods for space-occupying lesion ultrasound images, both domestically and abroad, is still at an early stage.
Therefore, a technology that can objectively, accurately and quickly locate and extract space-occupying lesions in ultrasound images is urgently needed.
Disclosure of Invention
The invention aims to provide a method for processing ultrasound images of space-occupying lesions, in order to overcome the defect that diagnosis results are highly subjective because clinicians currently rely excessively on experience when diagnosing space-occupying lesions with ultrasound.
The technical scheme of the invention is as follows:
a method for processing an ultrasound image of a space-occupying lesion, comprising the steps of:
S1, acquiring a medical ultrasound image;
S2, preprocessing the medical ultrasound image by sequentially removing invalid text information, removing noise, enhancing boundaries, and determining an effective signal area;
S3, locating the lesion region in the preprocessed ultrasound image;
S4, determining a rough contour of the lesion region;
and S5, extracting the precise contour of the lesion region by taking the rough contour as the initial contour of an active contour model algorithm.
In the processing method of the ultrasound image of the space-occupying lesion, in step S2, three methods, namely median filtering, isotropic diffusion filtering and anisotropic diffusion filtering, are used in combination to remove noise and enhance boundaries.
The processing method of the ultrasound image of the space-occupying lesion, wherein the method for determining the effective signal area in step S2 further includes:
S21, performing region growing on the entropy-processed ultrasound image, so that two regions exist in the image: a background region with gray value 0 and a target region with gray value 255;
S22, scanning the image from top to bottom until points with gray value 255 are found, and recording the coordinates of two such points: point P on the left and point M on the right;
S23, continuing to scan the image and recording the coordinates of the points meeting the following conditions: (1) a point to the left of point P whose gray value is 0 while the gray value above or below it is also 0; (2) a point to the right of point M whose gray value is 0 while the gray value above or below it is also 0;
and S24, comparing the areas of the rectangles, formed from all the recorded points, that contain the effective echo signal region, and taking the most appropriate one as the effective signal area.
The processing method of the ultrasound image of the space-occupying lesion, wherein the step S3 further includes:
S31, setting a concentric-circle template in which the inner-circle radius r has an initial value of 10 and the outer-circle radius R has an initial value of 20, the template being elastic, i.e. its radii can change with the size of the lesion;
s32, scanning the determined effective signal area by using the template, and simultaneously calculating the gray average ratio and the variance ratio in the small circle and the outer ring of the template;
s33, after the scanning is finished, the circle center of the template stays at the first pixel point with the minimum or maximum gray scale average value ratio and variance ratio;
S34, searching for lesion-region boundary points in 8 directions with the first pixel point as the center: in each direction, while the gray values of the pixels along the direction are greater than the threshold T = 52, the last such pixel is stored, yielding 8 points in the 8 directions;
s35, calculating the distance between 8 points and the central point and the average distance, and if the distance between a certain point and the central point is more than 2 times of the average distance, removing the point;
and S36, collecting the maximum and minimum horizontal and vertical coordinates of the remaining pixel points, and making a rectangle by using the four values, wherein the rectangle is the approximate region of the focus region, thereby realizing the positioning of the focus region.
The processing method of the ultrasound image of the space-occupying lesion, wherein the step S4 further includes:
s41, in a 5 x 5 first neighborhood taking a first pixel point as a center, searching a second pixel point closest to the average value of the gray levels in the first neighborhood as a seed point;
s42, performing region growth on the second pixel point according to 8 neighborhoods, merging the points meeting the first growth condition into the first neighborhood, recalculating the gray average value, and taking the point closest to the new gray average value as a new growth point;
and S43, marking the grown region, and connecting the outermost marking points to form the rough outline of the lesion region after the region growing algorithm is finished.
The processing method of the ultrasound image of the space-occupying lesion, wherein the step S5 further includes:
s51, discrete sampling is carried out on the rough contour of the focus area to obtain a first discrete point;
s52, according to the T-Snake model, calculating the internal energy and the external energy of the first discrete point to obtain a second discrete point;
s53, performing mesh division on the image of the focus area;
S54, calculating the intersection points of the contour with the grid to obtain new snake points, and executing again from S52 until the computed total energy no longer changes;
and S55, connecting the finally determined snake points to form a new contour, namely the precise contour of the lesion area.
In step S53, an ACID image decomposition technique is used to perform mesh division on the image of the lesion region.
The invention has the beneficial effects that:
compared with traditional processing methods, the processing method for space-occupying lesion ultrasound images automatically segments the ultrasound image and automatically extracts the region of interest, so that space-occupying lesions can be diagnosed automatically; this improves the objectivity and accuracy of clinical diagnosis and provides effective assistance in diagnosing space-occupying lesions.
Drawings
FIG. 1 is a flow chart of the method for processing an ultrasound image of a space-occupying lesion according to the present invention.
FIG. 2 is a schematic diagram of the method for determining the effective signal area in the processing method according to an embodiment of the present invention.
FIG. 3 is a flowchart of locating the lesion region in the processing method according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of the method for determining the rough contour of the lesion region according to an embodiment of the present invention.
FIG. 5 is a schematic diagram of the method for determining the precise contour of the lesion region according to an embodiment of the present invention.
FIG. 6 is an ACID image decomposition diagram according to an embodiment of the present invention.
Detailed Description
The invention provides a method for processing ultrasound images of space-occupying lesions, in order to overcome the defect that diagnosis results are highly subjective because clinicians currently rely excessively on experience when diagnosing space-occupying lesions with ultrasound.
Referring to FIG. 1, FIG. 1 is a flowchart of the processing method for space-occupying lesion ultrasound images according to the present invention. As shown in the figure, the technical scheme of the invention preprocesses the acquired ultrasound image, including removal of the text information around the image, filtering, boundary enhancement and determination of the effective information area; it then automatically locates the lesion region, determines a rough contour of the space-occupying lesion, and extracts the precise contour of the lesion region by taking the rough contour as the initial contour of an active contour model algorithm. The process comprises the following steps:
S1, acquiring a medical ultrasound image;
S2, preprocessing the medical ultrasound image by sequentially removing invalid text information, removing noise, enhancing boundaries, and determining an effective signal area;
S3, locating the lesion region in the preprocessed ultrasound image;
S4, determining a rough contour of the lesion region;
and S5, extracting the precise contour of the lesion region by taking the rough contour as the initial contour of an active contour model algorithm.
In step S2, the entropy of local regions of the image is calculated and combined with a region growing algorithm to remove invalid text information. Around an ultrasound image there is much information about the instrument model, parameter settings, scanning position and mode, and the patient, while the truly useful echo information lies in the middle of the image; if this surrounding information is not removed, it interferes with subsequent processing. Removing this text and extracting the valid ultrasound echo region is therefore a necessary preprocessing task. Visual inspection of ultrasound images shows that the text is brighter, i.e. of higher gray level, and its gray level fluctuates within a smaller range. Exploiting this characteristic, the embodiment of the invention calculates the entropy of local regions of the image and then applies a region growing algorithm to distinguish and remove the text information in the image.
Entropy is a statistical digital feature of an image; it reflects the amount of information the image contains, i.e. the degree of disorder of the image. The entropy is computed as $E = -\sum_{i=0}^{N} p_i \log_2 p_i$, where, for an ultrasound image (typically an 8-bit grayscale image), N = 255 and $p_i$ is the ratio of the number of pixels at gray level i to the total number of pixels in the image. A large E indicates that the image contains a large amount of information and its gray-level distribution is widely dispersed; a small E indicates that it contains little information and the gray-level distribution is narrow. In an ultrasound image, the E value of text regions is small. Exploiting this property of entropy, the image is first split into small 20 × 20 regions, the entropy of each sub-region is calculated separately by the formula above, and the entropy replaces the pixel values of that sub-region. After the pixel values have been replaced by entropy values, all entropy values are normalized to [0, 255] and a threshold M is set empirically; M may range from 80 to 120 and is taken as 100 here. A sub-region with a value below 100 is assigned 0, marking a text-information region, and a region above 100 is assigned 255, marking an ultrasound echo region.
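As an illustrative sketch (not the patent's own code), the entropy binarization described above can be written for an 8-bit grayscale NumPy array; the 20 × 20 block size and the threshold M = 100 come from the text, while the normalization details are a straightforward reading of it:

```python
import numpy as np

def entropy_map(img, block=20, m_thresh=100):
    """Binarize an 8-bit ultrasound image by local entropy.

    Each `block` x `block` sub-region is replaced by its gray-level
    entropy; entropies are then normalized to [0, 255] and thresholded
    at `m_thresh` (the text's empirical M, in the range 80-120).
    """
    h, w = img.shape
    ent = np.zeros_like(img, dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            sub = img[y:y + block, x:x + block]
            counts = np.bincount(sub.ravel(), minlength=256)
            p = counts / sub.size
            p = p[p > 0]                      # skip empty bins (0*log 0 = 0)
            ent[y:y + block, x:x + block] = -np.sum(p * np.log2(p))
    norm = (ent - ent.min()) / (ent.max() - ent.min() + 1e-12) * 255
    return np.where(norm < m_thresh, 0, 255).astype(np.uint8)
```

Low-entropy blocks (flat, text-like) map to 0 and high-entropy blocks (speckled echo regions) to 255, matching the two-region image the later steps assume.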
After the entropy processing, part of the text information may still remain, but it is now separated by a gap from the effective information area in the middle, so the ultrasound image is further processed with a region growing algorithm. Because the middle of the image is the effective information area and its pixel gray value is 255 after entropy processing, a seed point is first selected, namely the central point of the image; growth then proceeds from the seed point by 8-neighborhood connectivity, the growth condition being that a neighboring point has gray value 255; this process is repeated until no new points satisfying the condition are added to the set.
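The seed-based region growing described above might be sketched as follows; the 8-neighborhood connectivity and the gray-value-255 growth condition come from the text, while the breadth-first traversal is an implementation choice:

```python
from collections import deque
import numpy as np

def grow_from_center(binary):
    """8-connected region growing from the image center.

    `binary` is the entropy-thresholded map (values 0 or 255); the grown
    set approximates the central effective-echo region of the text.
    Returns a boolean mask of the grown region.
    """
    h, w = binary.shape
    seed = (h // 2, w // 2)
    mask = np.zeros((h, w), dtype=bool)
    if binary[seed] != 255:
        return mask                      # center not on target: nothing grows
    mask[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy in (-1, 0, 1):            # 8-neighborhood of (y, x)
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                        and binary[ny, nx] == 255:
                    mask[ny, nx] = True
                    q.append((ny, nx))
    return mask
```

Text fragments that are bright but disconnected from the central region are left out of the mask, which is exactly why the gap produced by entropy thresholding matters.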
Computing the local entropy and then removing the invalid text with a region growing algorithm yields the effective ultrasound echo region and also saves template-traversal time in the later steps. However, ultrasound images are complex, low in resolution and very noisy, so the image must additionally undergo noise removal and boundary enhancement. The embodiment of the invention combines three methods for this: median filtering, isotropic diffusion filtering and anisotropic diffusion filtering. Where the gradient of the ultrasound image is very large and random noise is likely, median filtering is applied; in uniform areas with small gradients, isotropic diffusion filtering smooths the image; and at the boundary of the region of interest, where the image has a definite gradient, anisotropic diffusion filtering smooths the image and removes noise while protecting the image boundary.
Specifically, the filtering algorithm is $\frac{\partial \mu}{\partial t} = \operatorname{div}(g \nabla \mu) - \lambda (1 - g)(\mu - I)$, where $\mu$ is the image at the current iteration and $I$ is the original image; $\nabla$ abbreviates $\nabla(x, y, t)$ and denotes the gradient operator, div is the divergence operator, and $\Delta\mu$ denotes the isotropic Laplacian, to which the diffusion term reduces when $g$ is constant. $\operatorname{div}(g \nabla \mu)$ is the degraded diffusion term that performs the diffusion filtering, and $\lambda(1 - g)(\mu - I)$ is a balance control term that keeps the initial image from being diffused excessively. $g$ is a scale function used to detect boundaries and control the diffusion speed: $g = 1 / (1 + |\nabla G_\sigma * \mu|^2 / k^2)$, where $G_\sigma$ is a Gaussian function serving as the convolution kernel that smooths the image: $G_\sigma(x, y) = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{x^2 + y^2}{2\sigma^2}\right)$. The system continues to use the following parameter settings in the algorithm: (1) the parameters in the g function, wherein the Gaussian convolution kernel uses σ = 2 and the iteration step is Δt = 0.48;
(2) λ adjusts the balance control term and takes the value 1;
(3) Y denotes the gradient value at a pixel point in the image, with its empirical values taken as Ymax = 120 and Ymin = 1;
(4) the iteration termination condition is $(\mathrm{ISC})_{k+1} = 100 \times \frac{|\mathrm{RSNR}_{k+1} - \mathrm{RSNR}_k|}{\mathrm{RSNR}_k} \le \Omega$; when the ISC satisfies this condition, the iteration stops adaptively, where Ω = 1.
The steps achieve the effects of removing noise and enhancing the boundary of the ultrasonic image. However, in order to shorten the processing time and realize real-time processing of the ultrasound image, it is also necessary to determine a rectangle in the ultrasound image, which contains as many echo signals as possible and as few invalid signals as possible, and it is most important that the lesion region should be included in the rectangle, which is referred to as an effective signal region herein.
Referring to fig. 2, in the embodiment of the present invention, the effective signal area determination process is as follows:
S21, performing region growing on the entropy-processed ultrasound image, so that two regions exist in the image: a background region with gray value 0 and a target region with gray value 255;
S22, scanning the image from top to bottom until points with gray value 255 are found, and recording the coordinates of two such points: point P on the left and point M on the right;
S23, continuing to scan the image and recording the coordinates of the points meeting the following conditions: (1) a point to the left of point P whose gray value is 0 while the gray value above or below it is also 0; (2) a point to the right of point M whose gray value is 0 while the gray value above or below it is also 0;
and S24, comparing the areas of the rectangles, formed from all the recorded points, that contain the effective echo signal region, and taking the most appropriate one as the effective signal area, i.e. the rectangle containing as many echo signals and as few invalid signals as possible.
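A simplified sketch of steps S21-S24 follows; it keeps only the tight bounding rectangle as the "most appropriate" candidate, whereas the text compares several candidate rectangles:

```python
import numpy as np

def effective_region(binary):
    """Simplified version of steps S21-S24: scan the binarized image
    top-to-bottom for the first row containing target pixels (gray 255),
    record its leftmost point P and rightmost point M, then take the
    tightest rectangle covering all target pixels. Returns
    (top, bottom, left, right), inclusive, or None if no target exists.
    """
    ys, xs = np.nonzero(binary == 255)
    if ys.size == 0:
        return None
    top = ys.min()
    row = np.nonzero(binary[top] == 255)[0]
    p_col, m_col = row.min(), row.max()   # P (left) and M (right) on first row
    # rectangle spanning all target pixels, anchored at the first row
    return top, ys.max(), min(p_col, xs.min()), max(m_col, xs.max())
```

Restricting subsequent template scanning to this rectangle is what saves processing time in the later lesion-locating step.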
For the preprocessed ultrasound image, the lesion region needs to be further located. In the method for processing the ultrasonic image, a concentric circle template is adopted to traverse the image, and a focus area is positioned by the average gray ratio and the variance ratio of the inner circle and the outer circle.
Please continue to refer to fig. 3, which includes the following steps:
S31, a concentric-circle template is set up, where the initial value of the inner-circle radius r may range from 8 to 12 (r = 10 is taken here) and the initial value of the outer-circle radius R may range from 18 to 22 (R = 20 is taken here); the template is elastic, i.e. its radii can change with the size of the lesion;
s32, scanning the determined effective signal area by using the template, and simultaneously calculating the gray average ratio and the variance ratio in the small circle and the outer ring of the template;
s33, after the scanning is finished, the circle center of the template stays at the first pixel point with the minimum or maximum gray scale average value ratio and variance ratio;
S34, searching for lesion-region boundary points in 8 directions with the first pixel point as the center: in each direction, while the gray values of the pixels along the direction are greater than the threshold T = 52, the last such pixel is stored, yielding 8 points in the 8 directions;
s35, calculating the distance between 8 points and the central point and the average distance, and if the distance between a certain point and the central point is more than 2 times of the average distance, removing the point;
and S36, collecting the maximum and minimum horizontal and vertical coordinates of the remaining pixel points, and making a rectangle by using the four values, wherein the rectangle is the approximate region of the focus region, thereby realizing the positioning of the focus region.
In the scanning process of the concentric circle template, the average value ratio and the variance ratio based on the pixel gray values in the inner circle and the outer circle are calculated: if the echo is of a weak echo type, staying at the pixel point with the minimum ratio; if it is a strong echo type, it stays at the pixel point where the ratio is the largest.
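Steps S31-S33 can be sketched as follows; the radii r = 10 and R = 20 and the min/max decision for weak/strong echo follow the text, while the variance-ratio check is omitted for brevity:

```python
import numpy as np

def scan_concentric(img, r=10, R=20, weak_echo=True):
    """Slide a concentric-circle template (inner radius r, outer radius R)
    over the image and return the position where the ratio of the
    inner-circle mean gray to the outer-ring mean gray is smallest
    (weak-echo lesions) or largest (strong-echo lesions)."""
    h, w = img.shape
    yy, xx = np.mgrid[-R:R + 1, -R:R + 1]
    d2 = yy**2 + xx**2
    inner = d2 <= r * r                      # inner circle mask
    ring = (d2 > r * r) & (d2 <= R * R)      # outer ring mask
    best = np.inf if weak_echo else -np.inf
    best_pos = None
    for cy in range(R, h - R):
        for cx in range(R, w - R):
            patch = img[cy - R:cy + R + 1, cx - R:cx + R + 1].astype(float)
            ratio = patch[inner].mean() / (patch[ring].mean() + 1e-12)
            better = ratio < best if weak_echo else ratio > best
            if better:
                best, best_pos = ratio, (cy, cx)
    return best_pos
```

A dark (weak-echo) lesion surrounded by brighter tissue minimizes the ratio exactly when the inner circle sits on the lesion, which is why the template center "stays" there after the scan.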
It should be noted that for iso-echo or mixed-echo types, whose gray-level features are not obvious, the template is set to a rectangle, the effective information area is scanned, and texture feature values within the rectangle are calculated. The texture features of an ultrasound image mainly include the gray-level co-occurrence matrix, the statistical feature matrix, the Fourier power spectrum, gray-level differences, Laws texture energy, and the like. The gray-level co-occurrence matrix describes ultrasound images well, and the processing method of the invention uses its entropy and angular second moment (energy) to locate lesion regions of these echo types.
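For the iso- and mixed-echo case, the co-occurrence entropy and angular second moment mentioned above might be computed as in this sketch; the 16-level quantization and the (1, 0) displacement are assumptions, as the text does not state them:

```python
import numpy as np

def glcm_features(img, levels=16, dx=1, dy=0):
    """Gray-level co-occurrence matrix with entropy and angular second
    moment (energy), the two features used to locate iso- and mixed-echo
    lesions. `levels` quantizes 8-bit gray into 16 bins (an assumed
    quantization) and (dx, dy) is the pixel-pair displacement."""
    q = img.astype(int) * levels // 256
    h, w = q.shape
    glcm = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1   # count co-occurring pair
    p = glcm / glcm.sum()
    nz = p[p > 0]
    entropy = -np.sum(nz * np.log2(nz))
    asm = np.sum(p**2)                    # angular second moment (energy)
    return entropy, asm
```

A homogeneous patch concentrates the matrix in few cells (low entropy, high ASM); a disordered patch spreads it out, which separates lesion texture from surrounding tissue even when mean gray levels match.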
Having defined the lesion area, a rough contour of the lesion area is determined further below. In the embodiment of the invention, the region growing algorithm is adopted to determine the rough contour of the lesion region, the algorithm is easy to implement, and the rough contour of the lesion region can be relatively objectively described.
Referring to fig. 4, the method flow is as follows:
s41, in a 5 x 5 first neighborhood taking a first pixel point as a center, searching a second pixel point closest to the average value of the gray levels in the first neighborhood as a seed point;
s42, performing region growth on the second pixel point according to 8 neighborhoods, merging the points meeting the first growth condition into the first neighborhood, recalculating the gray average value, and taking the point closest to the new gray average value as a new growth point;
and S43, marking the grown region, and connecting the outermost marking points to form the rough outline of the lesion region after the region growing algorithm is finished.
After obtaining the rough contour of the lesion area, taking the rough contour as the initial contour of an active contour model algorithm to extract the precise contour of the lesion area.
In the embodiment of the invention, a T-Snake model is adopted. T. McInerney et al. proposed the T-Snake model in combination with the ACID (Affine Cell Image Decomposition) technique; it retains many advantages of the traditional model while making the topology of the target curve variable. Like the traditional Snake model, it is still defined as a closed contour formed by a series of nodes: internal forces maintain the smoothness and continuity of the curve, elastic forces and other constraints allow interaction, and an inflation force pushes the model to evolve toward the image boundary until it balances the external image force, the evolution still being described by a Lagrangian equation of motion. The improvement is that the nodes of the T-Snake are not fixed during the evolution: the model divides the image into a grid, recalculates the intersection points of the model with the grid while the internal and external forces act, and takes the new point set as the nodes for the next round of evolution, thereby realizing the topological adaptability of the curve.
The evolution of the T-Snake can be regarded as a forward propagation process in two stages: the first stage is similar to the traditional active contour model; in the second stage the model computes its intersection points with the mesh to form a new point set, effecting a change of topology. The biggest difference from the traditional contour model is therefore that the T-Snake can change the topological structure of the target curve according to the boundary information of the image, which makes it possible to detect the boundaries of images with complex structure and to segment multi-target images.
The T-Snake model is defined as a closed contour formed by connecting a series of N nodes, where the equation of motion of the i-th node (i = 0, 1, 2, …, N − 1) is $\gamma_i \dot{\vec{x}}_i(t) + a \vec{\alpha}_i(t) + b \vec{\beta}_i(t) = \vec{\rho}_i(t) + \vec{f}_i(t)$, where $\vec{x}_i(t)$ is the position function of the i-th node, $\dot{\vec{x}}_i(t)$ is its velocity, and $\gamma_i$ is the damping coefficient. The last two terms on the left of the equation are the internal forces of the model, and the two terms on the right are the external forces. Assuming the motion of the nodes has no inertia, when the model reaches equilibrium between the internal and external forces the node velocity becomes zero, and the nodes at that moment are the boundary points of the target.
The position of the ith node after a time Δt follows from the equation above:

$$\vec{x}_i(t+\Delta t) = \vec{x}_i(t) - \frac{\Delta t}{\gamma_i}\left(a\,\vec{\alpha}_i(t) + b\,\vec{\beta}_i(t) - \vec{\rho}_i(t) - \vec{f}_i(t)\right)$$

In the internal force structure of the T-Snake model, $\vec{\alpha}_i(t)$ is the tensile force at the ith node, expressed as:

$$\vec{\alpha}_i(t) = 2\vec{x}_i(t) - \vec{x}_{i-1}(t) - \vec{x}_{i+1}(t)$$

a discrete approximation of the second derivative of the node position function, whose strength is controlled by the parameter a. $\vec{\beta}_i(t)$ is the bending force at the ith node, with the expression:
$$\vec{\beta}_i(t) = 2\vec{\alpha}_i(t) - \vec{\alpha}_{i-1}(t) - \vec{\alpha}_{i+1}(t)$$

a discrete approximation of the fourth derivative of the node position function, whose strength is controlled by the parameter b.
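The discrete forces and the explicit Euler update above can be sketched as follows; this is a minimal illustration (the parameter values a, b, γ and Δt are arbitrary choices, not the patent's):

```python
import numpy as np

def tensile_force(x):
    # alpha_i = 2*x_i - x_{i-1} - x_{i+1}: discrete second derivative of the
    # node positions on a closed contour (indices wrap around).
    return 2.0 * x - np.roll(x, 1, axis=0) - np.roll(x, -1, axis=0)

def bending_force(x):
    # beta_i = 2*alpha_i - alpha_{i-1} - alpha_{i+1}: the same stencil applied
    # to alpha, i.e. a discrete fourth derivative of the positions.
    return tensile_force(tensile_force(x))

def euler_step(x, external, a=0.2, b=0.1, gamma=1.0, dt=1.0):
    # x_i(t+dt) = x_i(t) - (dt/gamma) * (a*alpha_i + b*beta_i - rho_i - f_i),
    # where `external` bundles the inflation and gradient forces rho_i + f_i.
    return x - (dt / gamma) * (a * tensile_force(x) + b * bending_force(x) - external)
```

With zero external force the internal forces alone shrink and smooth the contour, which is why the inflation force ρ is needed to push it outward.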
In the external force structure of the T-Snake, $\vec{\rho}_i(t)$ is the inflation force at the ith node, expressed as:

$$\vec{\rho}_i(t) = q \cdot F\!\left(I(\vec{x}_i(t))\right) \cdot \vec{n}_i(t)$$

where $\vec{n}_i(t)$ is the unit normal vector of the model at the ith node and q is the magnitude of the inflation force, controlling its strength. $F(I(\vec{x}_i(t)))$ is written F(I(x, y)), a binary function:

$$F(I(x,y)) = \begin{cases} +1, & I(x,y) \ge T \\ -1, & I(x,y) < T \end{cases}$$

where I(x, y) is the pixel value of the image at (x, y) and T is a threshold related to the image pixel values; F shrinks the T-Snake where I(x, y) < T, preventing the contour from moving into the background. Taking the statistical properties of the pixel values into account, F is extended to:

$$F(I(x,y)) = \begin{cases} +1, & |I(x,y) - \mu| \le k\sigma \\ -1, & \text{otherwise} \end{cases}$$

where μ is the mean of the image pixel values, σ is their standard deviation, and k is a user-defined parameter controlling the expansion range of the contour.
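A sketch of the statistical form of the switch F described above (the exact thresholding form is my reading of the text; μ, σ and k would come from the lesion-region statistics):

```python
import numpy as np

def inflation_switch(intensity, mu, sigma, k=2.0):
    # F = +1 (inflate along the outward normal) where the pixel value lies
    # within k standard deviations of the region mean mu, and -1 (deflate)
    # elsewhere, so the contour stops inflating once it leaves the lesion.
    intensity = np.asarray(intensity, dtype=float)
    return np.where(np.abs(intensity - mu) <= k * sigma, 1.0, -1.0)
```

The inflation force at node i is then ρ_i = q · F(I(x_i)) · n_i.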
As can be seen from the structure of the inflation force, $\vec{\rho}_i(t)$ controls the direction and strength of the contour motion, so the T-Snake converges quickly to the vicinity of image edges, but it cannot guarantee convergence onto the effective edges. A second external force based on the image gradient is therefore introduced:

$$\vec{f}_i(t) = p \cdot \nabla P(\vec{x}_i(t))$$

where p is the amplitude of this external force and the potential P satisfies:

$$P(x, y) = -c\,\left\|\nabla\left[G_\sigma * I(x, y)\right]\right\|$$

i.e. the image I is first smoothed with a Gaussian of standard deviation σ to remove noise, and c is the amplitude of the gradient potential; $\nabla P$ follows the change of the gradient. Under the inflation force, a node expands when F is positive and contracts when it is negative. In the T-Snake, p is usually set slightly larger than q, so that when the contour reaches the vicinity of an edge the gradient force $\vec{f}_i$ dominates the inflation force $\vec{\rho}_i$ and the model keeps evolving toward increasing gradient until the gradient at the node is maximal, at which point the T-Snake has converged onto the true edge of the image. In other words, $\vec{\rho}_i$ makes the model converge rapidly to the vicinity of the true edge, and $\vec{f}_i$ guarantees that the model converges onto the true edge.
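The gradient potential P(x, y) = −c‖∇(G_σ ∗ I)‖ can be sketched with SciPy's Gaussian filter (SciPy is an implementation choice here, not something the patent specifies):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gradient_potential(image, sigma=1.0, c=1.0):
    # Smooth with a Gaussian of standard deviation sigma, then take the
    # negative gradient magnitude: P is most negative on strong edges, so
    # the force p * grad(P) drives nodes toward gradient maxima (true edges).
    smoothed = gaussian_filter(np.asarray(image, dtype=float), sigma)
    gy, gx = np.gradient(smoothed)
    return -c * np.hypot(gx, gy)
```

On a step image the minimum of P sits on the step, which is where the gradient force pulls the snake nodes.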
Please refer to fig. 5, which is a schematic diagram of the method for determining the precise contour of the lesion region in the processing method for space-occupying lesion ultrasound images according to an embodiment of the present invention. The method for determining the precise contour comprises the following steps:
s51, discrete sampling is carried out on the rough contour of the focus area to obtain a first discrete point;
s52, according to the T-Snake model, calculating the internal energy and the external energy of the first discrete point to obtain a second discrete point;
s53, performing mesh division on the image of the focus area;
S54, computing the intersection points of the second discrete points with the grid to obtain new snake points, and executing again from S52 until the computed total energy no longer changes;
and S55, connecting the finally determined snake points to form a new contour, namely the precise contour of the lesion area.
Specifically, in S51, the obtained rough contour is discretely sampled to obtain the first discrete points, which serve as snake points. In the embodiment of the invention, 15 rays are emitted from the centre of the concentric-circle template; the contour points lying in the ray directions are the first discrete points, and they are stored in the counter-clockwise direction.
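The ray sampling of S51 can be sketched as below; the mask-based intersection test is my own simplification of "points lying in the ray direction":

```python
import numpy as np

def sample_rays(center, contour_mask, n_rays=15, max_r=200):
    # Cast n_rays equally spaced rays from the template centre (stored
    # counter-clockwise) and keep the last pixel along each ray that lies
    # on the rough-contour mask; these points seed the T-Snake.
    cx, cy = center
    points = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        hit = None
        for r in range(1, max_r):
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            if not (0 <= y < contour_mask.shape[0] and 0 <= x < contour_mask.shape[1]):
                break  # left the image: stop walking this ray
            if contour_mask[y, x]:
                hit = (x, y)
        if hit is not None:
            points.append(hit)
    return points
```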
In S52, the internal and external forces at the first discrete points are brought to equilibrium and the energy function takes its minimum value; this involves three aspects: the internal energy setting, the external energy setting and the minimum energy setting.
(1) Internal energy setting. The internal energy of the T-Snake model is the same as that of the traditional active contour model and serves to keep the contour smooth and continuous. The internal force structure is:

$$F_{int} = a\,\vec{\alpha}_i(t) + b\,\vec{\beta}_i(t)$$

Converted into energy, the internal energy can be expressed as a linear combination of a continuity term and a curvature term. The continuity energy is typically the integral of the first derivative of the curve, and the curvature energy the integral of the second derivative of the contour curve:

$$E_{int}(r(s)) = \int_0^1 \alpha\,|r'(s)|^2\,ds + \int_0^1 \beta\,|r''(s)|^2\,ds$$

In the calculation, finite differences between the snake points generally replace the derivatives, so the closer the snake points are to each other, the smaller the continuity energy. If only the continuity energy acted on the target curve, the contour would shrink to a point and the true boundary of the target could not be obtained.
To solve this problem, a control on the average contour length is added to the continuity term of the active contour model. The curve length is:

$$\mathrm{length} = \int_0^1 \left(x'(s)^2 + y'(s)^2\right)^{1/2} ds$$

If the number of nodes is M, the average segment length of the target curve is c = length / M.
The expression for the internal energy thus becomes:

$$E_{int} = \alpha\int_0^1 \left|\,|r'(s)|^2 - c\,\right|^2 ds + \beta\int_0^1 |r''(s)|^2\,ds$$

With the average-length control term added, when the continuity energy tends to its minimum during energy minimization, the distance between snake points tends to the average length c; the target curve therefore cannot shrink to a point, and the precise contour of the lesion region can be extracted more accurately.
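A discrete version of the internal energy with the average-length term can be sketched as below; I use the segment length itself (rather than its square) in the continuity term, a simplification so that evenly spaced nodes make the term vanish:

```python
import numpy as np

def internal_energy(points, alpha=1.0, beta=1.0):
    # Continuity term: (|x_{i+1} - x_i| - c)^2 with c = perimeter / M, so the
    # minimum is reached when nodes are evenly spaced, not when the contour
    # collapses to a point.  Curvature term: squared second differences.
    diff = np.roll(points, -1, axis=0) - points
    lengths = np.linalg.norm(diff, axis=1)
    c = lengths.sum() / len(points)           # average segment length c = length/M
    continuity = alpha * np.sum((lengths - c) ** 2)
    second = np.roll(points, -1, axis=0) - 2.0 * points + np.roll(points, 1, axis=0)
    curvature = beta * np.sum(np.sum(second ** 2, axis=1))
    return continuity + curvature
```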
(2) External energy setting
The internal energy only keeps the curve continuous and smooth; it is the external energy that pushes the curve toward the real boundary of the target.
The invention still adopts the image force of the traditional parametric active contour model:

$$E_{image} = -c\,\left|\nabla\left(G_\sigma(x, y) * I(x, y)\right)\right|^2$$

where $G_\sigma$ is a Gaussian filter, $\nabla$ is the gradient operator, * is the convolution operation and c is the weight of the gradient magnitude. An image force based purely on the image gradient, even after Gaussian smoothing, is still sensitive to noise; therefore, to make the target curve evolve toward the real boundary of the region of interest, the inflation force based on the image gray level and a region energy based on the image regions are added.
To make the target curve evolve to the real boundary of the region of interest, a region-based energy is introduced into the energy function of the active contour model. The region energy assumes that the image consists of two main regions, the object of interest and the background, whose pixels follow different probability distributions. The two-region assumption generalizes easily to several regions with different distributions, so the method is also suited to multi-target segmentation. In practice the object and the background are usually assumed to be Gaussian with different means and variances, and the region energy is designed from the likelihood function as:

$$E_{region} = -\sum_{s\in R}\log P_R(I(s)) - \sum_{s\in R'}\log P_{R'}(I(s))$$

where R is the target region and R' is the background region.
For convenience of calculation, the formula is rewritten as:

$$E_{region} = -\sum_{s\in R}\log P_R(I(s)) - \left(C - \sum_{s\in R}\log P_{R'}(I(s))\right)$$

where C is the sum of $\log P_{R'}(I(s))$ over the whole image: $C = \sum_s \log P_{R'}(I(s))$.
Since C is independent of the contour position, it can be dropped from the energy function, and the region energy simplifies to:

$$E_{region} = -\sum_{s\in R}\log\frac{P_R(I(s))}{P_{R'}(I(s))}$$

The probability density functions of the target and background regions are:

$$P_R(I(s)) = \frac{1}{\sigma_R\sqrt{2\pi}}\,e^{-\frac{(I(s)-\mu_R)^2}{2\sigma_R^2}}, \qquad P_{R'}(I(s)) = \frac{1}{\sigma_{R'}\sqrt{2\pi}}\,e^{-\frac{(I(s)-\mu_{R'})^2}{2\sigma_{R'}^2}}$$

The final expression for the region energy is:
$$E_{region} = \sum_{s\in R}\left[\frac{1}{2}\left(\frac{1}{\sigma_R^2} - \frac{1}{\sigma_{R'}^2}\right) I^2(s) - \left(\frac{\mu_R}{\sigma_R^2} - \frac{\mu_{R'}}{\sigma_{R'}^2}\right) I(s) + \frac{\mu_R^2}{2\sigma_R^2} - \frac{\mu_{R'}^2}{2\sigma_{R'}^2} + \log\frac{\sigma_R}{\sigma_{R'}}\right]$$

(3) Minimum energy setting. The evolution of the curve in the active contour model becomes an energy-function minimization problem, and the variational problem is converted into finding the optimal solution.
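Returning briefly to the region energy: its per-pixel contribution is simply the negative log-likelihood ratio of the two Gaussians, which is easier to compute directly than the expanded form. A minimal sketch:

```python
import math

def region_energy_term(i, mu_r, sigma_r, mu_b, sigma_b):
    # -log( P_R(i) / P_R'(i) ) for target region R and background R':
    # negative when the pixel value i is better explained by the target
    # Gaussian, positive when the background explains it better.
    return (math.log(sigma_r / sigma_b)
            + (i - mu_r) ** 2 / (2.0 * sigma_r ** 2)
            - (i - mu_b) ** 2 / (2.0 * sigma_b ** 2))
```

Summing this term over the pixels inside the contour gives the region energy; moving the contour to enclose target-like pixels lowers it.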
The energy function model finally designed is:

$$E_{snake} = \min(E_{total}) = \min\sum_{i=1}^{N}\left[E_{int,i} + E_{image,i} + E_{inflation,i} + E_{region,i}\right]$$

After weighing factors such as computational complexity and the quality of the solution, a greedy algorithm is adopted to optimize the snake points and minimize the energy function. The greedy algorithm is a locally optimal algorithm: it considers only the energy change of the current snake point within its own neighborhood, ignoring the influence of the other snake points, and in each iteration adjusts the position of every snake point in turn according to the minimum-energy principle until no snake point moves any more.
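A minimal sketch of this greedy optimization (the 3×3 search neighborhood and the stopping rule are standard greedy-snake choices; `energy` is a caller-supplied total-energy function, not the patent's):

```python
import numpy as np

def greedy_minimize(points, energy, max_iter=200):
    # Each pass moves every snake point to the lowest-energy position in its
    # 3x3 neighbourhood (staying put is allowed); iteration stops when a full
    # pass moves no point, i.e. the total energy no longer changes.
    pts = np.asarray(points, dtype=float).copy()
    offsets = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    for _ in range(max_iter):
        moved = False
        for i in range(len(pts)):
            best = min(offsets, key=lambda o: energy(i, pts[i] + np.array(o), pts))
            if best != (0, 0):
                pts[i] = pts[i] + np.array(best)
                moved = True
        if not moved:
            break
    return pts
```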
Further, in step S53, an ACID image decomposition technique is used to mesh the image of the lesion region.
The ACID image decomposition technique decomposes the two-dimensional spatial image into a series of triangular meshes, as shown in fig. 6. The triangular meshes avoid ambiguity of the T-Snake model in the evolution process.
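A toy version of the grid decomposition: each square cell is split into two triangles (the real ACID scheme may use a finer simplicial decomposition, so this is only illustrative); snake/grid intersection points are then sought on the triangle edges:

```python
def acid_triangles(width, height, cell=1):
    # Split every cell of a width x height grid into two triangles; the
    # resulting simplicial mesh makes contour/grid intersections unambiguous.
    tris = []
    for y in range(0, height, cell):
        for x in range(0, width, cell):
            a, b, c, d = (x, y), (x + cell, y), (x + cell, y + cell), (x, y + cell)
            tris.append((a, b, c))
            tris.append((a, c, d))
    return tris
```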
The processing method for space-occupying lesion ultrasound images provided by the invention automatically segments the ultrasound image and automatically extracts the region of interest, so that space-occupying lesions can be diagnosed automatically; this improves the objectivity and accuracy of clinical diagnosis and provides a good aid to the diagnosis of space-occupying lesions.
It should be understood that the above description of the preferred embodiments is given for clarity and not for any purpose of limitation, and that various alternatives, modifications and combinations can be devised by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (6)
1. A method for processing space-occupying lesion ultrasound images, characterized by comprising the following steps:
s1, acquiring medical ultrasonic images;
s2, preprocessing the medical ultrasonic image, wherein the preprocessing comprises removing invalid character information, removing noise, enhancing a boundary and determining an effective signal area;
s3, further positioning a focus area for the preprocessed ultrasonic image;
s4, determining a rough contour of the focus area;
and S5, taking the rough contour as an initial contour of an active contour model algorithm to extract a precise contour of the lesion area.
2. The method for processing space-occupying lesion ultrasound images according to claim 1, wherein in step S2 the noise removal and the boundary enhancement are achieved by combining three methods: median filtering, isotropic diffusion filtering and anisotropic diffusion filtering.
3. The method for processing space-occupying lesion ultrasound images according to claim 1, wherein the method for determining the valid signal region in step S2 further comprises:
S21, performing region growing on the entropy-processed ultrasound image, after which two regions exist in the image: a background region with gray value 0 and a target region with gray value 255;
S22, scanning the image from top to bottom until points with gray value 255 are found, and recording the coordinates of the two such points: point P on the left and point M on the right;
S23, continuing to scan the image and recording the coordinates of the points meeting the following conditions: (1) a point on the left side of point P whose gray value is 0 and whose gray value above or below it is also 0; (2) a point on the right side of point M whose gray value is 0 and whose gray value above or below it is also 0;
and S24, comparing the areas of the rectangles containing the valid echo signal region formed by all such points, and taking the most appropriate one as the valid signal region.
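As an illustration only (not part of the claim), the scan of S21–S24 can be sketched; here the bounding rectangle of the 255-valued region stands in for the rectangle comparison of S24, which is my simplification:

```python
import numpy as np

def valid_signal_region(binary):
    # After region growing the image holds background (0) and target (255).
    # Find the first row (top to bottom) containing 255, record its leftmost
    # (P) and rightmost (M) points, then take the bounding rectangle of the
    # 255 region as a simplified stand-in for steps S23-S24.
    ys, xs = np.nonzero(binary == 255)
    if ys.size == 0:
        return None
    top_row = ys.min()
    row_xs = xs[ys == top_row]
    p, m = (row_xs.min(), top_row), (row_xs.max(), top_row)   # points P and M
    return p, m, (xs.min(), ys.min(), xs.max(), ys.max())     # bounding rectangle
```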
4. The method for processing space-occupying lesion ultrasound images according to claim 1, wherein the step S3 further comprises:
S31, setting a concentric-circle template, the initial radius r of the small circle being 10 and the initial radius R of the large circle being 20; the template is elastic, i.e. its radii can change with the size of the lesion;
S32, scanning the determined valid signal region with the template while calculating the ratios of the gray means and of the variances between the small circle and the outer ring of the template;
S33, after the scanning is finished, leaving the centre of the template at the first pixel point with the minimum or maximum gray-mean ratio and variance ratio;
S34, searching for lesion-region boundary points in 8 directions from the first pixel point as centre point; when the gray values of all pixels in a direction are greater than the threshold T = 52, storing the last pixel obtained in that direction, thus obtaining 8 points in the 8 directions;
S35, calculating the distances of the 8 points from the centre point and their average distance; if the distance of a point from the centre exceeds 2 times the average distance, removing that point;
and S36, collecting the maximum and minimum horizontal and vertical coordinates of the remaining points and constructing a rectangle from these four values; this rectangle is the approximate region of the lesion, which realizes the localization of the lesion region.
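A sketch of the concentric-template statistics used in S32–S33 (illustrative only; the sliding scan and the elastic radii are omitted):

```python
import numpy as np

def template_ratios(image, cx, cy, r=10, R=20):
    # Gray-mean ratio and variance ratio between the inner circle (radius r)
    # and the outer ring (r < d <= R) of the concentric template centred at
    # (cx, cy); a lesion darker than its surroundings gives a small mean
    # ratio, which is what the scan in S32-S33 looks for.
    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    d = np.hypot(xx - cx, yy - cy)
    inner = image[d <= r].astype(float)
    ring = image[(d > r) & (d <= R)].astype(float)
    eps = 1e-9                                   # guard against division by zero
    return inner.mean() / (ring.mean() + eps), inner.var() / (ring.var() + eps)
```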
5. The method for processing space-occupying lesion ultrasound images according to claim 1, wherein the step S4 further comprises:
S41, in a 5 × 5 first neighborhood centred on a first pixel point, finding the second pixel point whose gray value is closest to the mean gray value of the first neighborhood and taking it as the seed point;
S42, growing the region from the second pixel point over its 8-neighborhood, merging the points satisfying the first growth condition into the first neighborhood, recalculating the gray mean, and taking the point closest to the new gray mean as the new growth point;
and S43, marking the grown region; when the region-growing algorithm finishes, connecting the outermost marked points to form the rough contour of the lesion region.
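The region growing of S41–S43 can be sketched as follows (the tolerance-to-running-mean test is a stand-in for the claim's "first growth condition", which the claim does not spell out):

```python
import numpy as np

def region_grow(image, seed, tol=10.0):
    # 8-neighbourhood region growing from the seed: a pixel joins the region
    # when it differs from the running mean by at most tol, and the mean is
    # recomputed as the region grows; the grown mask's outer boundary gives
    # the rough contour.
    h, w = image.shape
    grown = np.zeros((h, w), dtype=bool)
    grown[seed] = True
    total, count = float(image[seed]), 1
    frontier = [seed]
    while frontier:
        y, x = frontier.pop()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not grown[ny, nx]:
                    if abs(image[ny, nx] - total / count) <= tol:
                        grown[ny, nx] = True
                        total += float(image[ny, nx])
                        count += 1
                        frontier.append((ny, nx))
    return grown
```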
6. The method for processing space-occupying lesion ultrasound images according to claim 1, wherein the step S5 further comprises:
s51, discrete sampling is carried out on the rough contour of the focus area to obtain a first discrete point;
s52, according to the T-Snake model, calculating the internal energy and the external energy of the first discrete point to obtain a second discrete point;
s53, performing mesh division on the image of the focus area;
S54, computing the intersection points of the second discrete points with the grid to obtain new snake points, and executing again from S52 until the computed total energy no longer changes;
s55, connecting the finally determined snake points to form a new contour, namely the accurate contour of the focus area;
in step S53, an ACID image decomposition technique is used to perform mesh division on the image of the lesion region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110022840A CN102068281B (en) | 2011-01-20 | 2011-01-20 | Processing method for space-occupying lesion ultrasonic images |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102068281A true CN102068281A (en) | 2011-05-25 |
CN102068281B CN102068281B (en) | 2012-10-03 |
Family
ID=44027168
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201110022840A Expired - Fee Related CN102068281B (en) | 2011-01-20 | 2011-01-20 | Processing method for space-occupying lesion ultrasonic images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102068281B (en) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102855483A (en) * | 2011-06-30 | 2013-01-02 | 北京三星通信技术研究有限公司 | Method and device for processing ultrasonic images and breast cancer diagnosis equipment |
CN102871686A (en) * | 2012-03-05 | 2013-01-16 | 杭州弘恩医疗科技有限公司 | Device and method for determining physiological parameters based on 3D (three-dimensional) medical images |
CN103456002A (en) * | 2012-05-29 | 2013-12-18 | 通用电气公司 | Methods and system for displaying segmented images |
CN103577824A (en) * | 2012-07-24 | 2014-02-12 | 浙江大华技术股份有限公司 | Method and device for extracting target image |
CN103970262A (en) * | 2013-02-06 | 2014-08-06 | 原相科技股份有限公司 | Optical pointing system |
CN104021539A (en) * | 2013-02-28 | 2014-09-03 | 北京三星通信技术研究有限公司 | System used for automatically detecting tumour in ultrasonic image |
CN104463892A (en) * | 2014-12-24 | 2015-03-25 | 福州大学 | Bacterial colony image segmentation method based on level set and GVF Snake accurate positioning |
CN106419821A (en) * | 2016-08-31 | 2017-02-22 | 北京大学第三医院 | Method and device for arthroscopically measuring intra-articular structure |
CN106651892A (en) * | 2016-12-21 | 2017-05-10 | 福建师范大学 | Edge detection method |
US9766717B2 (en) | 2013-01-29 | 2017-09-19 | Pixart Imaging Inc. | Optical pointing system |
CN107789056A (en) * | 2017-10-19 | 2018-03-13 | 青岛大学附属医院 | A kind of medical image matches fusion method |
CN108027959A (en) * | 2015-09-25 | 2018-05-11 | 皇家飞利浦有限公司 | Spatial flicker in fluorescence imaging under low frame per second removes |
CN110009645A (en) * | 2019-04-11 | 2019-07-12 | 东北大学 | A kind of double-deck profile dividing method of liver lesion image |
CN110251083A (en) * | 2019-06-20 | 2019-09-20 | 深圳大学 | A kind of processing method, system and the storage medium of the location data of epileptic focus |
CN112053400A (en) * | 2020-09-09 | 2020-12-08 | 北京柏惠维康科技有限公司 | Data processing method and robot navigation system |
CN113034426A (en) * | 2019-12-25 | 2021-06-25 | 飞依诺科技(苏州)有限公司 | Ultrasonic image focus description method, device, computer equipment and storage medium |
CN113040873A (en) * | 2019-12-27 | 2021-06-29 | 深圳市理邦精密仪器股份有限公司 | Image processing method of ultrasound image, ultrasound apparatus, and storage medium |
CN113744846A (en) * | 2020-05-27 | 2021-12-03 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic image processing method, ultrasonic imaging system and computer storage medium |
CN117408988A (en) * | 2023-11-08 | 2024-01-16 | 北京维思陆科技有限公司 | Artificial intelligence-based focus image analysis method and apparatus |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1202096A (en) * | 1995-10-02 | 1998-12-16 | 奈科姆成像有限公司 | Improvements in or relating to ultrasound imaging |
CN1205868A (en) * | 1997-06-19 | 1999-01-27 | 梅迪诺尔有限公司 | Intravascular ultrasound enhanced image and signal processing |
US20060184031A1 (en) * | 2005-01-26 | 2006-08-17 | Kabushiki Kaisha Toshiba | Ultrasonic diagnostic apparatus and ultrasonic image acquiring method |
- 2011-01-20 CN CN201110022840A patent/CN102068281B/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1202096A (en) * | 1995-10-02 | 1998-12-16 | 奈科姆成像有限公司 | Improvements in or relating to ultrasound imaging |
CN1205868A (en) * | 1997-06-19 | 1999-01-27 | 梅迪诺尔有限公司 | Intravascular ultrasound enhanced image and signal processing |
US20060184031A1 (en) * | 2005-01-26 | 2006-08-17 | Kabushiki Kaisha Toshiba | Ultrasonic diagnostic apparatus and ultrasonic image acquiring method |
Non-Patent Citations (2)
Title |
---|
Journal of Zhongyuan Institute of Technology, 2002-12-31, Miao Fengjun, "B-mode ultrasound image data acquisition and computer image processing technology", Issue 04 *
Journal of Biomedical Engineering, 2007-04-30, Chen Ke et al., "New methods for filtering and contrast enhancement of medical ultrasound images", Issue 02 *
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102855483A (en) * | 2011-06-30 | 2013-01-02 | 北京三星通信技术研究有限公司 | Method and device for processing ultrasonic images and breast cancer diagnosis equipment |
CN102855483B (en) * | 2011-06-30 | 2017-09-12 | 北京三星通信技术研究有限公司 | Handle the method and apparatus and breast cancer diagnosis apparatus of ultrasonoscopy |
CN102920477B (en) * | 2012-03-05 | 2015-05-20 | 杭州弘恩医疗科技有限公司 | Device and method for determining target region boundary of medical image |
CN102871686A (en) * | 2012-03-05 | 2013-01-16 | 杭州弘恩医疗科技有限公司 | Device and method for determining physiological parameters based on 3D (three-dimensional) medical images |
CN102920477A (en) * | 2012-03-05 | 2013-02-13 | 杭州弘恩医疗科技有限公司 | Device and method for determining target region boundary of medical image |
WO2013131421A1 (en) * | 2012-03-05 | 2013-09-12 | 杭州弘恩医疗科技有限公司 | Device and method for determining physiological parameters based on 3d medical images |
CN102871686B (en) * | 2012-03-05 | 2015-08-19 | 杭州弘恩医疗科技有限公司 | The apparatus and method of physiological parameter are measured based on 3D medical image |
CN103456002A (en) * | 2012-05-29 | 2013-12-18 | 通用电气公司 | Methods and system for displaying segmented images |
CN103577824A (en) * | 2012-07-24 | 2014-02-12 | 浙江大华技术股份有限公司 | Method and device for extracting target image |
US9958961B2 (en) | 2013-01-29 | 2018-05-01 | Pixart Imaging Inc. | Optical pointing system |
US9766717B2 (en) | 2013-01-29 | 2017-09-19 | Pixart Imaging Inc. | Optical pointing system |
US10228772B2 (en) | 2013-01-29 | 2019-03-12 | Pixart Imaging Inc. | Remote controller |
CN103970262B (en) * | 2013-02-06 | 2018-01-16 | 原相科技股份有限公司 | Optical pointing system |
CN107992198B (en) * | 2013-02-06 | 2021-01-05 | 原相科技股份有限公司 | Optical pointing system |
CN107992198A (en) * | 2013-02-06 | 2018-05-04 | 原相科技股份有限公司 | Optical pointing system |
CN103970262A (en) * | 2013-02-06 | 2014-08-06 | 原相科技股份有限公司 | Optical pointing system |
CN104021539A (en) * | 2013-02-28 | 2014-09-03 | 北京三星通信技术研究有限公司 | System used for automatically detecting tumour in ultrasonic image |
CN104021539B (en) * | 2013-02-28 | 2019-07-16 | 北京三星通信技术研究有限公司 | System for automatically detecting tumors in ultrasonic images |
CN104463892A (en) * | 2014-12-24 | 2015-03-25 | 福州大学 | Bacterial colony image segmentation method based on level set and GVF Snake accurate positioning |
CN104463892B (en) * | 2014-12-24 | 2017-06-06 | 福州大学 | Bacterial colony image segmentation method based on level set and GVF Snake accurate positioning |
CN108027959A (en) * | 2015-09-25 | 2018-05-11 | 皇家飞利浦有限公司 | Spatial flicker in fluorescence imaging under low frame per second removes |
CN106419821B (en) * | 2016-08-31 | 2018-06-08 | 北京大学第三医院 | Method and device for arthroscopically measuring intra-articular structure |
CN106419821A (en) * | 2016-08-31 | 2017-02-22 | 北京大学第三医院 | Method and device for arthroscopically measuring intra-articular structure |
CN106651892A (en) * | 2016-12-21 | 2017-05-10 | 福建师范大学 | Edge detection method |
CN106651892B (en) * | 2016-12-21 | 2019-09-17 | 福建师范大学 | Edge detection method |
CN107789056A (en) * | 2017-10-19 | 2018-03-13 | 青岛大学附属医院 | Medical image matching and fusion method |
CN107789056B (en) * | 2017-10-19 | 2021-04-13 | 青岛大学附属医院 | Medical image matching and fusing method |
CN110009645A (en) * | 2019-04-11 | 2019-07-12 | 东北大学 | Double-layer contour segmentation method for liver lesion images |
CN110251083A (en) * | 2019-06-20 | 2019-09-20 | 深圳大学 | Method, system and storage medium for processing epileptic focus localization data |
CN113034426A (en) * | 2019-12-25 | 2021-06-25 | 飞依诺科技(苏州)有限公司 | Ultrasonic image focus description method, device, computer equipment and storage medium |
CN113034426B (en) * | 2019-12-25 | 2024-03-08 | 飞依诺科技股份有限公司 | Ultrasonic image focus description method, device, computer equipment and storage medium |
CN113040873A (en) * | 2019-12-27 | 2021-06-29 | 深圳市理邦精密仪器股份有限公司 | Image processing method of ultrasound image, ultrasound apparatus, and storage medium |
CN113744846A (en) * | 2020-05-27 | 2021-12-03 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic image processing method, ultrasonic imaging system and computer storage medium |
CN112053400A (en) * | 2020-09-09 | 2020-12-08 | 北京柏惠维康科技有限公司 | Data processing method and robot navigation system |
CN117408988B (en) * | 2023-11-08 | 2024-05-14 | 北京维思陆科技有限公司 | Artificial intelligence-based focus image analysis method and apparatus |
CN117408988A (en) * | 2023-11-08 | 2024-01-16 | 北京维思陆科技有限公司 | Artificial intelligence-based focus image analysis method and apparatus |
Also Published As
Publication number | Publication date |
---|---|
CN102068281B (en) | 2012-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102068281B (en) | Processing method for space-occupying lesion ultrasonic images | |
CN110047082B (en) | Deep learning-based pancreatic neuroendocrine tumor automatic segmentation method and system | |
CN107480677B (en) | Method and device for identifying interest region in three-dimensional CT image | |
CN100561518C (en) | Adaptive medical image sequence interpolation method based on region of interest | |
US7787673B2 (en) | Method and apparatus for airway detection and segmentation using 3D morphological operators | |
EP2401719B1 (en) | Methods for segmenting images and detecting specific structures | |
CN101576997A (en) | Abdominal organ segmentation method based on secondary three-dimensional region growth | |
CN107527339B (en) | Magnetic resonance scanning method, device and system | |
CN106530236B (en) | Medical image processing method and system | |
CN107680110B (en) | Inner ear three-dimensional level set segmentation method based on statistical shape model | |
CN103886576A (en) | Glandular tissue characteristic gray scale detection method and device | |
CN111815663A (en) | Hepatic vessel segmentation system based on Hessian matrix and gray scale method | |
CN111127404B (en) | Medical image contour rapid extraction method | |
CN113989407B (en) | Training method and system for limb part recognition model in CT image | |
JP3678378B2 (en) | Abnormal shadow candidate detection method and apparatus | |
CN110874860A (en) | Target extraction method of symmetric supervision model based on mixed loss function | |
CN108961278B (en) | Method and system for abdominal wall muscle segmentation based on image data | |
CN108280833B (en) | Skeleton extraction method for plant root system bifurcation characteristics | |
CN115564782A (en) | 3D blood vessel and trachea segmentation method and system | |
CN106780718A (en) | Three-dimensional reconstruction method for paleontological fossils | |
Purnama et al. | Follicle detection on the usg images to support determination of polycystic ovary syndrome | |
CN116309806A (en) | CSAI-Grid RCNN-based thyroid ultrasound image region of interest positioning method | |
KR101251822B1 (en) | System and method for analysising perfusion in dynamic contrast-enhanced lung computed tomography images | |
CN117576123A (en) | Cardiovascular CT image data segmentation detection method | |
CN104915989A (en) | CT image-based blood vessel three-dimensional segmentation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20121003 Termination date: 20190120 |