CN113487537A - Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo - Google Patents

Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo

Info

Publication number
CN113487537A
Authority
CN
China
Prior art keywords
gray scale, gray, curve, halo, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110609095.7A
Other languages
Chinese (zh)
Inventor
牛司华
黄剑华
朱家安
颜子夜
Current Assignee
Peking University
Harbin Institute of Technology
Peking University Peoples Hospital
Original Assignee
Harbin Institute of Technology
Peking University Peoples Hospital
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology and Peking University Peoples Hospital
Priority to CN202110609095.7A
Publication of CN113487537A
Legal status: Pending

Classifications

    • G06T 7/0012: Image analysis; inspection of images; biomedical image inspection
    • G06F 18/22: Pattern recognition; matching criteria, e.g. proximity measures
    • G06F 18/23: Pattern recognition; clustering techniques
    • G06F 18/24: Pattern recognition; classification techniques
    • G06N 3/045: Neural networks; architecture; combinations of networks
    • G06N 3/08: Neural networks; learning methods
    • G16H 30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G06T 2207/10132: Image acquisition modality; ultrasound image
    • G06T 2207/20024: Special algorithmic details; filtering details
    • G06T 2207/30068: Subject of image; mammography; breast
    • G06T 2207/30096: Subject of image; tumor; lesion

Abstract

The present disclosure relates to an information processing method, apparatus, and storage medium for the ultrasound hyperechoic halo of breast cancer. The method comprises: determining a lesion region based on a breast ultrasound image; obtaining a gray-scale variation curve extending from the lesion region toward the periphery of the image; and determining, based on that curve, whether a hyperechoic halo exists around the lesion region. Embodiments of the disclosure thereby enable detection of a hyperechoic halo around a lesion region.

Description

Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an information processing method, an apparatus, and a storage medium for ultrasound hyperechoic halo of breast cancer.
Background
Breast cancer is a common malignant tumor, and an early, definite diagnosis is crucial to subsequent treatment and prognosis; however, over-treatment caused by misdiagnosis remains frequent. Diagnosing and identifying breast cancer more accurately is a persistent difficulty for imaging physicians.
Artificial intelligence has become an important means of clinically assisted diagnosis. However, the stability of artificial-intelligence networks remains a problem, and missed diagnoses and misdiagnoses still occur when breast cancer diagnosis relies on artificial intelligence alone. A scheme that can further analyze and confirm a detected breast cancer lesion in an ultrasound image is therefore urgently needed to reduce misdiagnoses.
Disclosure of Invention
The disclosure provides an information processing method, apparatus, and storage medium for the ultrasound hyperechoic halo of breast cancer. The method and apparatus can determine whether a detected lesion region satisfies the hyperechoic-halo condition, thereby helping to reduce the probability of misdiagnosis.
According to an aspect of the present disclosure, there is provided an information processing method for breast cancer ultrasonic hyperechoic halo, including:
determining a lesion region based on a breast ultrasound image;
obtaining a gray-scale variation curve extending from the lesion region toward the periphery of the breast ultrasound image;
determining, based on the gray-scale variation curve, whether a hyperechoic halo exists around the lesion region.
In some possible embodiments, obtaining the gray-scale variation curve extending from the lesion region toward the periphery of the breast ultrasound image includes:
determining the center of the lesion region;
performing gray-scale sampling from that center toward the periphery of the breast ultrasound image to obtain at least one group of gray-scale sequences;
determining the gray-scale variation curve based on the at least one group of gray-scale sequences.
In some possible embodiments, performing gray-scale sampling from the center toward the periphery of the breast ultrasound image to obtain at least one group of gray-scale sequences includes:
expanding concentric frames from the center toward the periphery of the breast ultrasound image;
determining at least one ray that starts at the center and points toward the periphery; and
taking the intersections of each ray with the concentric frames as sampling points and performing gray-scale sampling there, obtaining the gray-scale sequence corresponding to that ray.
In some possible embodiments, determining the gray-scale variation curve based on the at least one group of gray-scale sequences includes:
filtering each gray-scale sequence; and
performing curve fitting on the filtered gray-scale sequence to obtain the gray-scale variation curve.
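As a concrete illustration of the filtering and curve-fitting steps above, the following sketch smooths a raw gray-scale sequence with a moving-average filter and fits a low-order polynomial to it. The patent does not specify a filter or fitting method; the window size, polynomial degree, and the function name `gray_curve` are illustrative assumptions.

```python
import numpy as np

def gray_curve(gray_seq, window=5, poly_degree=6):
    """Smooth a raw gray sequence with a moving-average filter, then fit
    a low-order polynomial to obtain the gray-scale variation curve.
    Filter type, window size and degree are illustrative choices."""
    seq = np.asarray(gray_seq, dtype=float)
    kernel = np.ones(window) / window            # moving-average filter
    smoothed = np.convolve(seq, kernel, mode="same")
    x = np.arange(len(smoothed))
    coeffs = np.polyfit(x, smoothed, poly_degree)
    return np.polyval(coeffs, x)                 # fitted curve, same length
```

Any smoother (median filter, spline fit) could be substituted; the point is only that speckle noise is suppressed before the trend of the curve is analyzed.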
In some possible embodiments, determining whether a hyperechoic halo exists around the lesion region based on the gray-scale variation curve includes:
determining that a hyperechoic halo exists when the gray-scale variation curve exhibits a preset trend;
the preset trend comprises rising from a first gray-scale interval to a second gray-scale interval and then falling from the second gray-scale interval to a third gray-scale interval, where the gray values of the second interval are larger than those of the third interval, and the gray values of the third interval are larger than those of the first interval.
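The preset rise-then-fall trend can be checked directly on a sampled curve. The sketch below locates the curve's peak and verifies that the peak level exceeds the tail level, which in turn exceeds the starting level; the margin thresholds and the helper name `has_halo_trend` are illustrative assumptions, not values from the patent.

```python
import numpy as np

def has_halo_trend(curve, rise_margin=10, fall_margin=5):
    """Check the preset trend: the curve climbs from a low gray interval
    (inside the hypoechoic lesion) to a high interval (the bright halo),
    then drops to an intermediate interval (surrounding tissue).
    The margins are illustrative thresholds."""
    curve = np.asarray(curve, dtype=float)
    peak = int(np.argmax(curve))
    if peak == 0 or peak == len(curve) - 1:
        return False   # no interior peak, hence no rise-then-fall shape
    start, top, end = curve[0], curve[peak], curve[-1]
    # second interval > third interval > first interval
    return (top - end) >= fall_margin and (end - start) >= rise_margin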
In some possible embodiments, when a single gray-scale variation curve is obtained, determining whether a hyperechoic halo exists around the lesion region based on that curve further includes:
determining the similarity between the gray-scale variation curve and a standard hyperechoic-halo curve; and
determining that a hyperechoic halo exists around the lesion region when the similarity indicates that the gray-scale variation curve is similar to the standard hyperechoic-halo curve.
In some possible embodiments, when several gray-scale variation curves are obtained, determining whether a hyperechoic halo exists around the lesion region based on those curves further includes at least one of the following:
computing a weighted sum of the similarities between the gray-scale variation curves and a standard hyperechoic-halo curve, and determining that a hyperechoic halo exists around the lesion region when the weighted sum indicates that the curves are similar to the standard curve; or
performing classification clustering on the gray-scale variation curves and the lesion region, and determining that a hyperechoic halo exists around the lesion region when the distance between the two cluster centers obtained indicates that the curves are similar to the standard hyperechoic-halo curve.
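The weighted-sum variant above can be sketched as follows: score each per-ray curve against a standard hyperechoic-halo curve and threshold the weighted combination. Pearson correlation as the similarity measure, the 0.8 threshold, and the helper name are assumptions made for illustration.

```python
import numpy as np

def halo_by_weighted_similarity(curves, standard_curve, weights=None, threshold=0.8):
    """Score each per-ray gray curve by its Pearson correlation with a
    standard hyperechoic-halo curve and combine the scores with weights.
    Correlation and the threshold value are illustrative choices."""
    curves = [np.asarray(c, dtype=float) for c in curves]
    std = np.asarray(standard_curve, dtype=float)
    sims = np.array([np.corrcoef(c, std)[0, 1] for c in curves])
    if weights is None:
        weights = np.full(len(curves), 1.0 / len(curves))   # equal weights
    return float(np.dot(weights, sims)) >= threshold
```

Weights could, for instance, favor rays that cross less shadowed tissue; the patent leaves the weighting scheme open.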
In some possible embodiments, the method further comprises:
when the gray-scale variation curve establishes that a hyperechoic halo exists around the lesion region, correcting the lesion region using the gray-scale variation curve.
According to a second aspect of the present disclosure, there is provided an information processing apparatus for ultrasound hyperechoic halo of breast cancer, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to execute the information processing method for breast cancer ultrasound hyperechoic halo in the embodiment of the present disclosure.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions, wherein the computer program instructions, when executed by a processor, implement the information processing method for breast cancer ultrasound hyperechoic halo described in the embodiments of the present disclosure.
In embodiments of the present disclosure, a lesion region in a breast ultrasound image is obtained and, taking that region as a reference, a gray-scale variation curve toward the periphery of the image is derived; whether a hyperechoic halo exists in the ultrasound image is then determined from that curve. The presence of a hyperechoic halo serves as an accuracy check on the lesion detection, so the scheme of the disclosed embodiments can reduce misdiagnoses.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 shows a flow chart of an information processing method for breast cancer ultrasound hyperechoic halo according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of computing a gray level co-occurrence matrix over an original image with a sliding window, according to an embodiment of the present disclosure, where the left matrix shows the pixels covered by the window before sliding and the right matrix shows them after the window slides one step to the right;
FIG. 3 illustrates a flow chart for determining a gray scale change curve according to an embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of a central concentric box implemented in accordance with the present disclosure;
fig. 5 shows a block diagram of an information processing apparatus for breast cancer ultrasound hyperechoic halo, according to an embodiment of the present disclosure;
FIG. 6 illustrates a block diagram of an electronic device 800 in accordance with an embodiment of the disclosure;
fig. 7 illustrates a block diagram of another electronic device 1900 in accordance with an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the set consisting of A, B, and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
The execution subject of the information processing method for the breast cancer ultrasound hyperechoic halo may be an image processing apparatus; for example, the method may be executed by a terminal device, a server, or another processing device, where the terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. In some possible implementations, the method may be implemented by a processor calling computer-readable instructions stored in a memory.
It is understood that the above method embodiments of the present disclosure can be combined with one another to form combined embodiments without departing from their underlying principles; owing to space limitations, the details are not repeated in this disclosure.
In addition, the present disclosure also provides an information processing apparatus, an electronic device, a computer-readable storage medium, and a program for ultrasound hyperechoic halo of breast cancer, which can be used to implement any of the information processing methods for ultrasound hyperechoic halo of breast cancer provided by the present disclosure, and the corresponding technical solutions and descriptions and corresponding descriptions in the methods section are omitted for brevity.
Fig. 1 shows a flowchart of an information processing method for breast cancer ultrasound hyperechoic halo, as shown in fig. 1, the method comprising:
S10: determining a lesion area based on the breast ultrasound image;
in some possible embodiments, the lesion area in the breast ultrasound image may be determined by manual labeling by a physician or by algorithm assistance, and the lesion area may be the area where the breast cancer lesion is located.
The focal region of the embodiments of the present disclosure may be a circle or a square, which is not specifically limited by the present disclosure.
S20: obtaining a gray scale change curve from the focus area to the peripheral direction of the breast ultrasonic image;
in some possible embodiments, in the case of obtaining the lesion area, the gray level variation of the lesion area and its periphery may be analyzed to obtain a gray level variation curve.
S30: determining whether hyperechoic halo exists around the focal region based on the gray scale variation curve.
In some possible embodiments, once a gray-scale variation curve is obtained, it may further be determined whether the curve indicates the presence of a hyperechoic region (a hyperechoic halo) near the lesion region in the ultrasound image. If a hyperechoic halo exists, the lesion detection is taken as correct and the lesion region corresponds to breast cancer; if no hyperechoic halo exists, the detection is taken as inaccurate and the lesion region is not breast cancer.
The embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
The breast ultrasound image may be first acquired, wherein the ultrasound instrument may be connected to receive the ultrasound image, or the ultrasound image may be read from a database, server, or other electronic device, as the present disclosure is not particularly limited.
Once an ultrasound image is obtained, lesion detection may be performed on it to locate a breast cancer lesion region. The lesion region may be determined by receiving box-selection input from a user (a physician), or it may be detected by artificial intelligence.
Specifically, feature extraction of the ultrasound image may be performed first to obtain image features, and classification processing is performed by using a classifier to obtain a lesion area and a normal area in the ultrasound image.
To extract image features, the method introduces a semantically interpretable processing method for breast cancer ultrasound images. By interpreting both the image features and the deep-learning features, it meets the clinical requirement that the output of an assisted-identification system be explainable, which strengthens physicians' understanding.

The basic idea is as follows: the image's gray level co-occurrence matrix, texture features, and morphological features are extracted and combined with a feature map generated by an interpretable deep-learning method; a low-rank regression model then maps the uninterpretable abstract features of deep learning into an interpretable semantic feature space, where interpretable ultrasound-assisted identification of breast cancer is carried out according to the importance of each feature. The specific steps are as follows:
the first step, extracting interpretable features from the ultrasound image of the breast, includes: extracting a gray level co-occurrence matrix and textural features of the ultrasonic image, and extracting a plurality of morphological features, wherein the morphological features are the same as or similar to the features described by the BI-RADS, and the morphological features represent the shape, the orientation, the boundary and the internal textural features of the tumor;
with reference to fig. 2, the first step is as follows:
1.1. Extract a gray level co-occurrence matrix from the ultrasound image. Texture is described by analyzing how the gray distribution repeats and alternates across spatial positions: given a direction θ and a step length d, count, for the image represented in matrix form, how often a pair of pixels separated by d along direction θ takes the gray values i and j; this count is stored at position (i, j) of the matrix. The matrix thereby captures the direction, interval, and the amplitude and speed of gray variation in the ultrasound image, with θ being 0°, 45°, 90°, or 135°.
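A minimal gray level co-occurrence matrix along the lines of step 1.1 can be computed as below; in practice an optimized routine such as `skimage.feature.graycomatrix` would typically be used instead. The function name and the plain nested-loop implementation are illustrative.

```python
import numpy as np

def glcm(image, d=1, theta=0, levels=256):
    """Minimal gray level co-occurrence matrix: count how often a pixel
    with gray value i and a pixel with gray value j occur d apart along
    direction theta (degrees). Supports the four directions used above:
    0, 45, 90 and 135 degrees."""
    offsets = {0: (0, d), 45: (-d, d), 90: (-d, 0), 135: (-d, -d)}
    dr, dc = offsets[theta]
    img = np.asarray(image)
    rows, cols = img.shape
    mat = np.zeros((levels, levels), dtype=np.int64)
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                mat[img[r, c], img[r2, c2]] += 1
    return mat
```

Images are usually quantized to far fewer than 256 levels before building the matrix, which keeps it dense enough to be statistically meaningful.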
1.2. Extract a gray-gradient co-occurrence matrix of pixel gray values and gradients from the ultrasound image. This matrix is the joint distribution of image gray values and image gradients, where the gradient is obtained with various differential operators and detects abrupt gray transitions in the ultrasound image. Fusing the two helps reveal both the texture arrangement and the pixel variation of the image.
1.3. Extract texture features from the breast ultrasound image to describe its information quantitatively; five features are extracted: the gray standard deviation, energy, gray entropy, gray mean, and correlation.
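A sketch of the five texture features of step 1.3, computed from an 8-bit image patch. The energy and entropy here are taken from the normalized gray histogram, and the correlation is a lag-1 horizontal autocorrelation; these are choices assumed for illustration, since the patent does not fix the exact definitions.

```python
import numpy as np

def texture_features(image):
    """Compute the five first-order texture features named above from an
    8-bit ultrasound image patch: gray standard deviation, energy, gray
    entropy, gray mean, and (auto)correlation."""
    img = np.asarray(image, dtype=float)
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()                       # gray-level probabilities
    nz = p[p > 0]
    return {
        "std": float(img.std()),
        "energy": float(np.sum(p ** 2)),
        "entropy": float(-np.sum(nz * np.log2(nz))),
        "mean": float(img.mean()),
        "correlation": float(np.corrcoef(img[:, :-1].ravel(),
                                         img[:, 1:].ravel())[0, 1]),
    }
```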
1.4. Extract morphological features from the breast ultrasound image to describe the shape, orientation, boundary, and internal texture of the tumor, including its roundness, aspect ratio, growth angle relative to the skin, lobulation, edge roughness, edge spiculation, number of internal calcifications, and boundary ambiguity.
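Two of the morphological features of step 1.4, roundness and aspect ratio, can be estimated from a binary lesion mask as below. The bounding-box aspect ratio and the boundary-pixel perimeter estimate are crude illustrative simplifications, not the patent's definitions.

```python
import numpy as np

def shape_features(mask):
    """Estimate aspect ratio (bounding-box height/width) and roundness
    4*pi*A/P^2 from a binary lesion mask, with the perimeter P crudely
    estimated as the number of foreground pixels that touch background."""
    mask = np.asarray(mask, dtype=bool)
    rows, cols = np.nonzero(mask)
    height = rows.max() - rows.min() + 1
    width = cols.max() - cols.min() + 1
    area = int(mask.sum())
    # boundary pixels: foreground with at least one 4-neighbour outside
    padded = np.pad(mask, 1)
    up, down = padded[:-2, 1:-1], padded[2:, 1:-1]
    left, right = padded[1:-1, :-2], padded[1:-1, 2:]
    boundary = mask & ~(up & down & left & right)
    perimeter = int(boundary.sum())
    roundness = 4.0 * np.pi * area / perimeter ** 2
    return {"aspect_ratio": float(height / width), "roundness": float(roundness)}
```

A circular lesion scores a roundness close to 1, while an elongated one scores much lower, which matches the clinical intuition behind the feature.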
Second, extract abstract features of the ultrasound image: use a deep neural network with a regression algorithm to learn the features each convolution kernel is to extract, forming feature maps; represent each feature map by its global mean, and take the vector of feature-map means from the last convolutional layer as the abstract feature vector.
2.1. Abstract features are extracted from the ultrasound image with a deep neural network; the extraction may use, but is not limited to, the CAM and/or Grad-CAM methods.
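The global-mean step of the second stage reduces each feature map of the last convolutional layer to a single scalar, and stacking the scalars yields the abstract feature vector. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def abstract_feature_vector(feature_maps):
    """Reduce each feature map produced by the last convolutional layer
    to its spatial mean; the stacked means form the abstract feature
    vector. feature_maps has shape (channels, height, width)."""
    fm = np.asarray(feature_maps, dtype=float)
    return fm.mean(axis=(1, 2))    # one scalar per convolution kernel
```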
Third, according to the interpretable features and the abstract features, compute the weight coefficient matrix W of the different features in the mapping by a semantic low-rank regression algorithm, and select the features whose weight exceeds a preset value to explain the information the deep-learning discrimination model attends to; this information is used to improve the credibility of the feature-test model when assisting the discrimination of breast cancer ultrasound images.
3.1. The abstract features are processed by a semantic low-rank regression algorithm to find the features that play an important role in the assisted discrimination of breast cancer ultrasound images, so as to improve their credibility. The low-rank regression algorithm defines the abstract features

X = [x_1, x_2, ..., x_N]^T ∈ R^(N×d),

the input data of the algorithm, where x_t (1 ≤ t ≤ N) is the deep-learning abstract feature of the t-th sample and d is the abstract-feature dimension, and

Y = [y_1, y_2, ..., y_N]^T ∈ R^(N×c),

the interpretable features output by the algorithm, where c is the dimension of the interpretable features and y_t is the interpretable feature of the t-th sample. The mathematical description of the regression model is

min_W ||Y - XW||_2^2 + θ · Rank(W),

where the first term is the l_2-norm mapping loss: it measures the degree of difference between the interpretable features and the abstract features after the latter pass through the mapping matrix; the larger the difference, the larger the loss and the worse the mapping matrix. θ is a regularization coefficient on the low-rank penalty term Rank(W). Since the deep-learning abstract features of different samples generally have correlation, the rank of the matrix W is low according to this prior knowledge, and according to the approximation theory of rank the penalty is approximated by the trace norm of the matrix,

Tr(W) = Σ_i σ_i,

where σ_i is the i-th singular value of W; equivalently, Tr(W) is the trace of (W^T W)^(1/2), with W^T the transpose of W.
The weight coefficient matrix W contains the weights of the mapping; the influence of the interpretable features on the discriminator over the sample set X is a total weight W', obtained by aggregating W over the N sample mapping operations.
From the computed W', the image features with the highest weights are selected to explain the information most attended to in the deep-learning discrimination model, and this information is used to improve the credibility of the feature-test model.
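Under the trace-norm (nuclear-norm) relaxation of the rank penalty described above, the low-rank regression can be solved by proximal gradient descent, whose proximal operator soft-thresholds the singular values. The step size, iteration count, and function name below are illustrative assumptions, not details from the patent.

```python
import numpy as np

def low_rank_regression(X, Y, theta=1.0, n_iter=200):
    """Minimize ||Y - XW||_2^2 + theta * ||W||_*  (nuclear norm as the
    convex surrogate for Rank(W)) by proximal gradient descent with
    singular-value soft-thresholding."""
    W = np.zeros((X.shape[1], Y.shape[1]))
    step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)   # 1/L for the smooth term
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ W - Y)               # gradient of the l2 loss
        U, s, Vt = np.linalg.svd(W - step * grad, full_matrices=False)
        s = np.maximum(s - step * theta, 0.0)        # soft-threshold singular values
        W = (U * s) @ Vt
    return W
```

With correlated per-sample features, the recovered mapping matrix is numerically low rank, which is exactly the prior the penalty encodes.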
An embodiment of the present application further provides an electronic device, including a memory and a processor, where the memory stores an executable program, and the processor executes the executable program to implement the following steps:
the first step, extracting interpretable features from the ultrasound image of the breast, includes: extracting a gray level co-occurrence matrix and textural features of the ultrasonic image, and extracting a plurality of morphological features, wherein the morphological features are the same as or similar to the features described by the BI-RADS, and the morphological features represent the shape, the orientation, the boundary and the internal textural features of the tumor;
second, extract abstract features of the ultrasound image: use a deep neural network with a regression algorithm to learn the features each convolution kernel is to extract, forming feature maps; represent each feature map by its global mean, and take the vector of feature-map means from the last convolutional layer as the abstract feature vector;
and thirdly, calculating a weight coefficient matrix W of different features in the mapping process by a semantic low-rank regression algorithm according to the interpretable features and the abstract features, and selecting the object features with the weight higher than a preset value to explain the information concerned in the deep learning discrimination model, wherein the information is used for improving the credibility of the feature test model when the breast cancer ultrasonic image is subjected to auxiliary discrimination.
With the above configuration, the image features of the ultrasound image can be obtained, and the resulting feature information is input to a classifier (at least one convolutional layer) to obtain the position of the lesion region. The method uses the gray level co-occurrence matrix, texture features, and morphological features of the breast cancer ultrasound image, applies a regression algorithm to the features extracted by the deep-learning model to form feature maps, and represents each feature map by its global mean. Finally, a low-rank regression algorithm computes the importance of the different features in image discrimination, making the assisted discrimination of breast cancer ultrasound images interpretable, with semantics consistent with clinical features.

This addresses the lack of medical interpretability in the output of existing ultrasound breast assisted-identification techniques: it solves the problem that the output cannot be explained when a deep-learning method is used for assisted identification of breast cancer, and improves clinical credibility. While improving discrimination performance, the method selects features in a targeted manner and yields a semantically interpretable discrimination process, building a bridge between physicians' knowledge and the algorithm's knowledge that helps physicians understand and accept the classification result. The consistency of the deep-learning method's decision process and output with evidence-based medicine promotes the application of deep-learning algorithms in breast tumor diagnosis.
Under the condition of obtaining the focus area, the gray level condition around the focus area can be analyzed to obtain a gray level change curve.
Fig. 3 shows a flowchart for determining a gray scale variation curve obtained from the lesion area to the peripheral direction of the breast ultrasound image according to an embodiment of the present disclosure, including:
S21: determining a center of the focal region;
S22: taking the center as a starting point, and carrying out gray sampling in the peripheral direction of the breast ultrasonic image to obtain at least one group of gray sequences;
S23: determining the gray scale change curve based on the at least one group of gray scale sequences.
In some possible embodiments, the center of the circle of the lesion area is determined as the center in the case where the lesion area is a circular frame, and the intersection of the diagonal lines of the lesion area is determined as the center in the case where the lesion area is a square frame. When the focus area is in other shapes, the geometric center of the focus area is determined as the center.
Once the center is determined, it can be taken as the starting point of the gray scale variation, and gray scale sampling is performed toward the periphery of the breast ultrasound image along preset directions; at least one group of gray scale sampling values can be obtained in each direction to form a gray scale sequence.
Specifically, in this embodiment of the present disclosure, the performing gray sampling in the peripheral direction of the breast ultrasound image with the center as a starting point to obtain at least one group of gray sequences includes: expanding a concentric frame to the peripheral direction of the breast ultrasonic image according to the center; determining at least one ray which takes the center as a starting point and faces to the peripheral direction; and determining the intersection point of the ray and the concentric frame as a sampling point, and executing gray level sampling to obtain a gray level sequence corresponding to the ray.
Referring to fig. 4, which shows a schematic diagram of concentric frames according to an embodiment of the present disclosure, the ultrasound image represented by fig. 4 is taken as an example, with emphasis on the concentric frames of the embodiments herein. It can be assumed that the concentric frames (illustrated as concentric circles in the figure) centered on point A identify a lesion region and its periphery in the breast ultrasound image (e.g., a breast cancer lesion, a breast tumor region, etc.). With point A at the center of the lesion region as the starting point of the gray scale variation curve, the concentric frames are continuously expanded outward around the lesion region until the boundary of the breast region in the ultrasound image is reached. The concentric frames may be circular or square, as required. The spacing between adjacent concentric frames may be a predetermined distance value, which may be less than half the length of the lesion area.
Once the concentric frames are determined, at least one ray is taken from the center toward the outside of the ultrasound image. In the embodiment of the present disclosure, the ray may be horizontal, or may be any ray to the right of the vertical line through point A. One ray or multiple rays may be obtained: with one ray, a horizontal ray is preferred given how the image is captured; with multiple rays, the angle between adjacent rays may be determined as the quotient of 180 degrees and the number of rays, so that the rays are uniformly distributed in the region to the right of the lesion center.
Given at least one ray, the gray scale sequence corresponding to each ray can be determined from the gray values at the intersections between that ray and the concentric frames (e.g., the sequence formed by the gray values at points A, B, C, D, E, F, G, H, I, J, K, L in fig. 4). On this basis, the embodiments of the present disclosure obtain at least one group of gray scale sequences.
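The sampling step above can be sketched with circular concentric frames: the ray's intersections with circles of radii step, 2·step, … are read directly from the image. The plain 2D-list image, the ring spacing, and the function name are assumptions for illustration:

```python
import math

# Sketch (assumed details): sample gray values where one ray, at a given
# angle from the lesion center, crosses successive concentric circles.

def sample_ray(image, center, angle_deg, step, n_rings):
    """Gray values at the intersections of the ray with n_rings circles,
    stopping early at the image boundary."""
    cx, cy = center
    h, w = len(image), len(image[0])
    theta = math.radians(angle_deg)
    seq = []
    for k in range(1, n_rings + 1):
        x = int(round(cx + k * step * math.cos(theta)))
        y = int(round(cy + k * step * math.sin(theta)))
        if not (0 <= x < w and 0 <= y < h):  # reached the breast-region boundary
            break
        seq.append(image[y][x])
    return seq

img = [[c for c in range(8)] for _ in range(8)]  # toy image: gray == column index
print(sample_ray(img, (0, 0), 0, 2, 3))          # horizontal ray, rings at x = 2, 4, 6
```

Calling this once per ray angle (e.g., angles spaced 180° / number-of-rays apart) yields the groups of gray scale sequences described above.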
In some possible embodiments, in the case of obtaining the gray sequence, a curve fitting may be performed on the gray sequence to obtain a gray variation curve. In an embodiment of the disclosure, the determining the gray-scale variation curve based on the at least one group of gray-scale sequences includes: performing filtering processing on the gray sequence; and performing curve fitting on the filtered gray sequence to obtain the gray change curve.
According to the embodiment of the present disclosure, the gray scale sequence may be filtered by moving average filtering to reduce the influence of noise, and least squares fitting may then be performed on the filtered sequence to obtain the gray scale variation curve.
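A minimal sketch of this smoothing-and-fitting step follows. The window size and the linear (first-order) least-squares model are assumptions; the patent does not fix the model order:

```python
# Sketch (assumed details): moving-average filter to suppress speckle noise,
# then a closed-form least-squares line fit against the sample index.

def moving_average(seq, window=3):
    """Moving-average filter; the window shrinks at the ends of the sequence."""
    half = window // 2
    out = []
    for i in range(len(seq)):
        win = seq[max(0, i - half):i + half + 1]
        out.append(sum(win) / len(win))
    return out

def lstsq_line(seq):
    """Least-squares slope and intercept of seq against indices 0..n-1."""
    n = len(seq)
    mx, my = (n - 1) / 2.0, sum(seq) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (y - my) for x, y in zip(range(n), seq))
    slope = sxy / sxx
    return slope, my - slope * mx

smooth = moving_average([10, 12, 30, 14, 16], window=3)  # damps the spike at 30
print(lstsq_line(smooth))
```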
Since a hyperechoic halo may be present around a breast cancer lesion, embodiments of the present disclosure may determine whether such a halo exists based on the gray scale variation curve. In the embodiment of the present disclosure, it may be determined that a hyperechoic halo exists when the gray scale variation curve follows a preset trend; the preset trend comprises rising from a first gray scale interval to a second gray scale interval and falling from the second gray scale interval to a third gray scale interval, wherein the gray values of the second gray scale interval are larger than those of the third gray scale interval, and the gray values of the third gray scale interval are larger than those of the first gray scale interval. The second gray scale interval is thus a hyperechoic region relative to the first and third gray scale intervals.
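The rise-then-fall trend can be checked directly on the fitted curve samples. The strict-monotonicity test below is one possible reading of "rising" and "falling"; the exact interval margins are not specified by the patent:

```python
# Sketch (assumed details): the curve must rise to a single peak and then
# fall, ending above where it started (peak > tail > head), matching the
# first/second/third gray scale interval ordering described above.

def has_halo_trend(curve):
    peak = max(curve)
    i = curve.index(peak)
    head, tail = curve[0], curve[-1]
    rises = i > 0 and all(a <= b for a, b in zip(curve[:i + 1], curve[1:i + 1]))
    falls = i < len(curve) - 1 and all(a >= b for a, b in zip(curve[i:], curve[i + 1:]))
    return rises and falls and peak > tail > head

print(has_halo_trend([40, 60, 90, 120, 100, 80]))  # rise then fall, tail above head
print(has_halo_trend([40, 50, 60, 70, 80, 90]))    # monotone rise: no halo trend
```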
In addition, in the embodiment of the present disclosure, a standard hyperechoic curve may be obtained, and whether hyperechoic halo exists or not may be determined according to a similarity between the gray scale change curve and the standard hyperechoic curve, where the standard hyperechoic curve meets the preset trend. Specifically, the embodiment of the present disclosure may first obtain a determined standard hyperechoic curve, and then perform curve similarity determination. The method for determining the standard hyperechoic halo curve comprises the following steps: acquiring a plurality of groups of ultrasonic images determined to have breast cancer lesions; acquiring a gray scale change curve from the breast cancer focus area to the periphery direction of the ultrasonic image in the horizontal direction and the right direction aiming at each group of ultrasonic images; and performing mean value processing on the gray scale change curve to obtain the standard high echo halo curve.
According to the embodiment of the present disclosure, since the gray scale variation of the lesion region and its periphery is most comprehensively reflected in the horizontal rightward direction (the horizontal direction toward the areola), the gray scale variation curves of a large number (more than ten thousand) of breast cancer images may be collected in this direction and fitted to obtain the standard hyperechoic halo curve, ensuring the accuracy of the curve. Alternatively, in the embodiment of the present disclosure, gray scale variation curves in multiple directions may be collected and fitted to obtain the standard gray scale curve, which is not specifically limited in the present disclosure.
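The mean-value processing that produces the standard curve can be sketched as a pointwise average; the three short curves below are illustrative stand-ins for the large collection of confirmed-lesion curves:

```python
# Sketch (assumed details): the standard hyperechoic halo curve as the
# pointwise mean of many horizontal-direction gray scale variation curves.

def mean_curve(curves):
    n = min(len(c) for c in curves)  # truncate to the common sampled length
    return [sum(c[i] for c in curves) / len(curves) for i in range(n)]

standard = mean_curve([[40, 90, 60],
                       [50, 110, 70],
                       [45, 100, 65]])
print(standard)  # pointwise mean over the collected curves
```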
In an embodiment of the present disclosure, in obtaining a set of gray-scale variation curves, the determining whether a hyperechoic halo exists around the focal region based on the gray-scale variation curves includes: determining the similarity between the gray scale change curve and a standard high echo halo curve; and determining that hyperechoic halo exists around the lesion area under the condition that the similarity degree represents that the gray scale change curve is similar to a standard hyperechoic halo curve.
As described in the above embodiment, a single ray may be obtained, with one corresponding group of gray scale variation curve. In this case, the gray scale variation curve can be compared with a pre-obtained standard hyperechoic halo curve for breast cancer, and the similarity between them determined. The similarity between the gray scale variation curve and the standard hyperechoic halo curve may be determined from the distance between the two curves, where the distance may be a Euclidean distance or a discrete Fréchet distance, which is not specifically limited by the present disclosure.
Given the similarity between the gray scale variation curve and the standard hyperechoic halo curve, it can be determined whether a hyperechoic halo exists in the gray scale variation curve. If the similarity is greater than a similarity threshold, the two curves are determined to be similar, i.e., a hyperechoic halo exists in the gray scale variation curve; otherwise, no hyperechoic halo exists. The similarity threshold may be a value greater than or equal to 0.6 and less than 1.
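A sketch of the similarity test using the discrete Fréchet distance follows (Euclidean distance would work analogously). The 1 / (1 + d) mapping from distance to similarity is an assumption; the patent only states that a similarity above the 0.6 threshold counts as similar:

```python
# Sketch (assumed details): classic dynamic-programming discrete Frechet
# distance between two 1-D sampled curves, then a thresholded similarity.

def discrete_frechet(p, q):
    n, m = len(p), len(q)
    ca = [[-1.0] * m for _ in range(n)]
    def c(i, j):
        if ca[i][j] >= 0:
            return ca[i][j]
        d = abs(p[i] - q[j])
        if i == 0 and j == 0:
            ca[i][j] = d
        elif i == 0:
            ca[i][j] = max(c(0, j - 1), d)
        elif j == 0:
            ca[i][j] = max(c(i - 1, 0), d)
        else:
            ca[i][j] = max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
        return ca[i][j]
    return c(n - 1, m - 1)

def is_halo(curve, standard, threshold=0.6):
    """Similar (halo present) when 1 / (1 + distance) exceeds the threshold."""
    return 1.0 / (1.0 + discrete_frechet(curve, standard)) > threshold

print(discrete_frechet([0, 1, 2], [0, 1, 2]))  # identical curves: distance 0
```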
In other embodiments of the present disclosure, in the case that a plurality of sets of gray-scale variation curves are obtained, the determining whether a hyperechoic halo exists around the lesion area based on the gray-scale variation curves includes at least one of the following ways:
A) performing weighting and processing on the similarity between the gray scale change curve and a standard hyperechoic halo curve, and determining that hyperechoic halo exists around the focus area under the condition that the obtained weighting sum shows that the gray scale change curve and the standard hyperechoic halo curve are similar curves;
in the embodiment of the disclosure, when a plurality of rays are obtained, a plurality of groups of gray level sequences and gray level change curves are correspondingly obtained. Because the ray direction corresponding to each gray scale change curve has difference, the gray scale values in the obtained gray scale sequence also have difference. In order to improve the accuracy of the gray scale change curve, the embodiment of the present disclosure fuses the gray scale change rule of each ray, wherein each ray may be assigned with a corresponding weight, and a final gray scale change curve is obtained by using the weighted sum of the gray scale change curves of each ray.
The sum of the weights of the rays is 1; the ray pointing right in the horizontal direction has the largest weight, the ray in the vertical direction has the smallest, and rays closer to the horizontal direction are weighted more heavily than rays closer to the vertical direction.
And obtaining a weighted sum gray curve by using the addition result of the products between each gray curve and the corresponding weight, and determining whether the hyperechoic halo exists or not by using the similarity between the weighted sum gray curve and the standard hyperechoic halo curve.
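The fusion of per-ray curves can be sketched as below. Using the cosine of each ray's angle from the horizontal as the raw weight (renormalized to sum to 1) is an assumption; the patent only requires horizontal-largest weights summing to 1:

```python
import math

# Sketch (assumed weighting): rays nearer the horizontal get larger weights;
# the fused curve is the weighted sum of the per-ray gray scale curves.

def fuse_curves(curves, angles_deg):
    raw = [math.cos(math.radians(a)) for a in angles_deg]
    total = sum(raw)
    weights = [r / total for r in raw]          # normalized so they sum to 1
    n = min(len(c) for c in curves)
    fused = [sum(w * c[i] for w, c in zip(weights, curves)) for i in range(n)]
    return fused, weights

fused, weights = fuse_curves([[60, 100, 70], [40, 80, 50]], [0, 60])
print(weights)  # the horizontal ray outweighs the 60-degree ray
```

The fused curve is then compared against the standard hyperechoic halo curve exactly as in the single-ray case.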
B) performing classification clustering processing on the gray scale variation curves and the focus area, and determining that a hyperechoic halo exists around the focus area when the distance between the two cluster centers obtained by clustering indicates that the gray scale variation curve is similar to the standard hyperechoic halo curve.
In addition, in the embodiment of the present disclosure, given the gray scale variation curves, curve feature extraction and clustering may be performed on the gray scale variation curves and the standard hyperechoic halo curve, where the curve features may include the gray scale sequence on the curve and the slopes between adjacent sampling points. The curve features of each gray scale variation curve and of the standard hyperechoic halo curve can be represented as feature vectors, which can then be clustered; the embodiment of the present disclosure may cluster the feature vectors into two classes by means of mean clustering and obtain the two cluster centers.
If a hyperechoic halo exists in the gray scale variation curve, the two cluster centers will be very close; accordingly, the embodiment of the present disclosure may determine that the gray scale variation curve is similar to the standard hyperechoic halo curve, i.e., that a hyperechoic halo exists, when the distance between the cluster centers is less than the distance threshold. The distance threshold of the embodiments of the present disclosure is a value less than 0.3 cm, which is not a specific limitation of the present disclosure.
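The clustering check can be sketched as below: each curve becomes a feature vector of its samples plus neighbouring slopes, a plain 2-means split produces two centers, and the curves count as similar when those centers are close. The initialisation, feature layout, and threshold are assumptions:

```python
# Sketch (assumed details): curve features, a minimal 2-means clustering,
# and the cluster-center distance test described above.

def features(curve):
    """Gray samples plus slopes between adjacent sampling points."""
    return list(curve) + [b - a for a, b in zip(curve, curve[1:])]

def two_means(vectors, iters=10):
    c0, c1 = vectors[0], vectors[-1]  # naive initialisation from the ends
    for _ in range(iters):
        groups = ([], [])
        for v in vectors:
            d0 = sum((a - b) ** 2 for a, b in zip(v, c0))
            d1 = sum((a - b) ** 2 for a, b in zip(v, c1))
            groups[0 if d0 <= d1 else 1].append(v)
        c0, c1 = [[sum(col) / len(g) for col in zip(*g)] if g else c
                  for g, c in zip(groups, (c0, c1))]
    return c0, c1

def centers_close(c0, c1, threshold):
    """Halo present when the two cluster centers fall within the threshold."""
    return sum((a - b) ** 2 for a, b in zip(c0, c1)) ** 0.5 < threshold

vecs = [features(c) for c in ([40, 90, 60], [42, 92, 62], [41, 91, 61], [200, 210, 220])]
c0, c1 = two_means(vecs)
print(centers_close(c0, c1, 5.0))  # the outlier curve pushes the centers apart
```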
In the embodiment of the present disclosure, when it is determined that hyperechoic halo exists around the lesion region, it is determined that the lesion corresponds to breast cancer, and at this time, the lesion region may be corrected using a plurality of gray scale variation curves.
Specifically, in the embodiment of the present disclosure, the pixel points at which each gray scale curve changes from the first gray scale interval to the second gray scale interval may be used as boundary points of the focal region, and the corrected focal region may be determined as the region enclosed by the updated boundary points.
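The per-ray part of this correction can be sketched as finding, along each sampled curve, the first sample that jumps from the low first interval into the bright second interval; the interval bounds below are illustrative assumptions:

```python
# Sketch (assumed interval bounds): along one ray's gray sequence, locate the
# first sample that climbs from the first (low) interval into the second
# (hyperechoic) interval; that sample indexes an updated boundary point.

def boundary_index(curve, low_max, high_min):
    """Index of the first sample jumping from <= low_max to >= high_min,
    or None if no such transition exists on this ray."""
    for i, (a, b) in enumerate(zip(curve, curve[1:])):
        if a <= low_max and b >= high_min:
            return i + 1
    return None

print(boundary_index([30, 35, 40, 120, 130, 90], low_max=50, high_min=100))
```

Mapping each returned index back to the corresponding sampling point on its ray, and connecting these points across all rays, yields the corrected lesion boundary.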
Based on the above configuration, in the embodiment of the present disclosure, a lesion area in the breast ultrasound image may be obtained, and a gray scale change curve of the lesion area in the peripheral direction of the ultrasound image is obtained with the lesion area as a reference, so as to determine whether a hyperechoic halo exists in the ultrasound image according to the gray scale change curve. The existence of the hyperechoic halo can be used as an accuracy reference for detecting the focus, and the misdiagnosis condition of the focus can be reduced by the scheme of the embodiment of the disclosure.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
Fig. 5 is a block diagram of an information processing apparatus for ultrasound hyperechoic halo of breast cancer according to an embodiment of the present disclosure, and as shown in fig. 5, the information processing apparatus for ultrasound hyperechoic halo of breast cancer includes:
a first determination module for determining a lesion area based on the breast ultrasound image;
the acquisition module is used for acquiring a gray scale change curve from the focus area to the peripheral direction of the breast ultrasonic image;
a second determination module for determining whether hyperechoic halo exists around the focal region based on the gray scale variation curve.
In some possible embodiments, the obtaining module is further configured to: determining a center of the focal region; taking the center as a starting point, and carrying out gray sampling in the peripheral direction of the breast ultrasonic image to obtain at least one group of gray sequences;
determining the gray scale change curve based on the at least one group of gray scale sequences.
In some possible embodiments, the acquisition module is further configured to expand a concentric frame in a peripheral direction of the breast ultrasound image according to the center; determining at least one ray which takes the center as a starting point and faces to the peripheral direction; and determining the intersection point of the ray and the concentric frame as a sampling point, and executing gray level sampling to obtain a gray level sequence corresponding to the ray.
In some possible embodiments, the determining the gray-scale variation curve based on the at least one group of gray-scale sequences includes: performing filtering processing on the gray sequence; and performing curve fitting on the filtered gray sequence to obtain the gray change curve.
In some possible embodiments, the second determining module is further configured to,
determining that the high-echo halo exists in the gray scale change curve under the condition that the gray scale change curve has a preset trend;
the preset trend comprises rising from a first gray scale interval to a second gray scale interval and falling from the second gray scale interval to a third gray scale interval, wherein the gray value of the second gray scale interval is larger than that of the third gray scale interval, and the gray value of the third gray scale interval is larger than that of the first gray scale interval.
In some possible embodiments, the second determining module is further configured to, in a case that a set of gray-scale variation curves is obtained, determine whether hyperechoic halo exists around the lesion area based on the gray-scale variation curves, and further includes:
determining the similarity between the gray scale change curve and a standard high echo halo curve;
and determining that hyperechoic halo exists around the lesion area under the condition that the similarity degree represents that the gray scale change curve is similar to a standard hyperechoic halo curve.
In some possible embodiments, the second determining module is further configured to, in a case that a plurality of sets of gray-scale variation curves are obtained, determine whether hyperechoic halo exists around the lesion area based on the gray-scale variation curves, and further include at least one of:
performing weighting and processing on the similarity between the gray scale change curve and a standard hyperechoic halo curve, and determining that hyperechoic halo exists around the focus area under the condition that the obtained weighting sum shows that the gray scale change curve and the standard hyperechoic halo curve are similar curves;
and performing classification clustering processing on the gray scale change curve and the focus area, and determining that high echo halo exists around the focus area under the condition that the distance between two clustering centers obtained by clustering indicates that the gray scale change curve is similar to a standard high echo halo curve.
In some possible embodiments, the apparatus further comprises:
and the correction module is used for correcting the focus area by using the gray scale change curve under the condition that the gray scale change curve determines that high-echo halo exists around the focus area.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured as the above method.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 6 illustrates a block diagram of an electronic device 800 in accordance with an embodiment of the disclosure. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or another such terminal.
Referring to fig. 6, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component thereof, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in its temperature. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 7 illustrates a block diagram of another electronic device 1900 in accordance with an embodiment of the disclosure. For example, the electronic device 1900 may be provided as a server. Referring to fig. 7, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute the computer-readable program instructions by utilizing their state information to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. An information processing method for breast cancer ultrasound hyperechoic halo, characterized by comprising the following steps:
determining a lesion area based on a breast ultrasound image;
obtaining a gray scale variation curve from the lesion area toward the periphery of the breast ultrasound image;
determining whether a hyperechoic halo exists around the lesion area based on the gray scale variation curve.
2. The method of claim 1, wherein the obtaining of the gray scale variation curve from the lesion area toward the periphery of the breast ultrasound image comprises:
determining a center of the lesion area;
taking the center as a starting point, performing gray scale sampling toward the periphery of the breast ultrasound image to obtain at least one gray scale sequence;
determining the gray scale variation curve based on the at least one gray scale sequence.
3. The method of claim 2, wherein the performing of gray scale sampling toward the periphery of the breast ultrasound image, with the center as a starting point, to obtain at least one gray scale sequence comprises:
expanding concentric frames from the center toward the periphery of the breast ultrasound image;
determining at least one ray starting from the center and directed toward the periphery;
determining the intersection points of each ray with the concentric frames as sampling points, and performing gray scale sampling to obtain a gray scale sequence corresponding to that ray.
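The radial sampling of claims 2 and 3 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function name, the ray and frame counts, and the approximation of the concentric frames as axis-aligned squares centered on the lesion are all assumptions.

```python
import math

def sample_rays(image, center, num_rays=8, num_frames=20):
    """Sample gray values along rays from the lesion center outward.

    Concentric square frames expand from the center; each ray's
    intersection with the r-th frame gives the r-th sampling point.
    `image` is a 2D list of gray values indexed as image[row][col].
    """
    h, w = len(image), len(image[0])
    cy, cx = center
    sequences = []
    for k in range(num_rays):
        angle = 2 * math.pi * k / num_rays
        dy, dx = math.sin(angle), math.cos(angle)
        seq = []
        for r in range(1, num_frames + 1):
            # Distance from the center to the r-th square frame along this ray.
            dist = r / max(abs(dy), abs(dx))
            y = int(round(cy + dist * dy))
            x = int(round(cx + dist * dx))
            if 0 <= y < h and 0 <= x < w:  # stop sampling outside the image
                seq.append(image[y][x])
        sequences.append(seq)
    return sequences
```

Each returned sequence is one ray's ordered list of gray values from the lesion center outward, ready for the filtering and fitting of claim 4.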
4. The method according to claim 2 or 3, wherein the determining of the gray scale variation curve based on the at least one gray scale sequence comprises:
filtering the gray scale sequence;
performing curve fitting on the filtered gray scale sequence to obtain the gray scale variation curve.
5. The method of claim 1, wherein the determining of whether a hyperechoic halo exists around the lesion area based on the gray scale variation curve comprises:
determining that a hyperechoic halo exists when the gray scale variation curve exhibits a preset trend;
wherein the preset trend comprises rising from a first gray scale interval to a second gray scale interval and falling from the second gray scale interval to a third gray scale interval, the gray values of the second gray scale interval being greater than those of the third gray scale interval, and the gray values of the third gray scale interval being greater than those of the first gray scale interval.
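The rise-then-fall trend of claim 5 can be tested as sketched below. The margin thresholds and the reduction of each gray scale interval to a single representative value (start, peak, end of the curve) are illustrative assumptions, not values from the patent.

```python
def has_halo_trend(curve, rise_margin=5.0, fall_margin=5.0):
    """Test the rise-then-fall pattern described in claim 5.

    The curve should climb from a low first interval (lesion interior)
    to a bright second interval (the candidate halo), then drop to an
    intermediate third interval (surrounding tissue), so that
    second > third > first.
    """
    peak_idx = max(range(len(curve)), key=lambda i: curve[i])
    start, peak, end = curve[0], curve[peak_idx], curve[-1]
    rises = peak - start >= rise_margin   # first -> second interval
    falls = peak - end >= fall_margin     # second -> third interval
    ordered = end > start                 # third interval above first
    interior_peak = 0 < peak_idx < len(curve) - 1
    return rises and falls and ordered and interior_peak
```

A monotonically rising curve (peak at the boundary) is rejected, since it lacks the fall back to the third interval.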
6. The method of claim 5, wherein, in the case of obtaining one set of gray scale variation curves, the determining of whether a hyperechoic halo exists around the lesion area further comprises:
determining the similarity between the gray scale variation curve and a standard hyperechoic halo curve;
determining that a hyperechoic halo exists around the lesion area when the similarity indicates that the gray scale variation curve is similar to the standard hyperechoic halo curve.
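One way to realize the similarity test of claim 6 is a Pearson correlation against the standard halo template, as sketched below; the correlation measure and the 0.8 cut-off are assumed here, since the patent does not fix a particular similarity metric.

```python
import numpy as np

def halo_similarity(curve, template):
    """Pearson correlation between an observed gray scale variation
    curve and a standard hyperechoic halo template curve; values near
    1 indicate a close match in shape."""
    return float(np.corrcoef(np.asarray(curve, dtype=float),
                             np.asarray(template, dtype=float))[0, 1])

def is_halo_like(curve, template, threshold=0.8):
    # The 0.8 cut-off is an assumed value, not taken from the patent.
    return halo_similarity(curve, template) >= threshold
```

A curve that rises and falls in step with the template scores near +1; an inverted curve scores near -1 and is rejected.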
7. The method of claim 5 or 6, wherein, when a plurality of sets of gray scale variation curves are obtained, the determining of whether a hyperechoic halo exists around the lesion area based on the gray scale variation curves further comprises at least one of:
performing a weighted summation of the similarities between the gray scale variation curves and a standard hyperechoic halo curve, and determining that a hyperechoic halo exists around the lesion area when the resulting weighted sum indicates that the gray scale variation curves are similar to the standard hyperechoic halo curve;
performing classification and clustering on the gray scale variation curves and the lesion area, and determining that a hyperechoic halo exists around the lesion area when the distance between the two cluster centers obtained by the clustering indicates that the gray scale variation curves are similar to the standard hyperechoic halo curve.
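The weighted-summation branch of claim 7 can be sketched as below: each radial curve contributes a per-curve similarity score, and a weighted sum over all curves makes the final halo decision. Uniform weights and the 0.8 threshold are assumptions, not values from the patent.

```python
import numpy as np

def weighted_halo_decision(similarities, weights=None, threshold=0.8):
    """Combine per-curve similarity scores into one halo decision.

    `similarities` holds one similarity-to-template score per radial
    gray scale variation curve; their weighted sum decides whether a
    hyperechoic halo surrounds the lesion area.
    """
    s = np.asarray(similarities, dtype=float)
    if weights is None:
        w = np.full(len(s), 1.0 / len(s))  # default: uniform weights
    else:
        w = np.asarray(weights, dtype=float)
    return bool(np.dot(w, s) >= threshold)
```

Weights could instead emphasize rays in directions where the halo is clinically most informative; the uniform default treats all rays equally.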
8. The method of claim 1, further comprising:
correcting the lesion area by using the gray scale variation curve when it is determined, based on the gray scale variation curve, that a hyperechoic halo exists around the lesion area.
9. An information processing device for breast cancer ultrasound hyperechoic halo, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method of any one of claims 1 to 8.
10. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 8.
CN202110609095.7A 2021-06-01 2021-06-01 Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo Pending CN113487537A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110609095.7A CN113487537A (en) 2021-06-01 2021-06-01 Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo


Publications (1)

Publication Number Publication Date
CN113487537A true CN113487537A (en) 2021-10-08

Family

ID=77934257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110609095.7A Pending CN113487537A (en) 2021-06-01 2021-06-01 Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo

Country Status (1)

Country Link
CN (1) CN113487537A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination