CN114171197A - Method and related equipment for predicting HER2 state of breast cancer - Google Patents

Method and related equipment for predicting HER2 state of breast cancer Download PDF

Info

Publication number
CN114171197A
CN114171197A (application CN202111338891.8A)
Authority
CN
China
Prior art keywords
her2
expression probability
breast
image
classification model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111338891.8A
Other languages
Chinese (zh)
Other versions
CN114171197B (en)
Inventor
刘碧华
谢晓彤
黄炳升
樊雅恒
王铭宇
林楚旋
潘浩瑜
郭媛
唐文洁
陈思义
胡闻珂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen University
Dongguan Peoples Hospital
Original Assignee
Shenzhen University
Dongguan Peoples Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen University and Dongguan Peoples Hospital
Priority to CN202111338891.8A
Publication of CN114171197A
Application granted
Publication of CN114171197B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/048 - Activation functions
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Primary Health Care (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The application discloses a method and related equipment for predicting the HER2 status of breast cancer. The method comprises inputting a breast MR image to be predicted into a first classification model and a second classification model respectively, and controlling the first classification model and the second classification model to determine whether the HER2 category corresponding to the breast MR image is HER2 high expression, HER2 low expression or HER2 zero expression. The first classification model and the second classification model learn different aspects of the breast MR image carrying breast lesion features, so that the lesion information mined from these different aspects is correlated; the HER2 category is then determined as HER2 high expression, HER2 low expression or HER2 zero expression from the correlated lesion information, thereby improving the accuracy of identifying the HER2 category.

Description

Method and related equipment for predicting HER2 state of breast cancer
Technical Field
The application relates to the technical field of biomedical engineering, and in particular to a method and related equipment for predicting the HER2 status of breast cancer.
Background
Breast cancer (BC) is a heterogeneous disease that has surpassed lung cancer to become the most common cancer worldwide and is also a leading cause of cancer death in women. Human Epidermal Growth Factor Receptor 2 (HER2) is a proto-oncogene encoding an epidermal growth factor receptor with tyrosine kinase activity and is the only predictive marker for targeted therapy of breast tumors; the HER2 status plays a crucial role in the selection of breast cancer treatment regimens and in prognosis evaluation. According to the ASCO/CAP guideline, BC is classified as HER2 high expression or HER2 negative, and HER2 negative can be further classified as HER2 low expression or HER2 zero expression as determined by standardized central pathology. HER2 high-expressing BC patients may benefit from anti-HER2 therapy, while two HER2-directed antibody-drug conjugates (ADCs) carrying chemotherapeutic payloads, trastuzumab deruxtecan (T-DXd) and trastuzumab duocarmazine (SYD985), have shown very promising therapeutic activity in HER2 low-expressing BC patients. Therefore, objectively and accurately distinguishing patients with HER2 high expression, HER2 low expression and HER2 zero expression is of great significance for the treatment and prognosis of BC patients.
Currently, HER2 expression is routinely evaluated clinically by pathology. According to the 2019 edition of the Chinese breast cancer HER2 detection guideline, immunohistochemistry (IHC) is recommended to detect the expression level of the HER2 protein when a patient undergoes HER2 pathological evaluation. The IHC method first requires an invasive biopsy; a doctor then evaluates the staining degree of the invasive cancer cells by experience and finally obtains the HER2 evaluation result. However, invasive examination cannot be performed repeatedly, the change of the HER2 level cannot be observed dynamically, the evaluation result is highly subjective, and a quantitative, objective result cannot be obtained. Therefore, finding a non-invasive method for HER2 status assessment is an urgent clinical challenge.
Moreover, the prior art that evaluates the HER2 status with traditional radiomics cannot sufficiently mine the internal connections within the data and cannot accurately predict the HER2 status, and therefore suffers from prediction errors.
Thus, the prior art has yet to be improved and enhanced.
Disclosure of Invention
In order to solve the above technical problem, a first aspect of the embodiments of the present application provides a method for predicting breast cancer HER2 status, the method including:
respectively inputting a breast MR image to be predicted into a first classification model and a second classification model;
controlling the first classification model and the second classification model to determine a HER2 category corresponding to the breast MR image, wherein the HER2 category comprises HER2 high expression, HER2 low expression or HER2 zero expression.
The method for predicting breast cancer HER2 status, wherein the controlling the first classification model and the second classification model to determine the HER2 category corresponding to the breast MR image specifically includes:
controlling the first classification model to determine the high expression probability of HER2 and the negative expression probability of HER2 corresponding to the breast MR image;
controlling the second classification model to determine a corresponding predicted HER2 low expression probability and a predicted HER2 zero expression probability of the breast MR image;
determining the HER2 category corresponding to the breast MR image based on the HER2 high expression probability, the HER2 negative expression probability, the predicted HER2 low expression probability and the predicted HER2 zero expression probability.
The method for predicting the breast cancer HER2 status, wherein the determining the HER2 category corresponding to the breast MR image based on the HER2 negative expression probability, the predicted HER2 low expression probability and the predicted HER2 zero expression probability specifically comprises:
multiplying the HER2 negative expression probability by the predicted HER2 low expression probability, and the HER2 negative expression probability by the predicted HER2 zero expression probability, to obtain a HER2 low expression probability and a HER2 zero expression probability respectively;
determining the HER2 category corresponding to the breast MR image based on the high HER2 expression probability, the low HER2 expression probability and the zero HER2 expression probability.
The method for predicting the breast cancer HER2 status, wherein the first classification model comprises a radiomics module, a semantic segmentation module and a prediction module; the controlling the first classification model to determine the HER2 high expression probability and the HER2 negative expression probability corresponding to the breast MR image specifically includes:
controlling the radiomics module to determine the radiomics features corresponding to the breast MR image, and determining a first HER2 high expression probability and a first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features;
controlling the semantic segmentation module to determine a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image;
controlling the prediction module to determine a high HER2 expression probability and a negative HER2 expression probability corresponding to the breast MR image based on the first high HER2 expression probability, the first negative HER2 expression probability, the second high HER2 expression probability and the second negative HER2 expression probability.
The method for predicting the breast cancer HER2 status, wherein the radiomics module comprises a lesion delineation unit, a radiomics feature extraction unit, a first feature screening unit and a first prediction unit; the controlling the radiomics module to determine the radiomics features corresponding to the breast MR image, and the determining the first HER2 high expression probability and the first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features specifically includes:
controlling the lesion delineation unit to extract a region of interest in the breast MR image;
controlling the radiomics feature extraction unit to extract image features of the region of interest so as to obtain candidate radiomics features;
controlling the first feature screening unit to perform feature screening on the candidate radiomics features to obtain radiomics features;
controlling the first prediction unit to determine the first HER2 high expression probability and the first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features.
The method for predicting the breast cancer HER2 status, wherein the semantic segmentation module comprises a feature extraction unit, a second feature screening unit and a second prediction unit; the controlling the semantic segmentation module to determine the second HER2 high expression probability and the second HER2 negative expression probability corresponding to the breast MR image specifically includes:
controlling the feature extraction unit to determine candidate semantic features corresponding to the breast MR image;
controlling the second feature screening unit to screen the candidate semantic features to obtain semantic features corresponding to the breast MR image;
controlling the second prediction unit to determine a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image based on the semantic features.
The method for predicting the breast cancer HER2 status, wherein the controlling the feature extraction unit to determine the candidate semantic features corresponding to the breast MR image specifically includes:
controlling the feature extraction unit to determine a plurality of slice feature maps based on the breast MR image, wherein the plurality of slice feature maps correspond one-to-one to a plurality of slice images included in the breast MR image;
dividing the plurality of slice feature maps into two feature clusters by clustering, and removing the feature cluster containing fewer slice feature maps from the two feature clusters to obtain a plurality of target slice features;
and fusing the target slice features to obtain the candidate semantic features corresponding to the breast MR image.
The method for predicting the breast cancer HER2 status, wherein the model structure of the second classification model is the same as that of the first classification model; the second classification model differs from the first classification model in that the first feature screening unit in the first classification model screens the candidate radiomics features by Lasso regression and the chi-square test, and the second feature screening unit in the first classification model screens the candidate semantic features by Lasso regression and mutual information, whereas the second feature screening unit in the second classification model screens the candidate semantic features by Lasso regression and the chi-square test.
A second aspect of embodiments of the present application provides a computer readable storage medium having stored thereon one or more programs executable by one or more processors to perform the steps of the method for predicting breast cancer HER2 status as described in any of the above.
A third aspect of the embodiments of the present application provides a terminal device, including: a processor, a memory, and a communication bus; the memory has stored thereon a computer readable program executable by the processor;
the communication bus enables connection and communication between the processor and the memory;
the processor, when executing the computer readable program, carries out the steps in a method of predicting breast cancer HER2 status as described in any one of the above.
Beneficial effects: compared with the prior art, the present application provides a method and related equipment for predicting the breast cancer HER2 status. The method comprises inputting a breast MR image to be predicted into a first classification model and a second classification model respectively, and controlling the first classification model and the second classification model to determine whether the HER2 category corresponding to the breast MR image is HER2 high expression, HER2 low expression or HER2 zero expression. The first classification model and the second classification model learn different aspects of the breast MR image carrying breast lesion features, so that the lesion information mined from these different aspects is correlated; the HER2 category is then determined as HER2 high expression, HER2 low expression or HER2 zero expression from the correlated lesion information, thereby improving the accuracy of identifying the HER2 category.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a flow chart of a method for predicting breast cancer HER2 status provided herein.
Figure 2 is a schematic representation of HER2 class in the methods of predicting breast cancer HER2 status provided herein.
Fig. 3 is a model structure diagram of a first classification model in the method for predicting breast cancer HER2 status provided in the present application.
Fig. 4 is a schematic model structure diagram of a semantic segmentation model in the method for predicting breast cancer HER2 status provided in the present application.
Fig. 5 is a flowchart of a clustering algorithm in the method for predicting breast cancer HER2 status provided herein.
Fig. 6 is a schematic structural diagram of a terminal device provided in the present application.
Detailed Description
The present application provides a method and related apparatus for predicting breast cancer HER2 status, and in order to make the objects, technical solutions and effects of the present application clearer and clearer, the present application will be further described in detail with reference to the drawings and examples. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any combination of one or more of the associated listed items.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
It should be understood that, the sequence numbers and sizes of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process is determined by its function and inherent logic, and should not constitute any limitation on the implementation process of this embodiment.
The inventor has found that breast cancer (BC) is a heterogeneous disease that has surpassed lung cancer to become the most common cancer worldwide and is also a leading cause of cancer death in women. Human Epidermal Growth Factor Receptor 2 (HER2) is a proto-oncogene encoding an epidermal growth factor receptor with tyrosine kinase activity and is the only predictive marker for targeted therapy of breast tumors; the HER2 status plays a crucial role in the selection of breast cancer treatment regimens and in prognosis evaluation. According to the ASCO/CAP guideline, BC is classified as HER2 high expression or HER2 negative, and HER2 negative can be further classified as HER2 low expression or HER2 zero expression as determined by standardized central pathology. HER2 high-expressing BC patients may benefit from anti-HER2 therapy, while two HER2-directed antibody-drug conjugates (ADCs) carrying chemotherapeutic payloads, trastuzumab deruxtecan (T-DXd) and trastuzumab duocarmazine (SYD985), have shown very promising therapeutic activity in HER2 low-expressing BC patients. Therefore, objectively and accurately distinguishing patients with HER2 high expression, HER2 low expression and HER2 zero expression is of great significance for the treatment and prognosis of BC patients.
Currently, HER2 expression is routinely evaluated clinically by pathology. According to the 2019 edition of the Chinese breast cancer HER2 detection guideline, immunohistochemistry (IHC) is recommended to detect the expression level of the HER2 protein when a patient undergoes HER2 pathological evaluation. The IHC method first requires an invasive biopsy; a doctor then evaluates the staining degree of the invasive cancer cells by experience and finally obtains the HER2 evaluation result. However, invasive examination cannot be performed repeatedly, the change of the HER2 level cannot be observed dynamically, the evaluation result is highly subjective, and a quantitative, objective result cannot be obtained. Therefore, finding a non-invasive method for HER2 status assessment is an urgent clinical challenge.
For this reason, traditional radiomics has been used to evaluate the HER2 status; however, traditional radiomics relies on hand-crafted features, which are not sufficient to fully mine the internal connections within the data and cannot accurately evaluate the HER2 status, so misdiagnosis is easily caused.
In order to solve the above problem, in the embodiments of the present application, the breast MR image to be predicted is respectively input into a first classification model and a second classification model, and the first classification model and the second classification model are controlled to determine whether the HER2 category corresponding to the breast MR image is HER2 high expression, HER2 low expression or HER2 zero expression. The first classification model and the second classification model learn different aspects of the breast MR image carrying breast lesion features, so that the lesion information mined from these different aspects is correlated; the HER2 category is then determined as HER2 high expression, HER2 low expression or HER2 zero expression from the correlated lesion information, thereby improving the accuracy of identifying the HER2 category.
The following further describes the content of the application by describing the embodiments with reference to the attached drawings.
The present embodiment provides a method for predicting breast cancer HER2 status, as shown in fig. 1, the method comprising:
S10, respectively inputting the breast MR image to be predicted into the first classification model and the second classification model.
Specifically, the breast MR (magnetic resonance) image to be predicted is a breast MR image of a breast cancer patient acquired in advance by an MR acquisition device, and the breast MR image carries a breast cancer region. The first classification model and the second classification model are both preset and operate in parallel, and the input of each is the breast MR image. The first classification model is used for predicting the HER2 high expression probability and the HER2 negative expression probability of the breast tumor region in the breast MR image, and the second classification model is used for predicting the predicted HER2 low expression probability and the predicted HER2 zero expression probability of the breast tumor region in the breast MR image.
S20, controlling the first classification model and the second classification model to determine the HER2 category corresponding to the breast MR image.
Specifically, HER2 (Human Epidermal Growth Factor Receptor 2), located on chromosome 17q21, is a proto-oncogene encoding an epidermal growth factor receptor with tyrosine kinase activity and is the only predictive marker for HER2-targeted therapy. The HER2 categories comprise HER2 high expression, HER2 low expression and HER2 zero expression, which represent different HER2 states; the HER2 status of the breast tumor region can be determined through the predicted HER2 category, so that a doctor can select a treatment mode based on the predicted HER2 category. In pathology, immunohistochemistry (IHC) results are graded as IHC 0, IHC 1+, IHC 2+ and IHC 3+: IHC 3+, or IHC 2+ with a positive in situ hybridization (ISH+) result, is read as HER2 high expression; IHC 1+, or IHC 2+ with a negative ISH result, is read as HER2 low expression; and IHC 0 is read as HER2 zero expression.
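For illustration only, the following minimal Python sketch maps an IHC grade and ISH result to a HER2 category according to the reading rules above; the helper function is hypothetical and not part of the patent.

```python
def her2_category(ihc: str, ish_positive: bool = False) -> str:
    """Map an IHC grade ('0', '1+', '2+', '3+') and an ISH result to a HER2 category."""
    if ihc == "3+" or (ihc == "2+" and ish_positive):
        return "HER2 high expression"
    if ihc == "1+" or (ihc == "2+" and not ish_positive):
        return "HER2 low expression"
    if ihc == "0":
        return "HER2 zero expression"
    raise ValueError(f"unknown IHC grade: {ihc}")


print(her2_category("2+", ish_positive=False))  # HER2 low expression
```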
In an implementation manner of this embodiment, the controlling the first classification model and the second classification model to determine the HER2 category corresponding to the breast MR image specifically includes:
S21, controlling the first classification model to determine the HER2 high expression probability and the HER2 negative expression probability corresponding to the breast MR image;
S22, controlling the second classification model to determine the predicted HER2 low expression probability and the predicted HER2 zero expression probability corresponding to the breast MR image;
S23, determining the HER2 category corresponding to the breast MR image based on the HER2 high expression probability, the HER2 negative expression probability, the predicted HER2 low expression probability and the predicted HER2 zero expression probability.
In particular, the first classification model is used for predicting the HER2 high expression probability and the HER2 negative expression probability of the breast tumor region in the breast MR image, wherein the sum of the HER2 high expression probability and the HER2 negative expression probability equals 1. The second classification model is used for predicting the predicted HER2 low expression probability and the predicted HER2 zero expression probability corresponding to the breast MR image, wherein the sum of the predicted HER2 low expression probability and the predicted HER2 zero expression probability equals 1. Furthermore, as shown in figure 2, HER2 low expression and HER2 zero expression only occur when the HER2 status is negative and not when it is high, so the HER2 categories include HER2 high expression, HER2 low expression and HER2 zero expression. Thus, the HER2 low expression probability of the breast MR image is determined based on the predicted HER2 low expression probability and the HER2 negative expression probability, and the HER2 zero expression probability is determined based on the predicted HER2 zero expression probability and the HER2 negative expression probability.
Based on this, in an implementation manner of this embodiment, the determining, based on the HER2 negative expression probability, the predicted HER2 low expression probability, and the predicted HER2 zero expression probability, the HER2 category corresponding to the breast MR image specifically includes:
multiplying the HER2 negative expression probability by the predicted HER2 low expression probability, and the HER2 negative expression probability by the predicted HER2 zero expression probability, to obtain a HER2 low expression probability and a HER2 zero expression probability respectively;
determining the HER2 category corresponding to the breast MR image based on the high HER2 expression probability, the low HER2 expression probability and the zero HER2 expression probability.
In particular, the HER2 low expression probability may be the product of the HER2 negative expression probability and the predicted HER2 low expression probability, and the HER2 zero expression probability may be the product of the HER2 negative expression probability and the predicted HER2 zero expression probability, so that the sum of the HER2 high expression probability, the HER2 low expression probability and the HER2 zero expression probability equals 1. Furthermore, after the HER2 high expression probability, the HER2 low expression probability and the HER2 zero expression probability are determined, the HER2 category corresponding to the highest of the three probabilities may be used as the HER2 category corresponding to the breast MR image; alternatively, the three probabilities may be compared with a preset threshold, and the HER2 category whose probability exceeds the preset threshold may be used as the HER2 category corresponding to the breast MR image, and so on.
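A minimal Python sketch of this probability fusion, assuming the two models have already produced their probabilities; the function name and numerical values are illustrative, not taken from the patent.

```python
def fuse_her2_probabilities(p_high: float, p_negative: float,
                            p_low_given_negative: float,
                            p_zero_given_negative: float) -> dict:
    """Combine the outputs of the two classification models into three HER2 probabilities."""
    return {
        "HER2 high expression": p_high,
        "HER2 low expression": p_negative * p_low_given_negative,
        "HER2 zero expression": p_negative * p_zero_given_negative,
    }


# Example: first model gives high=0.2, negative=0.8; second model gives low=0.7, zero=0.3.
probs = fuse_her2_probabilities(0.2, 0.8, 0.7, 0.3)
category = max(probs, key=probs.get)   # take the category with the highest probability
print(probs, category)                 # the three probabilities sum to 1.0
```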
In one implementation manner of this embodiment, as shown in fig. 3, the first classification model includes a radiomics module, a semantic segmentation module and a prediction module, and the controlling the first classification model to determine the HER2 high expression probability and the HER2 negative expression probability corresponding to the breast MR image specifically includes:
controlling the radiomics module to determine the radiomics features corresponding to the breast MR image, and determining a first HER2 high expression probability and a first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features;
controlling the semantic segmentation module to determine a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image;
controlling the prediction module to determine a high HER2 expression probability and a negative HER2 expression probability corresponding to the breast MR image based on the first high HER2 expression probability, the first negative HER2 expression probability, the second high HER2 expression probability and the second negative HER2 expression probability.
Specifically, the radiomics module is used for obtaining the radiomics features of the breast MR image and predicting a first HER2 high expression probability and a first HER2 negative expression probability corresponding to the breast MR image based on the obtained radiomics features, and the semantic segmentation module is used for obtaining the semantic features of the breast MR image and determining a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image based on the obtained semantic features. The prediction module is used for fusing the first HER2 high expression probability, the first HER2 negative expression probability, the second HER2 high expression probability and the second HER2 negative expression probability to obtain the HER2 high expression probability and the HER2 negative expression probability. It can be understood that the HER2 high expression probability and the HER2 negative expression probability are determined based on both the radiomics features and the semantic features; that is, the first classification model learns the radiomics features and the semantic features in the breast MR image and fully mines the image features in the breast MR image, thereby improving the accuracy of the HER2 high expression probability and the HER2 negative expression probability. In one implementation, the average of the first HER2 high expression probability and the second HER2 high expression probability is taken as the HER2 high expression probability, and the average of the first HER2 negative expression probability and the second HER2 negative expression probability is taken as the HER2 negative expression probability. Of course, in practical applications, the prediction module may also use a weighted combination of the first HER2 high expression probability and the second HER2 high expression probability as the HER2 high expression probability, and a weighted combination of the first HER2 negative expression probability and the second HER2 negative expression probability as the HER2 negative expression probability.
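A tiny numerical illustration of the averaging fusion described above; the values are made up for demonstration only.

```python
# First (radiomics) and second (semantic) branch outputs for one breast MR image
p_high_1, p_neg_1 = 0.62, 0.38
p_high_2, p_neg_2 = 0.58, 0.42

p_high = (p_high_1 + p_high_2) / 2      # HER2 high expression probability
p_negative = (p_neg_1 + p_neg_2) / 2    # HER2 negative expression probability
print(p_high, p_negative)               # the two averages still sum to 1
```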
In one implementation manner of this embodiment, the radiomics module includes a lesion delineation unit, a radiomics feature extraction unit, a first feature screening unit and a first prediction unit; the controlling the radiomics module to determine the radiomics features corresponding to the breast MR image, and the determining the first HER2 high expression probability and the first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features specifically includes:
controlling the lesion delineation unit to extract a region of interest in the breast MR image;
controlling the radiomics feature extraction unit to extract image features of the region of interest so as to obtain candidate radiomics features;
controlling the first feature screening unit to perform feature screening on the candidate radiomics features to obtain radiomics features;
controlling the first prediction unit to determine the first HER2 high expression probability and the first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features.
In particular, the lesion delineation unit is configured to extract a region of interest in the breast MR image, wherein the region of interest comprises the breast tumor region. The breast MR image is obtained by scanning the whole breast, and the lesion area occupies only a small part of the breast MR image. In order to reduce the redundant information in the breast MR image, the breast MR image is preprocessed after acquisition so as to extract the breast lesion area and remove the redundant information, which improves the accuracy of the subsequent first HER2 high expression probability and first HER2 negative expression probability. In one implementation, the process of extracting the region of interest by the lesion delineation unit may be: the breast MR image is input into the lesion delineation unit, and the lesion delineation unit delineates and extracts the breast tumor region based on the transverse, coronal and sagittal planes of the breast MR image, so as to obtain the region of interest corresponding to the breast MR image.
The radiomics feature extraction unit is configured to extract the radiomics features in the region of interest. As shown in fig. 3, the radiomics features may be determined based on the region of interest, derivative maps of the region of interest generated by wavelet filtering, and derivative maps of the region of interest generated by Gaussian filtering, where the wavelet filtering generates 8 derivative images and the Gaussian filtering generates 3 derivative images. Furthermore, as shown in table 1, the radiomics features may include 1130 image features consisting of first-order features, texture features and shape features, wherein the first-order features and the texture features are extracted from the region of interest, the derivative maps generated by wavelet filtering and the derivative maps generated by Gaussian filtering, respectively, and the shape features are extracted from the original image (the breast MR image).
TABLE 1 Radiomics features (first-order, texture and shape features; provided as an image in the original document)
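For context, a feature set of this kind can be computed with the open-source PyRadiomics package; the sketch below is an illustration under that assumption, since the patent does not name a specific library, and the file names are placeholders.

```python
from radiomics import featureextractor  # pip install pyradiomics

# Extract first-order, texture and shape features from the original image,
# plus wavelet- and Laplacian-of-Gaussian-filtered derivative images.
extractor = featureextractor.RadiomicsFeatureExtractor()
extractor.enableAllFeatures()
extractor.enableImageTypes(
    Original={},
    Wavelet={},                          # 8 wavelet decompositions
    LoG={"sigma": [1.0, 3.0, 5.0]},      # 3 Gaussian-filtered derivative images
)

# image.nrrd: breast MR image; mask.nrrd: delineated region of interest (placeholder paths)
features = extractor.execute("image.nrrd", "mask.nrrd")
candidate_radiomics = {k: v for k, v in features.items() if not k.startswith("diagnostics")}
print(len(candidate_radiomics), "candidate radiomics features")
```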
The first feature screening unit is used for screening the candidate radiomics features obtained by the radiomics feature extraction unit and removing useless features, so as to reduce the amount of computation of the subsequent first prediction unit. The first feature screening unit may screen the candidate radiomics features by Lasso regression and the chi-square test, and the specific process may be as follows: first, the candidate radiomics features are screened by Lasso linear regression; then, the screened candidate radiomics features are screened again by the chi-square test to obtain the radiomics features. In this embodiment, the Lasso linear regression compresses the regression coefficients of the image features among the candidate radiomics features and drives the regression coefficients of some image features to 0, so that the image features whose regression coefficients are 0 are removed. Then, the chi-square test measures the relevance between the image features and the classification, and the image features most likely to be irrelevant to the classification are removed based on this relevance, thereby improving the effectiveness of the screened image features.
In one implementation, the chi-square test measures the correlation of categorical (discrete) variables by comparing the degree of agreement, or goodness of fit, between the theoretical frequency and the actual frequency, and is used here to analyze the correlation between the image features and the classification. The expression of the chi-square test may be:
χ² = Σ (f_o − f_e)² / f_e
wherein f_o represents the actual (observed) frequency and f_e represents the expected frequency.
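A minimal scikit-learn sketch of the two-stage screening described above (Lasso followed by a chi-square test); the library choice, the alpha value and the number of kept features are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_selection import SelectKBest, chi2


def screen_radiomics(X: np.ndarray, y: np.ndarray, k: int = 20) -> np.ndarray:
    """Two-stage screening: Lasso drops features with zero coefficients,
    then a chi-square test keeps the k features most related to the label."""
    # Stage 1: Lasso linear regression; features whose coefficient is driven to 0 are removed.
    lasso = Lasso(alpha=0.01).fit(X, y)
    keep = np.flatnonzero(lasso.coef_ != 0)
    X_lasso = X[:, keep]

    # Stage 2: chi-square test (requires non-negative inputs, hence the scaling).
    X_scaled = MinMaxScaler().fit_transform(X_lasso)
    selector = SelectKBest(chi2, k=min(k, X_scaled.shape[1])).fit(X_scaled, y)
    return keep[selector.get_support(indices=True)]  # indices of the retained radiomics features

# usage: selected = screen_radiomics(X_candidates, her2_labels)
```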
The first prediction unit is used for predicting the first HER2 high expression probability and the first HER2 negative expression probability based on the radiomics features, wherein the first prediction unit is a binary classification model. In a specific implementation manner, the first prediction unit comprises an LR classifier, which is a classifier combining linear regression with a Sigmoid function. Thus, the expressions by which the first prediction unit determines the first HER2 high expression probability and the first HER2 negative expression probability may be:
P(HER2 high | x) = 1 / (1 + e^(−(w·x + b)))
P(HER2 negative | x) = 1 − P(HER2 high | x) = e^(−(w·x + b)) / (1 + e^(−(w·x + b)))
wherein x ∈ Rⁿ represents the feature vector input to the first prediction unit, w ∈ Rⁿ represents the weight vector, and b represents the bias.
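A sketch of such an LR classifier using scikit-learn; the tooling and the randomly generated stand-in data are assumptions for illustration, not the patent's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 20))      # screened radiomics features (illustrative data)
y_train = rng.integers(0, 2, size=40)    # 1 = HER2 high expression, 0 = HER2 negative

lr = LogisticRegression(max_iter=1000).fit(X_train, y_train)

x_new = rng.normal(size=(1, 20))         # screened features of a new breast MR image
p_negative, p_high = lr.predict_proba(x_new)[0]   # class order follows lr.classes_ == [0, 1]
print(f"first HER2 high expression probability: {p_high:.3f}")
print(f"first HER2 negative expression probability: {p_negative:.3f}")
```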
In one implementation manner of this embodiment, the semantic segmentation module includes a feature extraction unit, a second feature screening unit, and a second prediction unit; the controlling the semantic segmentation module to determine the second HER2 high expression probability and the second HER2 negative expression probability corresponding to the breast MR image specifically includes:
controlling the feature extraction unit to determine candidate semantic features corresponding to the breast MR image;
controlling the second feature screening unit to screen the candidate semantic features to obtain semantic features corresponding to the breast MR image;
controlling the second prediction unit to determine a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image based on the semantic features.
Specifically, the feature extraction unit is configured to extract the candidate semantic features of the breast MR image. The feature extraction unit may adopt an encoder-decoder structure, that is, it may include an encoder and a decoder; the input of the encoder is the breast MR image, and its output is a plurality of first feature maps whose feature dimensions differ from one another. The encoder is used for obtaining the plurality of first feature maps from the breast MR image, and the decoder learns the semantic features of the breast tumor region in the breast MR image by up-sampling the first feature maps, so as to obtain the candidate semantic features.
In one implementation, the encoder includes a convolution unit and a plurality of down-sampling units cascaded in sequence; the decoder comprises a plurality of up-sampling units and a convolution layer cascaded in sequence, wherein the down-sampling units correspond one-to-one to the up-sampling units, and each down-sampling unit is connected to its corresponding up-sampling unit by a skip connection. The convolution unit in the encoder is also connected to the convolution layer in the decoder by a skip connection. In this embodiment, the skip connections between the down-sampling units and the up-sampling units, and between the convolution unit and the convolution layer, allow high-resolution and low-resolution image features to be fused, so that the breast tumor region can be segmented finely, more candidate semantic features can be extracted, and the performance of the semantic segmentation module can be improved. In addition, the skip connections alleviate the vanishing-gradient problem in deep networks and facilitate back-propagation of gradients, which accelerates training; fusing feature information of different dimensions acquired by the encoder also improves feature accuracy.
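A minimal PyTorch sketch of one level of such a skip-connected encoder-decoder, given as an illustration of the structure described above rather than the patent's exact network; the class name and channel sizes are placeholders.

```python
import torch
import torch.nn as nn

def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    # convolution + group normalization + ReLU, matching the conv blocks described below
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1),
                         nn.GroupNorm(8, out_ch), nn.ReLU(inplace=True))

class TinyEncoderDecoder(nn.Module):
    """One down-sampling and one up-sampling unit joined by a skip connection."""
    def __init__(self):
        super().__init__()
        self.conv_unit = nn.Sequential(conv_block(1, 16), conv_block(16, 16))   # encoder conv unit
        self.down = nn.Sequential(nn.MaxPool2d(2), conv_block(16, 32), conv_block(32, 32))
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.up_conv = conv_block(32 + 16, 16)          # after concatenating the skip connection
        self.out_conv = nn.Conv2d(16, 2, 1)             # final convolution layer of the decoder

    def forward(self, x):
        c1 = self.conv_unit(x)                          # high-resolution features
        d1 = self.down(c1)                              # low-resolution features
        u1 = self.up_conv(torch.cat([self.up(d1), c1], dim=1))   # fuse via skip connection
        return self.out_conv(u1)

seg = TinyEncoderDecoder()
print(seg(torch.randn(1, 1, 64, 64)).shape)             # torch.Size([1, 2, 64, 64])
```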
In one exemplary implementation, as shown in fig. 4, there are 4 down-sampling units, P1, P2, P3 and P4; each down-sampling unit comprises a pooling layer (pool) and two convolution blocks cascaded in sequence, and each convolution block comprises a convolution layer (conv), a normalization layer (GN) and an activation layer (ReLU). There are 4 up-sampling units, U1, U2, U3 and U4; each up-sampling unit comprises an up-sampling layer (upsample) and a convolution block, each down-sampling unit is connected to the corresponding up-sampling unit by a concatenate skip connection, and the model structure of the convolution block in the up-sampling unit is the same as that in the down-sampling unit. Further, the convolution unit C1 in the encoder includes two convolution blocks, whose model structure is the same as that of the convolution blocks in the down-sampling units. In addition, in the training stage the feature extraction unit may adopt Focal loss as the loss function to evaluate the difference between the prediction of the segmentation stage and the actual segmentation result, providing a training direction for the segmentation model. The Focal loss is expressed as follows:
FL(p_t) = −α_t (1 − p_t)^γ log(p_t)
wherein p_t is the predicted probability of the true class, α_t is a class-balancing weight and γ is the focusing parameter (standard form of the Focal loss).
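A short PyTorch sketch of a binary Focal loss in this standard form; the patent's exact formulation and hyper-parameters are not given, so the values below are illustrative.

```python
import torch
import torch.nn.functional as F

def binary_focal_loss(logits: torch.Tensor, targets: torch.Tensor,
                      alpha: float = 0.25, gamma: float = 2.0) -> torch.Tensor:
    """Standard binary Focal loss: down-weights easy pixels so training focuses on hard ones."""
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")  # -log(p_t)
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)            # probability of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()

# Example on a dummy segmentation map
logits = torch.randn(1, 1, 64, 64)
mask = torch.randint(0, 2, (1, 1, 64, 64)).float()
print(binary_focal_loss(logits, mask))
```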
in an implementation manner of this embodiment, the controlling the feature extraction unit to determine the candidate semantic features corresponding to the MR image specifically includes:
controlling the feature extraction unit to determine a plurality of slice feature maps based on the mammary gland MR image, wherein the plurality of slice feature maps correspond to a plurality of slice images included in the mammary gland MR image one by one;
dividing the plurality of slice characteristic graphs into two characteristic clusters based on a clustering mode, and removing the characteristic clusters with less slice characteristic graphs from the two characteristic clusters to obtain a plurality of target slice characteristics;
and fusing the target slice features to obtain candidate semantic features corresponding to the MR image.
Specifically, the breast MR image includes a plurality of slice images, and when the breast MR image is input to the feature extraction unit, the feature extraction unit obtains a slice feature map corresponding to each slice image, where each slice feature map reflects the semantic features of its corresponding slice image and the slice features are the output of the penultimate convolutional layer in the encoder. After the slice feature maps corresponding to all slice images are obtained, because they carry a large number of image features including redundant and useless ones, the obtained slice feature maps are screened by clustering to obtain a plurality of target slice features, and the candidate semantic features are then determined based on the target slice features.
In an implementation manner of this embodiment, the slice feature maps are divided into two feature clusters by clustering, and the feature cluster containing fewer slice feature maps is removed to obtain the plurality of target slice features. As shown in fig. 5, the specific clustering process may be: a number of clusters is preset, and that number of slice feature maps are randomly selected as the initial cluster centers; the distance from each slice feature map to each initial cluster center is calculated, and each slice feature map is assigned to the cluster whose center is the closest; the cluster centers are then updated, and it is checked whether the center of each cluster has changed; if any center has changed, the step of calculating the distance from each slice feature map to each cluster center is repeated until the center of each cluster no longer changes. In this embodiment, the number of clusters is 2.
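A minimal sketch of this screening step with scikit-learn's k-means; the tooling, the random stand-in slice features and the fusion by averaging are assumptions for illustration (the patent does not specify the fusion operation).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
slice_features = rng.normal(size=(30, 128))   # one 128-dim feature vector per slice (illustrative)

# Cluster the slice feature maps into 2 clusters and keep the larger cluster.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(slice_features)
majority = np.bincount(labels).argmax()
target_slice_features = slice_features[labels == majority]

# Fuse the target slice features, e.g. by averaging, to obtain candidate semantic features.
candidate_semantic_features = target_slice_features.mean(axis=0)
print(target_slice_features.shape, candidate_semantic_features.shape)
```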
The second feature screening unit screens the candidate semantic features by Lasso regression and mutual information, where the Lasso regression is the same as that used by the first feature screening unit and is not described again here. Mutual information measures the dependency between variables; the correlation between a candidate semantic feature and the preset HER2 categories can be revealed through mutual information, so the candidate semantic features can be screened accordingly. The mutual information may be calculated as:
I(X; Y) = Σ_{x∈X} Σ_{y∈Y} p(x, y) log( p(x, y) / (p(x) p(y)) )
wherein X and Y are two random variables, p(x, y) is their joint distribution, and p(x) and p(y) are their marginal distributions.
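A short scikit-learn sketch of screening features by their mutual information with the HER2 label; the tooling and data are illustrative assumptions.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 50))            # candidate semantic features after Lasso screening
y = rng.integers(0, 2, size=60)          # HER2 label (e.g. 1 = high expression, 0 = negative)

mi = mutual_info_classif(X, y, random_state=0)
keep = np.argsort(mi)[::-1][:10]         # keep the 10 features most related to the label
semantic_features = X[:, keep]
print(semantic_features.shape)
```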
In step S22, the input of the second classification model is the breast MR image, and its outputs include the predicted HER2 low expression probability and the predicted HER2 zero expression probability. The model structure of the second classification model is substantially the same as that of the first classification model, differing only in the feature screening units; the same parts are not described again, and only the differences are described. The first feature screening unit in the first classification model screens the candidate radiomics features by Lasso regression and the chi-square test, and the second feature screening unit in the first classification model screens the candidate semantic features by Lasso regression and mutual information, whereas the second feature screening unit in the second classification model screens the candidate semantic features by Lasso regression and the chi-square test.
In summary, the present embodiment provides a method for predicting the breast cancer HER2 status. The method comprises inputting a breast MR image to be predicted into a first classification model and a second classification model respectively, and controlling the first classification model and the second classification model to determine whether the HER2 category corresponding to the breast MR image is HER2 high expression, HER2 low expression or HER2 zero expression. The first classification model and the second classification model learn different aspects of the breast MR image carrying breast lesion features, so that the lesion information mined from these different aspects is correlated; the HER2 category is then determined as HER2 high expression, HER2 low expression or HER2 zero expression from the correlated lesion information, thereby improving the accuracy of identifying the HER2 category.
Based on the above method for predicting breast cancer HER2 status, the present embodiment provides a computer readable storage medium storing one or more programs, which are executable by one or more processors to implement the steps in the method for predicting breast cancer HER2 status as described in the above embodiments.
Based on the above method for predicting breast cancer HER2 status, the present application further provides a terminal device, as shown in fig. 6, including at least one processor (processor) 20; a display screen 21; and a memory (memory)22, and may further include a communication Interface (Communications Interface)23 and a bus 24. The processor 20, the display 21, the memory 22 and the communication interface 23 can communicate with each other through the bus 24. The display screen 21 is configured to display a user guidance interface preset in the initial setting mode. The communication interface 23 may transmit information. The processor 20 may call logic instructions in the memory 22 to perform the methods in the embodiments described above.
Furthermore, the logic instructions in the memory 22 may be implemented in software functional units and stored in a computer readable storage medium when sold or used as a stand-alone product.
The memory 22, which is a computer-readable storage medium, may be configured to store a software program, a computer-executable program, such as program instructions or modules corresponding to the methods in the embodiments of the present disclosure. The processor 20 executes the functional application and data processing, i.e. implements the method in the above-described embodiments, by executing the software program, instructions or modules stored in the memory 22.
The memory 22 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the memory 22 may include a high speed random access memory and may also include a non-volatile memory. For example, a variety of media that can store program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, may also be transient storage media.
In addition, the specific processes loaded and executed by the storage medium and by the instruction processors in the terminal device are described in detail in the method above and are not repeated here.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. A method of predicting breast cancer HER2 status, the method comprising:
respectively inputting a mammary gland MR image to be predicted into a first classification model and a second classification model;
controlling the first classification model and the second classification model to determine a HER2 category corresponding to the breast MR image, wherein the HER2 category comprises HER2 high expression, HER2 low expression or HER2 zero expression.
2. The method for predicting breast cancer HER2 status according to claim 1, wherein said controlling the first classification model and the second classification model to determine the HER2 classification corresponding to the breast MR image comprises:
controlling the first classification model to determine the high expression probability of HER2 and the negative expression probability of HER2 corresponding to the breast MR image;
controlling the second classification model to determine a corresponding predicted HER2 low expression probability and a predicted HER2 zero expression probability of the breast MR image;
determining the HER2 category corresponding to the breast MR image based on the HER2 high expression probability, the HER2 negative expression probability, the predicted HER2 low expression probability and the predicted HER2 zero expression probability.
3. The method for predicting the breast cancer HER2 status according to claim 2, wherein the determining the HER2 category corresponding to the breast MR image based on the HER2 negative expression probability, the predicted HER2 low expression probability and the predicted HER2 zero expression probability specifically comprises:
multiplying the HER2 negative expression probability by the predicted HER2 low expression probability, and the HER2 negative expression probability by the predicted HER2 zero expression probability, to obtain a HER2 low expression probability and a HER2 zero expression probability respectively;
determining the HER2 category corresponding to the breast MR image based on the high HER2 expression probability, the low HER2 expression probability and the zero HER2 expression probability.
4. The method of predicting breast cancer HER2 status of claim 2, wherein the first classification model comprises a radiomics module, a semantic segmentation module, and a prediction module; the controlling the first classification model to determine the HER2 high expression probability and the HER2 negative expression probability corresponding to the breast MR image specifically includes:
controlling the radiomics module to determine the radiomics features corresponding to the breast MR image, and determining a first HER2 high expression probability and a first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features;
controlling the semantic segmentation module to determine a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image;
controlling the prediction module to determine the HER2 high expression probability and the HER2 negative expression probability corresponding to the breast MR image based on the first HER2 high expression probability, the first HER2 negative expression probability, the second HER2 high expression probability and the second HER2 negative expression probability.
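Claim 4 leaves the fusion rule used by the prediction module unspecified. A minimal sketch, assuming a weighted average of the radiomics branch and the semantic-segmentation branch followed by renormalisation (the weight, the renormalisation and the function name are illustrative assumptions):

    def fuse_branch_probabilities(p1_high, p1_negative, p2_high, p2_negative, weight=0.5):
        """Fuse the radiomics branch (p1_*) and the semantic branch (p2_*) into the
        final HER2 high / negative expression probabilities."""
        p_high = weight * p1_high + (1.0 - weight) * p2_high
        p_negative = weight * p1_negative + (1.0 - weight) * p2_negative
        # Renormalise so the two probabilities sum to one.
        total = p_high + p_negative
        return p_high / total, p_negative / total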
5. The method for predicting breast cancer HER2 status according to claim 4, wherein the radiomics module comprises a lesion delineation unit, a radiomics feature extraction unit, a first feature screening unit and a first prediction unit; the controlling the radiomics module to determine the radiomics features corresponding to the breast MR image, and to determine the first HER2 high expression probability and the first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features specifically comprises:
controlling the lesion delineation unit to extract a region of interest from the breast MR image;
controlling the radiomics feature extraction unit to extract imaging features from the region of interest to obtain candidate radiomics features;
controlling the first feature screening unit to perform feature screening on the candidate radiomics features to obtain the radiomics features;
controlling the first prediction unit to determine the first HER2 high expression probability and the first HER2 negative expression probability corresponding to the breast MR image based on the radiomics features.
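A hypothetical implementation of the radiomics branch of claim 5, assuming pyradiomics for feature extraction from the delineated region of interest and a scikit-learn logistic regression as the first prediction unit; the libraries, the file paths and the classifier choice are assumptions not recited in the claim.

    import numpy as np
    from radiomics import featureextractor          # pyradiomics
    from sklearn.linear_model import LogisticRegression

    # Default extractor: shape, first-order and texture features computed inside the ROI mask.
    extractor = featureextractor.RadiomicsFeatureExtractor()

    def radiomics_vector(image_path, mask_path):
        """Extract candidate radiomics features from the delineated region of interest."""
        result = extractor.execute(image_path, mask_path)
        # Drop the diagnostic metadata entries and keep only the numeric feature values.
        return np.array([v for k, v in result.items()
                         if not k.startswith("diagnostics_")], dtype=float)

    # Training (sketch): X holds one screened radiomics vector per case, y is 1 for
    # HER2 high expression and 0 for HER2 negative expression.
    # clf = LogisticRegression(max_iter=1000).fit(X, y)
    # p_negative, p_high = clf.predict_proba(radiomics_vector(img, msk).reshape(1, -1))[0]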
6. The method of claim 4, wherein the semantic segmentation module comprises a feature extraction unit, a second feature screening unit and a second prediction unit; the controlling the semantic segmentation module to determine the second HER2 high expression probability and the second HER2 negative expression probability corresponding to the breast MR image specifically includes:
controlling the feature extraction unit to determine candidate semantic features corresponding to the breast MR image;
controlling the second feature screening unit to screen the candidate semantic features to obtain semantic features corresponding to the breast MR image;
controlling the second prediction unit to determine a second HER2 high expression probability and a second HER2 negative expression probability corresponding to the breast MR image based on the semantic features.
7. The method for predicting breast cancer HER2 status according to claim 6, wherein the controlling the feature extraction unit to determine the candidate semantic features corresponding to the breast MR image specifically comprises:
controlling the feature extraction unit to determine a plurality of slice feature maps based on the breast MR image, wherein the plurality of slice feature maps correspond one-to-one to a plurality of slice images included in the breast MR image;
dividing the plurality of slice feature maps into two feature clusters by clustering, and removing the feature cluster containing fewer slice feature maps from the two feature clusters to obtain a plurality of target slice features;
and fusing the plurality of target slice features to obtain the candidate semantic features corresponding to the breast MR image.
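A sketch of the slice-feature clustering and fusion of claim 7, assuming k-means with two clusters (scikit-learn) and mean pooling as the fusion rule; the clustering algorithm and the fusion operator are illustrative assumptions, since the claim only requires splitting into two clusters and discarding the smaller one.

    import numpy as np
    from sklearn.cluster import KMeans

    def fuse_slice_features(slice_features):
        """slice_features: array of shape (n_slices, n_features), one row per slice feature map."""
        # Split the slice feature maps into two clusters.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(slice_features)
        # Keep the cluster containing more slice feature maps, discard the smaller one.
        keep = 0 if np.sum(labels == 0) >= np.sum(labels == 1) else 1
        target = slice_features[labels == keep]
        # Fusion rule is not fixed by the claim; averaging over the target slices is assumed.
        return target.mean(axis=0)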
8. The method for predicting breast cancer HER2 status according to claim 4, wherein the model structure of the second classification model is the same as that of the first classification model, and the two models differ in feature screening: in the first classification model, the first feature screening unit screens the candidate radiomics features using Lasso regression and the chi-square test, and the second feature screening unit screens the candidate semantic features using Lasso regression and mutual information; in the second classification model, the second feature screening unit screens the candidate semantic features using Lasso regression and the chi-square test.
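The screening combinations of claim 8 can be sketched with scikit-learn, assuming a univariate filter (chi-square test or mutual information) followed by Lasso regression in which features with non-zero coefficients are retained; the number of filtered features, the Lasso penalty and the ordering of the two stages are assumptions not fixed by the claim.

    import numpy as np
    from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif
    from sklearn.linear_model import Lasso
    from sklearn.preprocessing import MinMaxScaler

    def screen_features(X, y, score_func=chi2, k=20, alpha=0.01):
        """Two-stage screening: a univariate filter (chi-square or mutual information)
        followed by Lasso regression; features with non-zero coefficients are kept."""
        # The chi-square test requires non-negative inputs, so rescale to [0, 1] first.
        X_scaled = MinMaxScaler().fit_transform(X)
        selector = SelectKBest(score_func=score_func, k=min(k, X.shape[1]))
        X_filtered = selector.fit_transform(X_scaled, y)
        lasso = Lasso(alpha=alpha).fit(X_filtered, y)
        kept = np.flatnonzero(lasso.coef_)
        # Map the retained columns back to indices in the original feature matrix.
        return selector.get_support(indices=True)[kept]

    # First classification model: radiomics features with chi2, semantic features with
    # mutual_info_classif; second classification model: semantic features with chi2.
    # selected = screen_features(X_semantic, y, score_func=mutual_info_classif)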
9. A computer-readable storage medium storing one or more programs which are executable by one or more processors to implement the steps of the method for predicting breast cancer HER2 status according to any one of claims 1-8.
10. A terminal device, comprising: a processor, a memory, and a communication bus; the memory has stored thereon a computer readable program executable by the processor;
the communication bus enables communication between the processor and the memory;
the processor, when executing the computer readable program, carries out the steps in a method of predicting breast cancer HER2 status as claimed in any one of claims 1-8.
CN202111338891.8A 2021-11-12 2021-11-12 Breast cancer HER2 state prediction method and related equipment Active CN114171197B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111338891.8A CN114171197B (en) 2021-11-12 2021-11-12 Breast cancer HER2 state prediction method and related equipment

Publications (2)

Publication Number Publication Date
CN114171197A (en) 2022-03-11
CN114171197B (en) 2022-10-04

Family

ID=80479183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111338891.8A Active CN114171197B (en) 2021-11-12 2021-11-12 Breast cancer HER2 state prediction method and related equipment

Country Status (1)

Country Link
CN (1) CN114171197B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115147668A (en) * 2022-09-06 2022-10-04 北京鹰瞳科技发展股份有限公司 Training method of disease classification model, disease classification method and related products
CN115440383A (en) * 2022-09-30 2022-12-06 中国医学科学院北京协和医院 System for predicting curative effect of PD-1/PD-L1 monoclonal antibody of advanced cancer patient
WO2024021037A1 (en) * 2022-07-29 2024-02-01 京东方科技集团股份有限公司 Disease analysis method and apparatus, and disease analysis model training method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190259157A1 (en) * 2018-02-21 2019-08-22 Case Western Reserve University Predicting neo-adjuvant chemotherapy response from pre-treatment breast magnetic resonance imaging using artificial intelligence and her2 status
CN108898160A (en) * 2018-06-01 2018-11-27 中国人民解放军战略支援部队信息工程大学 Breast cancer tissue's Pathologic Grading method based on CNN and image group Fusion Features
CN111401214A (en) * 2020-03-12 2020-07-10 四川大学华西医院 Multi-resolution integrated HER2 interpretation method based on deep learning
CN111583320A (en) * 2020-03-17 2020-08-25 哈尔滨医科大学 Breast cancer ultrasonic image typing method and system fusing deep convolutional network and image omics characteristics and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU Tongtong et al., "Feasibility analysis of predicting estrogen receptor expression status in breast cancer based on radiomics", Journal of Biomedical Engineering *


Also Published As

Publication number Publication date
CN114171197B (en) 2022-10-04

Similar Documents

Publication Publication Date Title
CN114171197B (en) Breast cancer HER2 state prediction method and related equipment
CN106815481B (en) Lifetime prediction method and device based on image omics
ES2680678T3 (en) Detection of the edges of a core using image analysis
CN112768072B (en) Cancer clinical index evaluation system constructed based on imaging omics qualitative algorithm
JP2022527145A (en) Multiple instance Lana for predictive organizational pattern identification
Zhang et al. Automated semantic segmentation of red blood cells for sickle cell disease
CN113574534A (en) Machine learning using distance-based similarity labels
US20220058839A1 (en) Translation of images of stained biological material
Ström et al. Pathologist-level grading of prostate biopsies with artificial intelligence
CN105579847A (en) Disease analysis device, control method, and program
EP3811281A1 (en) A method to determine a degree of abnormality, a respective computer readable medium and a distributed cancer analysis system
CN112561869B (en) Pancreatic neuroendocrine tumor postoperative recurrence risk prediction method
US20230306598A1 (en) Systems and methods for mesothelioma feature detection and enhanced prognosis or response to treatment
CN109191422B (en) System and method for detecting early ischemic stroke based on conventional CT image
Wetteland et al. Automatic diagnostic tool for predicting cancer grade in bladder cancer patients using deep learning
CN115457361A (en) Classification model obtaining method, expression class determining method, apparatus, device and medium
CN115938597A (en) cancer prognosis
CN113764101B (en) Novel auxiliary chemotherapy multi-mode ultrasonic diagnosis system for breast cancer based on CNN
CN114445356A (en) Multi-resolution-based full-field pathological section image tumor rapid positioning method
US20230411014A1 (en) Apparatus and method for training of machine learning models using annotated image data for pathology imaging
CN116504406A (en) Method and system for constructing lung cancer postoperative risk model based on image combination pathology
US20230169662A1 (en) System and method for generating a morphological atlas of an embryo
Sertel et al. Computer-aided prognosis of neuroblastoma: classification of stromal development on whole-slide images
Mansour et al. Kidney segmentations using cnn models
CN115063351A (en) Deep learning-based fetal MRI brain tissue segmentation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant