CN115761360A - Tumor gene mutation classification method and device, electronic equipment and storage medium - Google Patents

Tumor gene mutation classification method and device, electronic equipment and storage medium

Info

Publication number
CN115761360A
Authority
CN
China
Prior art keywords
target
image
gene mutation
tumor
positron emission
Prior art date
Legal status
Pending
Application number
CN202211485858.2A
Other languages
Chinese (zh)
Inventor
胡战利 (Hu Zhanli)
肖正辉 (Xiao Zhenghui)
梁栋 (Liang Dong)
郑海荣 (Zheng Hairong)
杨永峰 (Yang Yongfeng)
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN202211485858.2A
Publication of CN115761360A
Priority to PCT/CN2023/133473 (published as WO2024109859A1)
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/0464: Convolutional networks [CNN, ConvNet]
    • G06N3/048: Activation functions
    • G06N3/08: Learning methods
    • G06N3/082: Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/26: Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764: Classification, e.g. of video objects
    • G06V10/77: Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; blind source separation
    • G06V10/80: Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/82: Using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine (AREA)

Abstract

The embodiment of the invention discloses a tumor gene mutation classification method, a tumor gene mutation classification device, electronic equipment and a storage medium. The method comprises the following steps: acquiring a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model; fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image; and inputting the target fusion image into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model. The technical scheme of the embodiment of the invention can improve the accuracy of tumor gene mutation classification.

Description

Tumor gene mutation classification method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to a tumor gene mutation classification method and device, electronic equipment and a storage medium.
Background
Classification of tumor gene mutations is crucial for the determination of targeted therapeutic regimens.
However, existing tumor gene mutation classification methods suffer from low accuracy, and this problem needs to be solved.
Disclosure of Invention
The embodiment of the invention provides a tumor gene mutation classification method, a tumor gene mutation classification device, electronic equipment and a storage medium, so as to improve the accuracy of tumor gene mutation classification.
According to an aspect of the present invention, there is provided a method for classifying a tumor gene mutation, which may include:
acquiring a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model;
fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image;
and inputting the target fusion image into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
According to another aspect of the present invention, there is provided a tumor gene mutation classification apparatus, which may include:
the tumor gene mutation classification model acquisition module is used for acquiring a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model;
the target fusion image fusion module is used for fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image;
and the gene mutation classification result determining module is used for inputting the target fusion image into the tumor gene mutation classification model and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
According to another aspect of the present invention, there is provided an electronic device, which may include:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, the computer program, when executed by the at least one processor, causing the at least one processor to implement the tumor gene mutation classification method provided by any of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium having stored thereon computer instructions for causing a processor to execute the method for classifying a tumor gene mutation provided in any of the embodiments of the present invention.
According to the technical scheme of the embodiment of the invention, a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image are obtained, and a trained tumor gene mutation classification model is obtained; the target positron emission tomography image and the target electron computed tomography image are fused into a target fusion image; and the target fusion image is input into the tumor gene mutation classification model, and the gene mutation classification result of the target tumor is determined according to the output result of the tumor gene mutation classification model. By fusing the target positron emission tomography image and the target electron computed tomography image, the resulting target fusion image carries more feature information, which improves the accuracy of tumor gene mutation classification.
It should be understood that the statements in this section do not necessarily identify key or critical features of any embodiment of the present invention, nor do they necessarily limit the scope of the present invention. Other features of the present invention will become apparent from the following description.
Drawings
FIG. 1 is a flowchart of a method for classifying a tumor gene mutation according to one embodiment of the present invention;
FIG. 2 is a flowchart of a method for classifying a tumor gene mutation according to the second embodiment of the present invention;
FIG. 3 is a flowchart of a method for classifying mutations in tumor genes according to a third embodiment of the present invention;
FIG. 4 is a flowchart of a method for classifying a tumor gene mutation provided in the fourth embodiment of the present invention;
FIG. 5 is a structural diagram of a classification model of tumor gene mutations in a classification method of tumor gene mutations provided in the fourth embodiment of the present invention;
FIG. 6 is a structural diagram of an inverted linear bottleneck layer in a classification method of tumor gene mutations provided in the fourth embodiment of the present invention;
FIG. 7 is a structural diagram of a fused inverted residual layer in the classification method of tumor gene mutation provided in the fourth embodiment of the present invention;
FIG. 8 is a structural diagram of a compression and excitation network in a classification method of a tumor gene mutation provided in the fourth embodiment of the present invention;
FIG. 9 is a lung nodule attention heat map in a method of classifying tumor gene mutations provided in the fourth embodiment of the present invention;
FIG. 10 is a graph showing the calculation results of the ten-fold cross-validation accuracy of the classification method of tumor gene mutations provided in the fourth embodiment of the present invention;
FIG. 11 is a confusion matrix chart of the validation results of the classification method of tumor gene mutation provided in the fourth embodiment of the present invention;
FIG. 12 is a diagram showing the procedure of a preprocessing operation in an alternative example of the classification method of tumor gene mutation provided in the fourth embodiment of the present invention;
FIG. 13 is a block diagram showing the structure of a tumor gene mutation classification device according to the fifth embodiment of the present invention;
FIG. 14 is a schematic structural diagram of an electronic device for implementing the method for classifying a tumor gene mutation according to the embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Terms such as "target" and "original" are used similarly and will not be described in detail herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
FIG. 1 is a flowchart of a method for classifying tumor gene mutations according to an embodiment of the present invention. This example is applicable to the case of classifying tumor gene mutations. The method may be performed by the tumor gene mutation classification apparatus provided in the embodiment of the present invention, which may be implemented by software and/or hardware, and the apparatus may be integrated on an electronic device, which may be a variety of user terminals or servers.
Referring to fig. 1, the method of the embodiment of the present invention specifically includes the following steps:
and S110, acquiring a target positron emission tomography image containing the target tumor and a target electron computer tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model.
The target tumor may be a tumor requiring gene mutation classification, such as a brain tumor or a lung tumor. The target positron emission tomography image may be understood as a positron emission tomography (PET) image containing the target tumor; it may be a two-dimensional image or a three-dimensional image. The target electron computed tomography image may be understood as a computed tomography (CT) image that corresponds to the target positron emission tomography image and contains the target tumor; it may likewise be a two-dimensional image or a three-dimensional image. The tumor gene mutation classification model may be understood as a model capable of classifying tumor gene mutations; the predicted gene mutation classification may be, for example, a mutation classification of the epidermal growth factor receptor (EGFR) gene in non-small cell lung cancer (NSCLC).
In the embodiment of the invention, considering that tumors with and without gene mutations may vary widely in morphology, texture, visual appearance and the like, whether the target tumor carries a gene mutation can be determined by acquiring the target positron emission tomography image containing the target tumor and the target electron computed tomography image corresponding to it, and by acquiring the trained tumor gene mutation classification model.
And S120, fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image.
The target fusion image can be understood as an image obtained by fusing a target positron emission tomography image and a target electron computed tomography image.
In the embodiment of the present invention, the target positron emission tomography image can provide detailed molecular information about the target tumor, such as its function and metabolism, while the target electron computed tomography image can provide accurate anatomical localization of the target tumor. The two images may therefore be fused; the fusion may be, for example, an addition operation, so that the resulting target fusion image carries the feature information of both the target positron emission tomography image and the target electron computed tomography image.
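A minimal NumPy sketch of the addition-style fusion described above (not taken from the patent text; the weighted-sum form, array shapes, and function name are illustrative assumptions, and the inputs are assumed to be registered and intensity-normalized):

```python
import numpy as np

def fuse_pet_ct(pet: np.ndarray, ct: np.ndarray, pet_weight: float = 0.5) -> np.ndarray:
    """Fuse a PET image with its corresponding CT image by weighted addition.

    Both inputs are assumed to already be registered, cropped to the same
    tumor region, and rescaled to a common intensity range.
    """
    if pet.shape != ct.shape:
        raise ValueError("PET and CT images must have the same shape")
    # Element-wise weighted addition: one simple realization of the
    # "addition operation" fusion mentioned in the text.
    return pet_weight * pet + (1.0 - pet_weight) * ct

pet = np.random.rand(64, 64).astype(np.float32)
ct = np.random.rand(64, 64).astype(np.float32)
fused = fuse_pet_ct(pet, ct)
print(fused.shape)  # (64, 64)
```

A plain sum (`pet + ct`) is the simplest case; the weight merely lets the PET and CT contributions be balanced.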
S130, inputting the target fusion image into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
Wherein, the output result may be understood as the output data of the tumor gene mutation classification model; the output result may be a probability value for each tumor gene mutation class, or a numerical value corresponding to a particular tumor gene mutation class. The gene mutation classification result may be understood as the result of classifying the gene mutation status of the target tumor; it may be, for example, gene mutation or non-gene mutation.
In the embodiment of the present invention, the target fusion image, which carries more feature information, may be input into the tumor gene mutation classification model, and the gene mutation classification result of the target tumor is determined according to the output result of the model. For example, if the output result is a probability value for each tumor gene mutation class, the class with the highest probability value is taken as the gene mutation classification result of the target tumor.
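The highest-probability rule just described can be sketched as follows (the class names are hypothetical stand-ins for the model's output classes, e.g. non-mutated versus mutated EGFR):

```python
import numpy as np

def classify_from_probabilities(probs, class_names=("wild-type", "mutant")):
    """Pick the class with the highest predicted probability.

    `probs` stands in for the output of the tumor gene mutation
    classification model; the class names here are illustrative.
    """
    idx = int(np.argmax(probs))
    return class_names[idx]

print(classify_from_probabilities([0.23, 0.77]))  # mutant
```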
According to the technical scheme of the embodiment of the invention, a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image are obtained, and a trained tumor gene mutation classification model is obtained; the target positron emission tomography image and the target electron computed tomography image are fused into a target fusion image; and the target fusion image is input into the tumor gene mutation classification model, and the gene mutation classification result of the target tumor is determined according to the output result of the tumor gene mutation classification model. By fusing the target positron emission tomography image and the target electron computed tomography image, the resulting target fusion image carries more feature information, which improves the accuracy of tumor gene mutation classification.
An optional technical solution is that, after acquiring the target positron emission tomography image containing the target tumor and the target electron computed tomography image corresponding to it, the tumor gene mutation classification method further includes: segmenting a first target tumor region of a preset size from the target positron emission tomography image according to the position of the target tumor in that image, and taking the segmented first target tumor region as the updated target positron emission tomography image; and cutting out a second target tumor region of the preset size from the target electron computed tomography image according to the position of the target tumor in that image, and taking the cut-out second target tumor region as the updated target electron computed tomography image.
The preset size may be understood as a preset size of the first target tumor region or the second target tumor region, and the preset size may be a preset fixed value, for example, a size of 64 × 32; the preset size may also be determined according to the size of the target tumor.
In the embodiment of the present invention, a first target tumor region of the preset size may be segmented, according to the position of the target tumor in the target positron emission tomography image, from the region around that position. The first target tumor region is the region occupied by the target tumor in the target positron emission tomography image, and the target tumor may be located at any position within the first target tumor region, including its center. The segmented first target tumor region is then taken as the updated target positron emission tomography image.
Correspondingly, according to the position of the target tumor in the target electron computed tomography image, a second target tumor region of the preset size is cut out from the region around that position. The second target tumor region is the region occupied by the target tumor in the target electron computed tomography image, and the target tumor may be located at any position within the second target tumor region, including its center. The cut-out second target tumor region is then taken as the updated target electron computed tomography image.
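A hedged NumPy sketch of the fixed-size cropping described above (the 2-D case, the border clamping, and the function name are illustrative assumptions; the patent also allows 3-D images):

```python
import numpy as np

def crop_tumor_region(image: np.ndarray, center, size) -> np.ndarray:
    """Crop a fixed-size region around the tumor position.

    `center` is the (row, col) tumor position; `size` is the preset
    (height, width) of the region. The window is clamped to the image
    borders, a simplification of the segmentation step in the text.
    """
    half_h, half_w = size[0] // 2, size[1] // 2
    top = min(max(center[0] - half_h, 0), image.shape[0] - size[0])
    left = min(max(center[1] - half_w, 0), image.shape[1] - size[1])
    return image[top:top + size[0], left:left + size[1]]

scan = np.arange(128 * 128).reshape(128, 128)
roi = crop_tumor_region(scan, center=(40, 90), size=(64, 32))
print(roi.shape)  # (64, 32)
```

Because of the clamping, a tumor near the image border still yields a full-size region, with the tumor off-center rather than centered.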
In embodiments of the present invention, the first target tumor region and the second target tumor region may have the same size or different sizes. To make it easier to fuse the target positron emission tomography image and the target electron computed tomography image into the target fusion image, the two regions may be given the same size. However, because the two modalities differ in image attributes, the target tumor may appear at different sizes in the acquired positron emission tomography image and electron computed tomography image. In that case, segmentation may be performed according to the tumor size in each image, yielding first and second target tumor regions of different sizes, and the first or second target tumor region may then be enlarged or reduced accordingly so that the target tumor appears at the same size in both regions, which improves how well the target tumor is aligned in the subsequent target fusion image.
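The enlarge/reduce step could be realized with a simple nearest-neighbour resize such as the following (an illustrative sketch only; a practical implementation would more likely use a library interpolator, and the function name is an assumption):

```python
import numpy as np

def resize_nearest(region: np.ndarray, out_shape) -> np.ndarray:
    """Nearest-neighbour resize of a cropped tumor region.

    Maps each output pixel back to its nearest source pixel, so the
    region can be enlarged or reduced to match the other modality.
    """
    rows = np.arange(out_shape[0]) * region.shape[0] // out_shape[0]
    cols = np.arange(out_shape[1]) * region.shape[1] // out_shape[1]
    return region[np.ix_(rows, cols)]

small = np.arange(16).reshape(4, 4)
big = resize_nearest(small, (8, 8))
print(big.shape)  # (8, 8)
```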
In the embodiment of the invention, segmenting the target positron emission tomography image and the target electron computed tomography image removes a large amount of image data irrelevant to tumor gene mutation classification, which increases the attention paid to the target tumor region and further improves the accuracy of tumor gene mutation classification.
Example two
FIG. 2 is a flowchart of another method for classifying tumor gene mutations provided in example two of the present invention. The present embodiment is optimized on the basis of the above technical solutions. In this embodiment, optionally, the target positron emission tomography image is a three-dimensional image, and the target electron computed tomography image is a three-dimensional image. Fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image may include: slicing the target positron emission tomography image to obtain at least two positron emission tomography slice images, and slicing the target electron computed tomography image to obtain at least two electron computed tomography slice images, wherein each positron emission tomography slice image is a two-dimensional image and each electron computed tomography slice image is a two-dimensional image; and fusing the at least two positron emission tomography slice images with the at least two electron computed tomography slice images to obtain the target fusion image. Terms that are the same as or correspond to those in the above embodiments are not explained in detail herein.
Referring to fig. 2, the method of the present embodiment may specifically include the following steps:
S210, acquiring a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model, wherein the target positron emission tomography image is a three-dimensional image, and the target electron computed tomography image is a three-dimensional image.
S220, slicing the positron emission tomography image of the target to obtain at least two positron emission tomography slice images, and slicing the electron computed tomography image of the target to obtain at least two electron computed tomography slice images, wherein the positron emission tomography slice image is a two-dimensional image, and the electron computed tomography slice image is a two-dimensional image.
It can be understood that, since both the target positron emission tomography image and the target electron computed tomography image are three-dimensional images, the two could be fused directly. However, directly fusing three-dimensional images is difficult. To address this, the target positron emission tomography image may be sliced transversely to obtain at least two positron emission tomography slice images, each of which is a two-dimensional image obtained by slicing the target positron emission tomography image; and the target electron computed tomography image may be sliced to obtain at least two electron computed tomography slice images, each of which is a two-dimensional image obtained by slicing the target electron computed tomography image. The subsequent fusion is then carried out on two-dimensional images, which reduces the fusion difficulty.
In the embodiment of the present invention, the number of positron emission tomography slice images obtained by slicing the target positron emission tomography image may be a preset number, for example, 32. It may also be determined according to the size of the target tumor. It may further be determined according to the required accuracy of tumor gene mutation classification: if the accuracy requirement is high, more slice images are produced; if the accuracy requirement is low, fewer slice images are produced. It may likewise be determined according to the required classification speed: if a lower speed is acceptable, more slice images are produced; if a higher speed is required, fewer slice images are produced. Accordingly, the number of electron computed tomography slice images obtained by slicing the target electron computed tomography image may also be determined by the above factors, which is not described in detail herein.
It should be noted that the slices in the embodiments of the present invention may be transverse slices or longitudinal slices. Both may also be combined: the target positron emission tomography image and the target electron computed tomography image are each sliced transversely to obtain at least two transverse positron emission tomography slice images and at least two transverse electron computed tomography slice images, and each sliced longitudinally to obtain at least two longitudinal positron emission tomography slice images and at least two longitudinal electron computed tomography slice images; the transverse and longitudinal positron emission tomography slice images are then all used as positron emission tomography slice images, and the transverse and longitudinal electron computed tomography slice images are all used as electron computed tomography slice images.
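The slicing of a three-dimensional volume into two-dimensional slice images can be sketched as follows (the axis convention, the volume shape, and the function name are assumptions for illustration):

```python
import numpy as np

def slice_volume(volume: np.ndarray, axis: int = 0):
    """Split a 3-D tomography volume into a list of 2-D slice images.

    axis=0 plays the role of transverse slicing in this sketch; other
    axes would give the longitudinal slices discussed above.
    """
    return [np.take(volume, i, axis=axis) for i in range(volume.shape[axis])]

pet_volume = np.zeros((32, 64, 64), dtype=np.float32)  # stand-in PET volume
slices = slice_volume(pet_volume)
print(len(slices), slices[0].shape)  # 32 (64, 64)
```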
And S230, fusing the at least two positron emission tomography slice images with the at least two electron computer tomography slice images to obtain a target fusion image.
In the embodiment of the present invention, when the number of positron emission tomography slice images is the same as the number of electron computed tomography slice images, each of the at least two positron emission tomography slice images may be fused with its corresponding electron computed tomography slice image to obtain the target fusion image. Alternatively, each positron emission tomography slice image may be fused with a random one of the at least two electron computed tomography slice images; or each positron emission tomography slice image may be fused with at least two of the electron computed tomography slice images; or at least two of the positron emission tomography slice images may be fused with at least two of the electron computed tomography slice images to obtain the target fusion image.
In the embodiment of the invention, at least one positron emission tomography slice image in at least two positron emission tomography slice images can be fused with a target electron computed tomography image to obtain a target fusion image; or fusing the target positron emission tomography image with at least one of the at least two electron computed tomography slice images to obtain a target fusion image.
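Illustratively, the one-to-one slice-then-fuse strategy described above might be sketched as follows. This is a minimal illustration only: the function and variable names are hypothetical, and the addition-based fusion follows the later remark that the fusion may be, for example, an addition operation.

```python
import numpy as np

def slice_and_fuse(pet_volume, ct_volume, axis=0):
    # Slice two co-registered 3-D volumes along one axis into 2-D images and
    # fuse each PET slice with the CT slice at the same index by addition.
    assert pet_volume.shape == ct_volume.shape
    n = pet_volume.shape[axis]
    pet_slices = [np.take(pet_volume, i, axis=axis) for i in range(n)]
    ct_slices = [np.take(ct_volume, i, axis=axis) for i in range(n)]
    return [p + c for p, c in zip(pet_slices, ct_slices)]

pet = np.zeros((4, 64, 64))  # hypothetical 3-D target PET volume
ct = np.ones((4, 64, 64))    # hypothetical co-registered 3-D target CT volume
fused = slice_and_fuse(pet, ct)
```

Each element of `fused` is then a two-dimensional target fusion image carrying information from both modalities.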
S240, inputting the target fusion image into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
According to the technical scheme of the embodiment of the invention, under the condition that a target positron emission tomography image is a three-dimensional image and a target electron computer tomography image is a three-dimensional image, the target positron emission tomography image is sliced to obtain at least two positron emission tomography slice images, and the target electron computer tomography image is sliced to obtain at least two electron computer tomography slice images, wherein the positron emission tomography slice image is a two-dimensional image, and the electron computer tomography slice image is a two-dimensional image; and fusing the at least two positron emission tomography slice images with the at least two electron computer tomography slice images to obtain a target fusion image. In the embodiment of the invention, the target positron emission tomography image and the target electron computer tomography image can be sliced and then fused, so that the fusion difficulty can be reduced.
On the basis of any of the above technical solutions, in an optional technical solution, before fusing the target positron emission tomography image and the target electron computed tomography image into the target fusion image, the method for classifying tumor gene mutations further includes: carrying out standardization processing on data in the target positron emission tomography image, and updating the target positron emission tomography image based on the obtained first standardization processing result; normalizing the data in the target electron computed tomography image, and updating the target electron computed tomography image based on the obtained second normalization processing result; and the data of the target positron emission tomography image and the data of the target electron computer tomography image both accord with standard normal distribution.
The first normalization processing result may be understood as the result of normalizing the data in the target positron emission tomography image. The second normalization processing result may be understood as the result of normalizing the data in the target electron computed tomography image.
It can be understood that, because the target positron emission tomography image and the target electron computed tomography image differ in their imaging attributes, the data in the two images may differ greatly in magnitude, which degrades the fusion effect. To address this, in an embodiment of the present invention, the data in the target positron emission tomography image may be normalized and the image updated based on the obtained first normalization processing result; likewise, the data in the target electron computed tomography image may be normalized and the image updated based on the obtained second normalization processing result. The updated target positron emission tomography image and target electron computed tomography image both conform to the standard normal distribution, so that data of different magnitudes are unified to the same scale and remain comparable; this improves the fusibility between the two images and yields a better fusion effect.
Illustratively, the normalization process may be a Z-Score (zero-mean) normalization, so that the data in the updated target positron emission tomography image and the updated target electron computed tomography image are uniformly measured by the computed Z-Score values: the data in each updated image has a mean of 0 and a variance of 1, conforms to the standard normal distribution, and is dimensionless, which further improves the fusibility between the target positron emission tomography image and the target electron computed tomography image.
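A minimal sketch of such a Z-Score normalization (function names and example data are illustrative, not part of the patent):

```python
import numpy as np

def zscore(volume):
    # Z-Score (zero-mean) normalisation: subtract the mean and divide by the
    # standard deviation, so the result has mean 0, variance 1, and no units.
    return (volume - volume.mean()) / volume.std()

# Hypothetical PET and CT volumes on very different numeric scales.
rng = np.random.default_rng(0)
pet = rng.random((8, 32, 32)) * 1e4            # PET activity values
ct = rng.random((8, 32, 32)) * 3000.0 - 1000.0  # CT values in a different range
pet_n, ct_n = zscore(pet), zscore(ct)
```

After this step both volumes lie on the same scale and can be fused directly.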
EXAMPLE III
FIG. 3 is a flowchart of another method for classifying tumor gene mutations provided in the third embodiment of the present invention. The present embodiment is optimized based on the above technical solutions. In this embodiment, optionally, the number of positron emission tomography slice images is the same as the number of electron computer tomography slice images; fusing at least two positron emission tomography slice images with at least two electron computer tomography slice images to obtain a target fusion image, comprising: for each positron emission tomography slice image in at least two positron emission tomography slice images, fusing the positron emission tomography slice image with an electronic computer tomography slice image corresponding to the positron emission tomography slice image in at least two electronic computer tomography slice images to obtain a target fusion image; inputting the target fusion image into a tumor gene mutation classification model, and determining a gene mutation classification result of the target tumor according to an output result of the tumor gene mutation classification model, wherein the method comprises the following steps: and inputting the obtained at least two target fusion images into the tumor gene mutation classification model, and determining a gene mutation classification result of the target tumor according to an output result of the tumor gene mutation classification model. The same or corresponding terms as those in the above embodiments are not explained in detail herein.
Referring to fig. 3, the method of this embodiment may specifically include the following steps:
S310, acquiring a target positron emission tomography image containing a target tumor and a target electron computer tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model, wherein the target positron emission tomography image is a three-dimensional image, and the target electron computer tomography image is a three-dimensional image.
S320, slicing the positron emission tomography images of the target to obtain at least two positron emission tomography slice images, and slicing the electron computed tomography images of the target to obtain at least two electron computed tomography slice images, wherein the positron emission tomography slice images are two-dimensional images, the electron computed tomography slice images are two-dimensional images, and the number of the positron emission tomography slice images is the same as that of the electron computed tomography slice images.
In the embodiment of the invention, in order to ensure better feature representation after fusion, the number of positron emission tomography slice images may be made the same as the number of electron computed tomography slice images, so that each positron emission tomography slice image can subsequently be fused with the electron computed tomography slice image aligned with it.
S330, aiming at each positron emission tomography slice image in at least two positron emission tomography slice images, fusing the positron emission tomography slice image with an electronic computer tomography slice image corresponding to the positron emission tomography slice image in at least two electronic computer tomography slice images to obtain a target fusion image.
It can be understood that, because a positron emission tomography slice image and its corresponding electron computed tomography slice image contain mutually corresponding features, fusing the two makes the features in the target fusion image more expressive and accurate. Therefore, in the embodiment of the present invention, for each of the at least two positron emission tomography slice images, that slice image can be fused with its corresponding electron computed tomography slice image to obtain a target fusion image, so that the obtained target fusion image has better feature representation and avoids the feature mixing that would otherwise affect the accuracy of gene mutation classification of the target tumor.
S340, inputting the at least two obtained target fusion images into a tumor gene mutation classification model, and determining a gene mutation classification result of the target tumor according to an output result of the tumor gene mutation classification model.
It can be understood that, since the positron emission tomography slice images and the corresponding electron computer tomography slice images thereof can be fused in a one-to-one correspondence manner, and the number of the positron emission tomography slice images and the electron computer tomography slice images is at least two, the number of the obtained target fusion images is also at least two. Therefore, the obtained at least two target fusion images can be input into the tumor gene mutation classification model, and the gene mutation classification result of the target tumor can be determined according to the output result of the tumor gene mutation classification model. In practical application, optionally, at least two target fusion images can be input into the tumor gene mutation classification model as a whole, so that the tumor gene mutation classification model processes the whole to obtain an output result corresponding to the whole; each target fusion image in the at least two target fusion images can be respectively input into the tumor gene mutation classification model, so that the tumor gene mutation classification model respectively processes each target fusion image to obtain output results respectively corresponding to each target fusion image; etc., and are not specifically limited herein.
According to the technical scheme of the embodiment of the invention, the number of the positron emission tomography slice images is the same as that of the electron computer tomography slice images; for each positron emission tomography slice image in at least two positron emission tomography slice images, fusing the positron emission tomography slice image with an electron computer tomography slice image corresponding to the positron emission tomography slice image in at least two electron computer tomography slice images to obtain a target fusion image; and inputting the obtained at least two target fusion images into a tumor gene mutation classification model, and determining a gene mutation classification result of the target tumor according to an output result of the tumor gene mutation classification model. In the embodiment of the invention, the target positron emission tomography image and the target electron computer tomography image are correspondingly fused after being sliced, so that the fusion characteristic expressiveness can be ensured under the condition of low fusion difficulty, and the tumor gene mutation classification accuracy is further improved.
An optional technical solution is that at least two obtained target fusion images are input into a tumor gene mutation classification model, and a gene mutation classification result of a target tumor is determined according to an output result of the tumor gene mutation classification model, including: inputting the target fusion image into a tumor gene mutation classification model aiming at each target fusion image in the at least two obtained target fusion images, and obtaining an initial mutation classification result corresponding to the target fusion image according to the output result of the tumor gene mutation classification model, wherein the initial mutation classification result comprises gene mutation or non-gene mutation; determining a first number of target fusion images corresponding to gene mutation and a second number of target fusion images corresponding to non-gene mutation in at least two target fusion images, and obtaining a gene mutation classification result of the target tumor according to the first number and the second number.
Wherein, the initial mutation classification result can be understood as a classification result corresponding to each target fusion image. The initial mutation classification result may include genetic mutations or non-genetic mutations. The gene mutation can be understood as a classification result of the existence of the gene mutation in the target tumor. Non-genetic mutations are understood as the result of classification of the absence of genetic mutations in the target tumor.
It is understood that, since the obtained at least two target fusion images can be input into the tumor gene mutation classification model, each of the at least two target fusion images has a corresponding output result of the model. Therefore, in the embodiment of the present invention, for each of the obtained target fusion images, the image may be input into the tumor gene mutation classification model, and an initial mutation classification result corresponding to that image obtained from the model output. Since the number of target fusion images is at least two, the number of initial mutation classification results is also at least two. To determine the final gene mutation classification result, a first number of target fusion images classified as gene mutation and a second number classified as non-gene mutation can be determined, and the gene mutation classification result of the target tumor obtained from the first number and the second number, further improving the accuracy of tumor gene mutation classification.
In the embodiment of the present invention, the gene mutation classification result of the target tumor may be obtained from the first number and the second number by comparing the two and taking the initial mutation classification result corresponding to the larger number as the gene mutation classification result; alternatively, the first number and the second number may each be compared with a preset number, and the initial mutation classification result whose count exceeds the preset number taken as the gene mutation classification result. It is noted that if the first number is equal to the second number, the gene mutation classification result may be a classification failure.
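A minimal sketch of the first, majority-vote variant described above (the label strings and function name are illustrative, not from the patent):

```python
def vote_gene_mutation(initial_results):
    # initial_results: one initial mutation classification result per target
    # fusion image, either 'mutation' or 'no_mutation'.
    first_number = initial_results.count('mutation')      # images voting mutation
    second_number = initial_results.count('no_mutation')  # images voting non-mutation
    if first_number > second_number:
        return 'mutation'
    if second_number > first_number:
        return 'no_mutation'
    return 'classification_failure'  # tie between the two counts
```

For example, three slice-level results of which two indicate mutation yield an overall result of mutation, while an even split yields a classification failure.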
Example four
FIG. 4 is a flowchart of another method for classifying mutations in tumor genes according to the fourth embodiment of the present invention. The present embodiment is optimized based on the above technical solutions. In this embodiment, optionally, the classification model of tumor gene mutation is obtained by pre-training the following steps: acquiring a sample positron emission tomography image containing a target tumor and a sample electron computer tomography image corresponding to the sample positron emission tomography image, and acquiring a gene mutation classification label of the target tumor, wherein the gene mutation classification label comprises gene mutation or non-gene mutation; fusing the sample positron emission tomography image and the sample electron computer tomography image into a sample fusion image; taking the sample fusion image and the gene mutation classification label as a group of training samples; and training the original gene mutation classification model based on a plurality of groups of training samples to obtain the tumor gene mutation classification model. The same or corresponding terms as those in the above embodiments are not explained in detail herein.
Referring to fig. 4, the method of this embodiment may specifically include the following steps:
S410, acquiring a sample positron emission tomography image containing the target tumor and a sample electron computer tomography image corresponding to the sample positron emission tomography image, and acquiring a gene mutation classification label of the target tumor, wherein the gene mutation classification label comprises gene mutation or non-gene mutation.
Wherein, the sample positron emission tomography image can be understood as a PET image containing the target tumor and used for training the original gene mutation classification model; it may be a two-dimensional image or a three-dimensional image. The sample electron computed tomography image can be understood as the CT image corresponding to the sample positron emission tomography image and containing the target tumor; it may likewise be a two-dimensional image or a three-dimensional image. The gene mutation classification label can be understood as a label reflecting the gene mutation classification of the target tumor in the sample positron emission tomography image and the sample electron computed tomography image; the label may be determined manually in advance.
In the embodiment of the invention, after a sample positron emission tomography image containing a target tumor and a sample electron computer tomography image corresponding to the sample positron emission tomography image are obtained and a gene mutation classification label of the target tumor is obtained, a first sample target tumor area with a preset size can be cut out from the sample positron emission tomography image according to the position of the target tumor in the sample positron emission tomography image, and the cut-out first sample target tumor area is updated to the sample positron emission tomography image; and cutting out a second sample target tumor area with a preset size in the sample electron computed tomography image according to the position of the target tumor in the sample electron computed tomography image, and updating the cut-out second sample target tumor area into the sample electron computed tomography image.
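Illustratively, cutting out a tumor region of a preset size around the tumor position might be sketched as follows. The clamping behaviour at the volume border is an assumption for the sketch, not something the patent specifies, and all names are hypothetical.

```python
import numpy as np

def crop_tumor_region(volume, center, size):
    # Crop a cube of a preset size around the tumor centre, clamping the
    # window so it stays entirely inside the volume (assumed behaviour).
    starts = [min(max(c - size // 2, 0), dim - size)
              for c, dim in zip(center, volume.shape)]
    z, y, x = starts
    return volume[z:z + size, y:y + size, x:x + size]

volume = np.zeros((64, 64, 64))  # hypothetical sample PET volume
# Tumor centre near two borders; the window is shifted to stay in bounds.
region = crop_tumor_region(volume, (10, 30, 60), size=32)
```

The same crop would be applied to the sample CT volume using the tumor position in that image, and the cropped regions then replace the full images for training.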
And S420, fusing the sample positron emission tomography image and the sample electron computer tomography image into a sample fusion image.
The sample fusion image can be understood as an image obtained by fusing a sample positron emission tomography image and a sample electron computer tomography image.
In the embodiment of the invention, considering that the sample positron emission tomography image can provide more detailed molecular information such as functions and metabolism of the target tumor, and the sample electron computed tomography image can provide accurate anatomical positioning of the target tumor, the sample positron emission tomography image and the sample electron computed tomography image can be fused, and the fusion can be, for example, an addition operation, so that the fused sample fusion image can have the characteristic information of the sample positron emission tomography image and the sample electron computed tomography image at the same time, and the classification accuracy of the tumor gene mutation classification model obtained by subsequent training can be higher.
In the embodiment of the invention, the positron emission tomography image of the sample can be sliced to obtain at least two positron emission tomography slice images of the sample under the condition that the positron emission tomography image of the sample is a three-dimensional image and the electron computer tomography image of the sample is a three-dimensional image, and the electron computer tomography image of the sample is sliced to obtain at least two electron computer tomography slice images of the sample, wherein the positron emission tomography slice image of the sample is a two-dimensional image and the electron computer tomography slice image of the sample is a two-dimensional image; and fusing the at least two sample positron emission tomography slice images with the at least two sample electron computer tomography slice images to obtain a sample fused image.
On the basis, the number of the positron emission tomography slice images of the sample can be the same as that of the electron computer tomography slice images of the sample; and for each sample positron emission tomography slice image in the at least two sample positron emission tomography slice images, fusing the sample positron emission tomography slice image with a sample electron computer tomography slice image corresponding to the sample positron emission tomography slice image in the at least two sample electron computer tomography slice images to obtain a sample fused image.
In the embodiment of the invention, before the sample positron emission tomography image and the sample electron computer tomography image are fused into the sample fusion image, the data in the sample positron emission tomography image can be standardized, and the sample positron emission tomography image is updated based on the obtained first sample standardization processing result; normalizing the data in the sample electronic computed tomography image, and updating the sample electronic computed tomography image based on the obtained second sample normalization processing result; and the data of the sample positron emission tomography image and the data of the sample electron computer tomography image both accord with the standard normal distribution.
And S430, taking the sample fusion image and the gene mutation classification label as a group of training samples.
The training samples can be understood as sample data for training to obtain a tumor gene mutation classification model, and a group of training samples can include a sample fusion image and a gene mutation classification label corresponding to the sample fusion image.
In the embodiment of the invention, before the sample fusion image and the gene mutation classification label are used as a group of training samples, the sample fusion image may be subjected to data enhancement by techniques such as flipping and tilting, so as to increase the number of training samples.
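A minimal flip-based sketch of such data enhancement (tilting is omitted here; the function name is illustrative):

```python
import numpy as np

def augment_flips(image):
    # Enlarge the training set from one 2-D sample fusion image:
    # original, left-right flip, up-down flip, and both flips combined.
    return [image,
            np.fliplr(image),
            np.flipud(image),
            np.flipud(np.fliplr(image))]

sample = np.arange(16.0).reshape(4, 4)  # hypothetical sample fusion image
augmented = augment_flips(sample)       # four training images from one
```

Each augmented image keeps the gene mutation classification label of the original sample fusion image.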
S440, training the original gene mutation classification model based on multiple groups of training samples to obtain the tumor gene mutation classification model.
The original gene mutation classification model can be understood as the model to be trained which, once trained, can perform tumor gene mutation classification on the target tumor.
In the embodiment of the invention, the original gene mutation classification model can be trained based on a plurality of groups of training samples to obtain a tumor gene mutation classification model; the original gene mutation classification model can be trained based on part of the training samples in the multiple groups of training samples to obtain the tumor gene mutation classification model, and then the rest training samples in the multiple groups of training samples are used for evaluating the obtained tumor gene mutation classification model.
Illustratively, referring to fig. 5, the tumor gene mutation classification model according to the embodiment of the present invention may include one convolution layer with a convolution kernel size of 3 × 3, three fused inverted residual layers, two inverted linear bottleneck layers, and one fully-connected layer. The target fusion image is input into the tumor gene mutation classification model and processed by the 3 × 3 convolution layer, the three fused inverted residual layers with 3 × 3 convolution kernels, and the two inverted linear bottleneck layers with 3 × 3 convolution kernels; the resulting features are then integrated by the fully-connected layer, and the output result of the tumor gene mutation classification model is obtained from the integrated features.
Referring to fig. 6, the inverted linear bottleneck layer may include a compression and excitation network, three convolution layers, each followed by a Batch Normalization (BN) layer, and a random deactivation (dropout) layer; the outputs of the first and second convolution layers may be processed by a SiLU (Sigmoid Weighted Linear Unit) activation function. The first convolution layer is an expansion convolution layer with a kernel size of 1 × 1 and a stride of 1 that raises the feature dimension; the second is a depthwise separable convolution layer with a kernel size of 3 × 3, comprising a convolution with stride 1 and a convolution with stride 2; and the third is a Project convolution layer with a kernel size of 1 × 1 and a stride of 1 that performs the dimension-reduction operation. In the inverted linear bottleneck layer, the output of the random deactivation layer can be connected with the primary features by a residual connection to form new features, and all outputs of the inverted linear bottleneck layer can be stitched into a depth feature map. This scheme considers that convolution kernels of different sizes have different receptive fields and therefore obtain different types of information from the input data, so convolving and splicing the various output feature maps in parallel allows the tumor gene mutation classification model to achieve better image representation. Connecting the output of the third convolution layer with the primary features by a residual connection alleviates the vanishing-gradient problem during training and avoids the performance degradation caused by a deep network.
Referring to fig. 7, when the number of channels is not increased (i.e., Expansion = 1), the fused inverted residual layer may include one convolution layer and a random deactivation layer; the convolution layer is followed by a BN layer, and its output may be processed by the SiLU activation function. This convolution layer is a Project convolution layer with a kernel size of 3 × 3, comprising a convolution with stride 1 and a convolution with stride 2. In the fused inverted residual layer, the output of the random deactivation layer can be connected with the primary features by a residual connection to form new features, and all outputs of the fused inverted residual layer can be stitched into a depth feature map. When the number of channels needs to be increased (i.e., Expansion ≠ 1), the fused inverted residual layer may include two convolution layers and a random deactivation layer; each convolution layer is followed by a BN layer, and the output of the first convolution layer may be processed by the SiLU activation function. The first convolution layer is an expansion convolution layer with a kernel size of 3 × 3, comprising a convolution with stride 1 and a convolution with stride 2, and the second is a Project convolution layer with a kernel size of 1 × 1 and a stride of 1. Here too, the output of the random deactivation layer can be connected with the primary features by a residual connection to form new features, and all outputs of the fused inverted residual layer can be stitched into a depth feature map.
The compression and excitation network may include a compression module and an excitation module; the compression and excitation network can enable the channel characteristics of the characteristic diagram to have higher characteristic extraction capability according to the weight coefficient added to each characteristic diagram on the channel dimension, and can amplify effective characteristic information and reduce ineffective characteristic information. The compression module may include a global averaging pooling layer for performing the compression operation. The excitation module may include two fully-connected layers and two activation functions, which may constitute a bottleneck structure to achieve correlation between channels and output and input the same number of weights as features, and the mechanism in the excitation module is similar to that of gates in a recurrent neural network. For example, referring to fig. 8, the dimension of the feature is W × H × C, and the dimension of the feature is compressed to 1 × C through the global averaging pooling layer in the compression module, which may be represented by the formula:
Figure BDA0003962264470000111
Figure BDA0003962264470000112
to indicate the manner in which, among others,
The squeeze operation can be realized by the formula z_c = F_sq(u_c) = (1/(H × W)) Σ_{i=1}^{H} Σ_{j=1}^{W} u_c(i, j), i.e., a feature map statistic is obtained by global average pooling, where H is the height of the feature map, W is the width of the feature map, and u_c is the feature map of the c-th filter. The pooled feature is then input into a first fully-connected layer in the excitation module, which performs dimensionality reduction and outputs a feature map with a dimension of 1 × 1 × (C × SERatio), where SERatio is a scaling parameter; this bottleneck gives the module more non-linearity, allows it to better fit the complex correlations among channels, and greatly reduces the number of parameters and the amount of computation. The reduced feature map with the dimension of 1 × 1 × (C × SERatio) is then input into a second fully-connected layer, which restores the original dimension and outputs a feature map with a dimension of 1 × 1 × C; this output is processed by a Sigmoid function to obtain normalized weights between 0 and 1. The process can be realized by the formula: s = F_ex(z, W) = σ(g(z, W)) = σ(W₂δ(W₁z)), where z is the global attention information corresponding to the feature map, δ is the ReLU activation function, σ is the Sigmoid activation function, W₁ and W₂ represent the first and the second fully-connected operations respectively, and s represents the importance degree of each feature map, namely the weight value. Alternatively, the Sigmoid function can be replaced by a ReLU activation function to process the feature map. Finally, the normalized weights are applied to the features of each channel through a channel-wise weight multiplication, i.e., the output weight matrix of the Sigmoid activation function is multiplied with the original feature map through a Scale operation to obtain a new weighted feature map. The process can be realized by the formula: x̃_c = F_scale(u_c, s_c) = s_c · u_c, where x̃_c represents the new feature map and s_c represents the weight matrix.
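The squeeze, excitation, and scale steps described above can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the patent's actual implementation: the weight matrices `w1`/`w2` stand in for the two fully-connected layers, and the reduction ratio is a placeholder.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(u, w1, w2):
    """Squeeze-and-excitation over a feature map u of shape (C, H, W).

    w1 has shape (C_reduced, C) and w2 has shape (C, C_reduced),
    standing in for the first and second fully-connected layers.
    """
    # Squeeze: global average pooling, z_c = (1 / (H * W)) * sum_ij u_c(i, j)
    z = u.mean(axis=(1, 2))            # shape (C,)
    # Excitation: s = sigmoid(W2 . relu(W1 . z)), normalized weights in (0, 1)
    s = sigmoid(w2 @ relu(w1 @ z))     # shape (C,)
    # Scale: x~_c = s_c * u_c, channel-wise reweighting of the feature map
    return s[:, None, None] * u, s
```

In a real network the two matrix products would be learned `1 × 1` convolutions or linear layers, but the data flow is exactly the squeeze → excitation → scale chain the text describes.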
In the process of training the original gene mutation classification model based on multiple groups of training samples to obtain the tumor gene mutation classification model, the training samples can be a given training data set D = {(x₁, y₁), (x₂, y₂), …, (xₙ, yₙ)}, where x = {x₁, x₂, …, xₙ} are the lesion features, ŷ = {ŷ₁, ŷ₂, …, ŷₙ} are the estimated values after the neural network extracts the features, y = {y₁, y₂, …, yₙ} are the real lesion categories obtained from the clinical reports, and n is the total number of training samples. The cross-entropy error can be minimized through the loss function L = −(1/n) Σ_{i=1}^{n} [yᵢ log ŷᵢ + (1 − yᵢ) log(1 − ŷᵢ)]. An Adaptive Moment Estimation (Adam) optimizer may be employed to optimize the loss function.
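The loss and optimizer just mentioned can be sketched as follows. This is a hedged illustration of the standard binary cross-entropy and of a single Adam update step, not the patent's exact training code; in practice both would come from a deep-learning framework.

```python
import numpy as np

def bce_loss(y_true, y_pred, eps=1e-7):
    """Cross-entropy error between real labels y and network estimates y-hat."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adaptive Moment Estimation (Adam) update of parameters theta."""
    m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2     # second-moment (variance) estimate
    m_hat = m / (1 - b1 ** t)             # bias correction for step t
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Training alternates the two: compute `bce_loss` over a batch, backpropagate to get `grad`, then apply `adam_step` to each parameter tensor.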
Illustratively, referring to fig. 9, taking the prediction of a gene mutation classification such as the Epidermal Growth Factor Receptor (EGFR) mutation of non-small cell lung cancer (NSCLC) as an example, it can be seen, by extracting the attention maps of the tumor gene mutation classification model, that the model attends well to the lesions in the images. Referring to fig. 10 and fig. 11, ten-fold cross-validation can be used to verify the tumor gene mutation classification model, and the mean accuracy after verification is as high as 0.81.
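The ten-fold cross-validation protocol can be sketched as a simple index split. This is illustrative only; the actual fold assignment used in the reported validation is not given in the text.

```python
import numpy as np

def kfold_indices(n_samples, k=10, seed=0):
    """Yield (train, validation) index pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(n_samples)       # shuffle sample indices once
    folds = np.array_split(order, k)         # k near-equal folds
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val
```

The model is retrained k times, once per split, and the reported 0.81 would then be the mean of the k validation accuracies.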
S450, acquiring a target positron emission tomography image containing the target tumor and a target electron computer tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model.
And S460, fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image.
S470, inputting the target fusion image into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
According to the technical scheme of the embodiment of the invention, the tumor gene mutation classification model is obtained by pre-training the following steps: acquiring a sample positron emission tomography image containing a target tumor and a sample electron computer tomography image corresponding to the sample positron emission tomography image, and acquiring a gene mutation classification label of the target tumor, wherein the gene mutation classification label comprises a gene mutation or a non-gene mutation; fusing the sample positron emission tomography image and the sample electron computer tomography image into a sample fusion image; taking the sample fusion image and the gene mutation classification label as a group of training samples; and training the original gene mutation classification model based on multiple groups of training samples to obtain the tumor gene mutation classification model. The sample positron emission tomography image and the sample electron computer tomography image are fused into a sample fusion image, and the sample fusion image and the gene mutation classification label are used as a group of training samples for training an original gene mutation classification model, so that the classification accuracy of the tumor gene mutation classification model can be improved, and the classification accuracy of the tumor gene mutation is further improved.
In order to better understand the technical solutions of the embodiments of the present invention, an alternative example is provided herein. For example, a CT image containing a target tumor and a PET image corresponding to the CT image may be acquired, together with a gene mutation classification label of the target tumor; preprocessing operations such as fusion are performed on the CT image and the PET image to obtain a sample fusion image; the sample fusion image and the gene mutation classification label are taken as a group of training samples; and the original gene mutation classification model is trained based on multiple groups of training samples to obtain the tumor gene mutation classification model. Referring to fig. 12, the process of obtaining a sample fusion image through preprocessing operations such as fusion on the CT image and the PET image includes: segmenting a first sample target tumor region and a second sample target tumor region of a preset size from the CT image containing the sample tumor and from the PET image corresponding to the CT image respectively, and updating these regions into the CT image and the PET image respectively; adding and fusing the CT image and the PET image to obtain a sample fusion image with RGB (red, green and blue) attributes; slicing the sample fusion image into at least two two-dimensional sample fusion images; and performing data enhancement on the at least two two-dimensional sample fusion images through techniques such as flipping and tilting.
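The preprocessing chain in fig. 12 — cropping a fixed-size tumor region, fusing CT and PET into a three-channel image, slicing, and flip-based augmentation — might look like the following sketch. The region size, the channel assignment, and the exact fusion rule (here CT, PET, and their mean as the three channels) are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def crop_region(volume, center, size=32):
    """Crop a cube of side `size` around a (z, y, x) tumor position."""
    slices = tuple(slice(c - size // 2, c - size // 2 + size) for c in center)
    return volume[slices]

def fuse_to_rgb(ct2d, pet2d):
    """Fuse one CT slice and one PET slice into a 3-channel (RGB-like) image."""
    return np.stack([ct2d, pet2d, (ct2d + pet2d) / 2.0], axis=-1)

def preprocess(ct_vol, pet_vol, center, size=32):
    ct_roi = crop_region(ct_vol, center, size)
    pet_roi = crop_region(pet_vol, center, size)
    # Slice both regions along the first (axial) axis and fuse slice pairs
    fused = [fuse_to_rgb(c, p) for c, p in zip(ct_roi, pet_roi)]
    # Data enhancement: keep the original plus two flipped variants per slice
    augmented = []
    for img in fused:
        augmented += [img, np.flip(img, axis=0), np.flip(img, axis=1)]
    return augmented
```

A production pipeline would also clip the crop at volume boundaries and add the tilt (rotation) augmentations the text mentions.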
EXAMPLE five
Fig. 13 is a block diagram showing the structure of a tumor gene mutation classification device according to a fifth embodiment of the present invention, the device being used for performing the tumor gene mutation classification method according to any of the above-mentioned embodiments. The device and the tumor gene mutation classification methods of the above embodiments belong to the same inventive concept; for details not described in this embodiment of the tumor gene mutation classification device, reference may be made to the above embodiments of the tumor gene mutation classification method. Referring to fig. 13, the device may specifically include: a tumor gene mutation classification model obtaining module 510, a target fusion image fusion module 520 and a gene mutation classification result determining module 530.
The tumor gene mutation classification model obtaining module 510 is configured to obtain a target positron emission tomography image including a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image, and obtain a trained tumor gene mutation classification model;
a target fusion image fusion module 520, configured to fuse the target positron emission tomography image and the target electron computed tomography image into a target fusion image;
and the gene mutation classification result determining module 530 is configured to input the target fusion image into the tumor gene mutation classification model, and determine a gene mutation classification result of the target tumor according to an output result of the tumor gene mutation classification model.
Optionally, the target positron emission tomography image is a three-dimensional image, and the target electron computed tomography image is a three-dimensional image;
the target fusion image fusion module 520 may include:
an electron computer tomography slice image obtaining unit, configured to slice the target positron emission tomography image to obtain at least two positron emission tomography slice images, and slice the target electron computer tomography image to obtain at least two electron computer tomography slice images, where each positron emission tomography slice image is a two-dimensional image and each electron computer tomography slice image is a two-dimensional image;
and the target fusion image obtaining unit is used for fusing the at least two positron emission tomography slice images with the at least two electron computer tomography slice images to obtain a target fusion image.
On the basis of the scheme, optionally, the number of the positron emission tomography slice images is the same as that of the electron computer tomography slice images;
the target fusion image obtaining unit may include:
a target fusion image obtaining subunit, configured to fuse, for each positron emission tomography slice image of the at least two positron emission tomography slice images, the positron emission tomography slice image with an electronic computed tomography slice image corresponding to the positron emission tomography slice image of the at least two electronic computed tomography slice images, so as to obtain a target fusion image;
the gene mutation classification result determination module 530 may include:
and the gene mutation classification result determining unit is used for inputting the obtained at least two target fusion images into the tumor gene mutation classification model and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
On the basis of the above scheme, optionally, the gene mutation classification result determining unit may include:
an initial mutation classification result obtaining subunit, configured to input the target fusion image into the tumor gene mutation classification model for each of the obtained at least two target fusion images, and obtain an initial mutation classification result corresponding to the target fusion image according to an output result of the tumor gene mutation classification model, where the initial mutation classification result includes a gene mutation or a non-gene mutation;
and the gene mutation classification result obtaining subunit is used for determining a first number of target fusion images corresponding to gene mutation and a second number of target fusion images corresponding to non-gene mutation in at least two target fusion images, and obtaining a gene mutation classification result of the target tumor according to the first number and the second number.
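The aggregation the subunit above describes — counting the first number (mutation votes) against the second number (non-mutation votes) over the slice-level results — amounts to a majority vote. A minimal sketch follows; the tie-breaking rule here (ties default to non-mutation) is an assumption, since the patent does not specify one.

```python
def aggregate_votes(slice_results):
    """Combine per-fusion-image results ('mutation' / 'non-mutation') by majority vote."""
    first = sum(r == "mutation" for r in slice_results)   # first number
    second = len(slice_results) - first                   # second number
    return "mutation" if first > second else "non-mutation"
```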
Optionally, the tumor gene mutation classification device may further include:
the first target positron emission tomography image updating module is used for carrying out standardization processing on data in the target positron emission tomography image before the target positron emission tomography image and the target electron computed tomography image are fused into a target fusion image, and updating the target positron emission tomography image based on an obtained first standardization processing result;
and,
the first target electronic computed tomography image updating module is used for carrying out standardization processing on data in the target electronic computed tomography image and updating the target electronic computed tomography image based on the obtained second standardization processing result;
and the data in the target positron emission tomography image and the data in the target electron computer tomography image are both in accordance with the standard normal distribution.
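The standardization step — bringing both image volumes to a standard normal distribution — is ordinarily a z-score transform over the volume's voxels. A minimal sketch, assuming per-volume (rather than per-slice) statistics:

```python
import numpy as np

def standardize(volume, eps=1e-8):
    """Shift to zero mean and unit variance so the data follow a standard normal scale."""
    return (volume - volume.mean()) / (volume.std() + eps)
```

Applying this separately to the PET and CT volumes before fusion puts their very different intensity ranges (SUV values vs. Hounsfield units) on a comparable scale.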
Optionally, the tumor gene mutation classification device may further include:
the second target positron emission tomography image updating module is used for segmenting a first target tumor area with a preset size in the target positron emission tomography image according to the position of the target tumor in the target positron emission tomography image after acquiring the target positron emission tomography image containing the target tumor and the target electron computed tomography image corresponding to the target positron emission tomography image, and updating the segmented first target tumor area into the target positron emission tomography image;
and the second target electronic computed tomography image updating module is used for segmenting a second target tumor area with a preset size in the target electronic computed tomography image according to the position of the target tumor in the target electronic computed tomography image, and updating the segmented second target tumor area into the target electronic computed tomography image.
Optionally, the tumor gene mutation classification device may further include a tumor gene mutation classification model obtained by pre-training the following modules:
a gene mutation classification label obtaining module of the target tumor, which is used for obtaining a sample positron emission tomography image containing the target tumor and a sample electron computer tomography image corresponding to the sample positron emission tomography image and obtaining a gene mutation classification label of the target tumor, wherein the gene mutation classification label comprises gene mutation or non-gene mutation;
the sample fusion image fusion module is used for fusing the sample positron emission tomography image and the sample electron computer tomography image into a sample fusion image;
the training sample is used as a module for taking the sample fusion image and the gene mutation classification label as a group of training samples;
and the tumor gene mutation classification model obtaining module is used for training the original gene mutation classification model based on a plurality of groups of training samples to obtain the tumor gene mutation classification model.
The tumor gene mutation classification device provided by the fifth embodiment of the invention acquires a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image through a tumor gene mutation classification model acquisition module, and acquires a trained tumor gene mutation classification model; fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image through a target fusion image fusion module; and inputting the target fusion image into the tumor gene mutation classification model through a gene mutation classification result determination module, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model. The device fuses the target positron emission tomography image and the target electron computer tomography image, so that the fused target fused image has more characteristic information, and the accuracy of tumor gene mutation classification is improved.
The tumor gene mutation classification device provided by the embodiment of the invention can execute the tumor gene mutation classification method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, in the embodiment of the tumor gene mutation classification apparatus, the included units and modules are only divided according to the functional logic, but are not limited to the above division, as long as the corresponding functions can be achieved; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
EXAMPLE six
FIG. 14 illustrates a block diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 14, the electronic device 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from a storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic device 10 can also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as the tumor gene mutation classification method.
In some embodiments, the tumor gene mutation classification method may be implemented as a computer program that is tangibly embodied in a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the above-described tumor gene mutation classification method may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the tumor gene mutation classification method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system, thereby overcoming the defects of difficult management and weak service scalability in traditional physical hosts and VPS (Virtual Private Server) services.
It should be understood that various forms of the flows shown above, reordering, adding or deleting steps, may be used. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired result of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for classifying a tumor gene mutation, comprising:
acquiring a target positron emission tomography image containing a target tumor and a target electron computer tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model;
fusing the target positron emission tomography image and the target electron computed tomography image into a target fusion image;
and inputting the target fusion image into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
2. The method of claim 1, wherein the target positron emission tomography image is a three-dimensional image and the target electron computed tomography image is a three-dimensional image;
the fusing the target positron emission tomography image and the target electron computed tomography image into a target fused image comprises:
slicing the target positron emission tomography image to obtain at least two positron emission tomography slice images, and slicing the target electron computer tomography image to obtain at least two electron computer tomography slice images, wherein the positron emission tomography slice images are two-dimensional images, and the electron computer tomography slice images are two-dimensional images;
and fusing the at least two positron emission tomography slice images with the at least two electron computer tomography slice images to obtain a target fusion image.
3. The method of claim 2, wherein the number of positron emission tomography slice images is the same as the number of electron computed tomography slice images;
the fusing the at least two positron emission tomography slice images with the at least two electron computer tomography slice images to obtain a target fused image comprises:
for each positron emission tomography slice image in the at least two positron emission tomography slice images, fusing the positron emission tomography slice image with an electronic computer tomography slice image corresponding to the positron emission tomography slice image in the at least two electronic computer tomography slice images to obtain a target fusion image;
the inputting the target fusion image into the tumor gene mutation classification model and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model comprises the following steps:
and inputting the obtained at least two target fusion images into the tumor gene mutation classification model, and determining a gene mutation classification result of the target tumor according to an output result of the tumor gene mutation classification model.
4. The method according to claim 3, wherein the inputting the obtained at least two target fusion images into the tumor gene mutation classification model, and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model comprises:
for each target fusion image in the at least two target fusion images, inputting the target fusion image into the tumor gene mutation classification model, and obtaining an initial mutation classification result corresponding to the target fusion image according to an output result of the tumor gene mutation classification model, wherein the initial mutation classification result comprises gene mutation or non-gene mutation;
determining a first number of target fusion images corresponding to the gene mutation and a second number of target fusion images corresponding to the non-gene mutation in the at least two target fusion images, and obtaining a gene mutation classification result of the target tumor according to the first number and the second number.
5. The method of claim 1, further comprising, prior to said fusing said target positron emission tomography image and said target electron computed tomography image into a target fused image:
carrying out standardization processing on data in the target positron emission tomography image, and updating the target positron emission tomography image based on an obtained first standardization processing result;
and,
carrying out standardization processing on data in the target electron computed tomography image, and updating the target electron computed tomography image based on an obtained second standardization processing result;
and the data of the target positron emission tomography image and the data of the target electron computer tomography image are in accordance with standard normal distribution.
6. The method of claim 1, further comprising, after said acquiring a target positron emission tomography image including a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image:
segmenting a first target tumor region with a preset size in the target positron emission tomography image according to the position of the target tumor in the target positron emission tomography image, and updating the segmented first target tumor region into the target positron emission tomography image;
and segmenting a second target tumor region with the preset size from the target electron computed tomography image according to the position of the target tumor in the target electron computed tomography image, and updating the segmented second target tumor region into the target electron computed tomography image.
7. The method of claim 1, wherein the classification model of tumor gene mutation is pre-trained by the following steps:
acquiring a sample positron emission tomography image containing the target tumor and a sample electron computer tomography image corresponding to the sample positron emission tomography image, and acquiring a gene mutation classification label of the target tumor, wherein the gene mutation classification label comprises a gene mutation or a non-gene mutation;
fusing the sample positron emission tomography image and the sample electron computed tomography image into a sample fused image;
taking the sample fusion image and the gene mutation classification label as a group of training samples;
and training an original gene mutation classification model based on a plurality of groups of training samples to obtain the tumor gene mutation classification model.
8. A tumor gene mutation classification device, comprising:
the tumor gene mutation classification model acquisition module is used for acquiring a target positron emission tomography image containing a target tumor and a target electron computed tomography image corresponding to the target positron emission tomography image, and acquiring a trained tumor gene mutation classification model;
a target fusion image fusion module for fusing the target positron emission tomography image and the target electron computer tomography image into a target fusion image;
and the gene mutation classification result determining module is used for inputting the target fusion image into the tumor gene mutation classification model and determining the gene mutation classification result of the target tumor according to the output result of the tumor gene mutation classification model.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to cause the at least one processor to perform the method of classifying a mutation in a tumor gene of any one of claims 1-7.
10. A computer-readable storage medium having stored thereon computer instructions for causing a processor to perform the method for classifying a tumor gene mutation according to any one of claims 1-7.
CN202211485858.2A 2022-11-24 2022-11-24 Tumor gene mutation classification method and device, electronic equipment and storage medium Pending CN115761360A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211485858.2A CN115761360A (en) 2022-11-24 2022-11-24 Tumor gene mutation classification method and device, electronic equipment and storage medium
PCT/CN2023/133473 WO2024109859A1 (en) 2022-11-24 2023-11-22 Tumor gene mutation classification method and apparatus, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
CN115761360A 2023-03-07

Family

ID=85337480

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211485858.2A Pending CN115761360A (en) 2022-11-24 2022-11-24 Tumor gene mutation classification method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115761360A (en)
WO (1) WO2024109859A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024109859A1 (en) * 2022-11-24 2024-05-30 深圳先进技术研究院 Tumor gene mutation classification method and apparatus, electronic device, and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109949288A (en) * 2019-03-15 2019-06-28 上海联影智能医疗科技有限公司 Tumor type determines system, method and storage medium
US11170503B2 (en) * 2019-10-30 2021-11-09 International Business Machines Corporation Systems and methods for detection likelihood of malignancy in a medical image
CN111210441A (en) * 2020-01-02 2020-05-29 苏州瑞派宁科技有限公司 Tumor prediction method and device, cloud platform and computer-readable storage medium
CN111260636B (en) * 2020-01-19 2023-07-25 郑州大学 Model training method and device, image processing method and device, and medium
CN112581458B (en) * 2020-12-24 2024-03-26 清华大学 Image processing method and device
CN115761360A (en) * 2022-11-24 2023-03-07 深圳先进技术研究院 Tumor gene mutation classification method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
WO2024109859A1 (en) 2024-05-30

Similar Documents

Publication Publication Date Title
Moghadam et al. A morphology focused diffusion probabilistic model for synthesis of histopathology images
CN112070231B (en) Data slicing for machine learning performance testing and improvement
CN108280477B (en) Method and apparatus for clustering images
CN110059697B (en) Automatic lung nodule segmentation method based on deep learning
CN114341952A (en) System and method for processing images of slides to infer biomarkers
CN111368254B (en) Multi-view data missing completion method for multi-manifold regularization non-negative matrix factorization
CN113065614B (en) Training method of classification model and method for classifying target object
CN111291825A (en) Focus classification model training method and device, computer equipment and storage medium
CN108564102A (en) Image clustering evaluation of result method and apparatus
CN108985190B (en) Target identification method and device, electronic equipment and storage medium
CN111369003A (en) Method and device for determining fidelity of quantum bit reading signal
CN110348516B (en) Data processing method, data processing device, storage medium and electronic equipment
CN113362314B (en) Medical image recognition method, recognition model training method and device
WO2024109859A1 (en) Tumor gene mutation classification method and apparatus, electronic device, and storage medium
US20240054639A1 (en) Quantification of conditions on biomedical images across staining modalities using a multi-task deep learning framework
CN116415020A (en) Image retrieval method, device, electronic equipment and storage medium
CN115881304B (en) Risk assessment method, device, equipment and medium based on intelligent detection
CN116468690B (en) Subtype analysis system of invasive non-mucous lung adenocarcinoma based on deep learning
CN112614570A (en) Sample set labeling method, pathological image classification method and classification model construction method and device
Zhang et al. Multiple Morphological Constraints‐Based Complex Gland Segmentation in Colorectal Cancer Pathology Image Analysis
CN108764301A Concrete crack detection method based on reverse sparse representation
CN114972921A (en) Chromosome analysis method, apparatus and storage medium
CN114022284A (en) Abnormal transaction detection method and device, electronic equipment and storage medium
CN113658338A (en) Point cloud tree monomer segmentation method and device, electronic equipment and storage medium
CN113762305A (en) Method and device for determining alopecia type

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination