CN113223614A - Chromosome karyotype analysis method, system, terminal device and storage medium - Google Patents
- Publication number
- CN113223614A (application number CN202110600361.XA)
- Authority
- CN
- China
- Prior art keywords
- network
- chromosome
- target
- anchor
- karyotype analysis
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 210000000349 chromosome Anatomy 0.000 title claims abstract description 137
- 238000004458 analytical method Methods 0.000 title claims abstract description 82
- 238000004422 calculation algorithm Methods 0.000 claims abstract description 38
- 238000000605 extraction Methods 0.000 claims abstract description 37
- 238000001514 detection method Methods 0.000 claims abstract description 29
- 238000000034 method Methods 0.000 claims abstract description 28
- 238000010586 diagram Methods 0.000 claims abstract description 22
- 230000004927 fusion Effects 0.000 claims abstract description 19
- 230000011218 segmentation Effects 0.000 claims description 29
- 238000012545 processing Methods 0.000 claims description 22
- 238000012216 screening Methods 0.000 claims description 21
- 238000005070 sampling Methods 0.000 claims description 14
- 238000010276 construction Methods 0.000 claims description 11
- 238000004590 computer program Methods 0.000 claims description 6
- 230000001629 suppression Effects 0.000 claims description 5
- 238000011156 evaluation Methods 0.000 claims description 4
- 238000013527 convolutional neural network Methods 0.000 description 62
- 230000031864 metaphase Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 4
- 238000011176 pooling Methods 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 238000013528 artificial neural network Methods 0.000 description 3
- 238000003062 neural network model Methods 0.000 description 3
- 230000009466 transformation Effects 0.000 description 3
- 239000013598 vector Substances 0.000 description 3
- 210000004027 cell Anatomy 0.000 description 2
- 238000012937 correction Methods 0.000 description 2
- 238000005034 decoration Methods 0.000 description 2
- 238000013135 deep learning Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000010606 normalization Methods 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 210000001766 X chromosome Anatomy 0.000 description 1
- 210000002593 Y chromosome Anatomy 0.000 description 1
- 210000004369 blood Anatomy 0.000 description 1
- 239000008280 blood Substances 0.000 description 1
- 210000001185 bone marrow Anatomy 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000032823 cell division Effects 0.000 description 1
- 210000002230 centromere Anatomy 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 210000001726 chromosome structure Anatomy 0.000 description 1
- 238000007635 classification algorithm Methods 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000003745 diagnosis Methods 0.000 description 1
- 230000005764 inhibitory process Effects 0.000 description 1
- 238000002372 labelling Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000001000 micrograph Methods 0.000 description 1
- 230000011278 mitosis Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 238000010200 validation analysis Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16B—BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
- G16B20/00—ICT specially adapted for functional genomics or proteomics, e.g. genotype-phenotype associations
- G16B20/20—Allele or variant detection, e.g. single nucleotide polymorphism [SNP] detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Data Mining & Analysis (AREA)
- Computational Linguistics (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Genetics & Genomics (AREA)
- Proteomics, Peptides & Aminoacids (AREA)
- Bioinformatics & Computational Biology (AREA)
- Biotechnology (AREA)
- Evolutionary Biology (AREA)
- Medical Informatics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a chromosome karyotype analysis method, system, terminal device and storage medium, wherein the method comprises the following steps: performing feature extraction and feature fusion on a chromosome microscopic image to output a target feature map; generating a plurality of anchor frames from the target feature map, and extracting a plurality of regions of interest of the same size from the anchor frames through a preset detection algorithm; constructing a chromosome karyotype analysis model from the regions of interest and the feature information; and inputting a chromosome microscopic image to be identified into the chromosome karyotype analysis model and outputting the corresponding chromosome karyotype analysis result. The invention automatically classifies, segments and analyzes chromosome karyotypes, improving the accuracy of chromosome karyotype detection.
Description
Technical Field
The invention relates to the field of computer vision and digital image processing, in particular to a chromosome karyotype analysis method, a chromosome karyotype analysis system, terminal equipment and a storage medium.
Background
Karyotype analysis is generally based on the appearance of chromosomes in the metaphase of mitosis: the chromosomes are paired, ordered and numbered according to features such as their number, morphology and size to produce a karyotype map, and a diagnostic result is derived from the chromosome structure, chromosome count and other information represented by that map. Early karyotype analysis relied mainly on manual work, which is time-consuming and demands a high level of professional skill from practitioners. To reduce the workload of doctors and improve work efficiency, it is therefore desirable to raise the degree of automation and intelligence of chromosome karyotype analysis by means of deep learning.
Intelligent chromosome karyotype analysis is mainly implemented in two ways. The first uses traditional image processing algorithms, segmenting and pairing chromosomes mainly on the basis of features such as area, perimeter, centromere index and texture information as classification criteria. The second builds a neural network model based on deep learning: a database of chromosomes is established and a segmentation and classification network is trained on it to realize intelligent karyotype analysis. Within this second class there are multiple implementations; most existing classification algorithms perform classification and matching on already-segmented chromosomes, so the network structure is complex, the accuracy of classification and matching is low, and the feasibility of engineering deployment is limited.
Therefore, those skilled in the art need to provide a network model that integrates chromosome segmentation and classification, improving both the classification performance and the segmentation performance of chromosome karyotype analysis and facilitating engineering application of the model.
Disclosure of Invention
The invention aims to provide a chromosome karyotype analysis method, a chromosome karyotype analysis system, terminal equipment and a storage medium, so that the chromosome karyotype can be automatically classified, segmented and analyzed, and the accuracy of chromosome karyotype detection is improved.
The technical scheme provided by the invention is as follows:
the invention provides a chromosome karyotype analysis method, which comprises the following steps:
carrying out feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
generating a plurality of anchor frames according to the target feature map, and extracting a plurality of regions of interest with the same size from the anchor frames through a preset detection algorithm;
constructing a chromosome karyotype analysis model according to the region of interest and the characteristic information;
and inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model, and outputting a corresponding chromosome karyotype analysis result.
The present invention also provides a karyotype analysis system, comprising:
the fusion module is used for performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
the extraction module is used for generating a plurality of anchor frames according to the target feature map and extracting a plurality of regions of interest with the same size from the anchor frames through a preset detection algorithm;
the construction module is used for constructing a chromosome karyotype analysis model according to the region of interest and the characteristic information;
and the processing module is used for inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model and outputting a corresponding chromosome karyotype analysis result.
The invention also provides a terminal device, which comprises a processor, a memory and a computer program stored in the memory and capable of running on the processor, wherein the processor is used for executing the computer program stored in the memory and realizing the operation executed by the chromosome karyotype analysis method.
The present invention also provides a storage medium having at least one instruction stored therein, which is loaded and executed by a processor to perform the operations performed by the karyotyping method.
By the chromosome karyotype analysis method, the chromosome karyotype analysis system, the terminal equipment and the storage medium, the chromosome karyotype can be automatically classified, segmented and analyzed, and the accuracy of chromosome karyotype detection is improved.
Drawings
The above features, technical features, advantages and implementations of a karyotype analysis method, system, terminal device and storage medium will be further described in the following detailed description of preferred embodiments with reference to the accompanying drawings.
FIG. 1 is a flow chart of an embodiment of a karyotype analysis method of the present invention.
FIG. 2 is a diagram of a Backbone network structure of the chromosome karyotype analysis method of the present invention;
FIG. 3 is a diagram of a Backbone + FPN network structure of the karyotype analysis method of the present invention;
FIG. 4 is a schematic structural diagram of the Faster R-CNN network in a karyotype analysis method according to the present invention;
FIG. 5 is a schematic structural diagram of the classifier and mask branch structures of the karyotype analysis method of the present invention;
FIG. 6 is a diagram of the Mask Scoring R-CNN network structure according to the karyotype analysis method of the present invention;
FIG. 7 shows the result of chromosome detection by the karyotype analysis method of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention, and they do not represent the actual structure as a product. In addition, in order to make the drawings concise and understandable, components having the same structure or function in some of the drawings are only schematically illustrated or only labeled. In the present invention, "one" means not only "only one" but also a case of "more than one".
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items. In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will be made with reference to the accompanying drawings. It is obvious that the drawings in the following description are only some examples of the invention, and that for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
In one embodiment of the present invention, as shown in FIG. 1, a karyotype analysis method includes:
s100, performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
specifically, a microscope image of metaphase of cell division is shot by a trinocular microscope and a camera installed on the microscope to obtain a chromosome microscopic image, specifically, a chromosome smear prepared by a conventional method is placed under the trinocular microscope for observation, and the chromosome smear under the microscope is shot by the camera to obtain the chromosome microscopic image.
The chromosome microscopic image may be a color chromosome microscopic image captured by a color digital camera mounted on the microscope, or a black-and-white chromosome microscopic image captured by a black-and-white digital camera.
A terminal device such as a computer or server receives the chromosome microscopic images from the camera. Alternatively, a small number of chromosome microscopic images can be received from the camera and a large number obtained by applying data enhancement processing to them. In particular, data enhancement processes include, but are not limited to, geometric transformation enhancement (including, but not limited to, flipping and rotation), color transformation enhancement (noise, blurring, color change, erasing, padding), and data enhancement through generative adversarial networks (GANs).
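As a minimal sketch of the geometric and noise enhancements listed above (the function names, parameters and modes are illustrative, not taken from the patent or any particular library):

```python
import numpy as np

def augment_geometric(img, mode):
    # Illustrative geometric augmentations: horizontal/vertical flip and 90-degree rotation.
    if mode == "hflip":
        return img[:, ::-1]
    if mode == "vflip":
        return img[::-1, :]
    if mode == "rot90":
        return np.rot90(img)
    return img

def augment_noise(img, sigma=5.0, seed=0):
    # Additive Gaussian noise, clipped back to the valid 8-bit range.
    rng = np.random.default_rng(seed)
    noisy = img.astype(np.float64) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)
```

Each augmented copy counts as a new training image, which is how a small set of captured smears can be expanded into a larger training set.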
S200, generating a plurality of Anchor frames Anchor according to the target feature map, and extracting a plurality of interesting regions ROI with the same size from the Anchor frames Anchor through a preset detection algorithm;
s300, constructing a chromosome karyotype analysis model Mask Scoring R-CNN according to the ROI and the characteristic information;
S400, inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN, and outputting a corresponding chromosome karyotype analysis result.
The invention establishes the chromosome karyotype analysis model Mask Scoring R-CNN from a plurality of chromosome microscopic images and their corresponding feature information, and can analyze and detect chromosome karyotypes in real time. A target feature map is obtained by feature extraction and fusion from the chromosome microscopic image, anchor frames Anchor are generated and regions of interest ROI are determined; the terminal device then constructs the Mask Scoring R-CNN model from the regions of interest ROI and the feature information. The terminal device can thus input the chromosome microscopic image to be identified directly into the model and accurately determine the feature information of the chromosomes to be detected. The detection precision is high, the classification performance and segmentation performance of karyotype analysis are improved, and engineering application of the model is made convenient.
In one embodiment of the present invention, a method for karyotyping includes:
s110, extracting the characteristics of the chromosome microscopic image based on a characteristic extraction network Backbone to obtain a characteristic diagram;
Specifically, this embodiment is an optimization of the foregoing embodiment; for the parts it shares with the foregoing embodiment, refer to that embodiment, and they are not described again here. The terminal device builds a feature extraction network Backbone, comprising a network formed by a plurality of convolution layers, batch normalization layers and pooling layers, for performing feature extraction on the chromosome microscopic image. Various network models can serve as the feature extraction network Backbone for image feature extraction, such as VGGNet or the ResNet series.
S120, fusing the extracted feature maps based on a feature pyramid network (FPN) to output a plurality of target feature maps;
s200, generating a plurality of Anchor frames Anchor according to the target feature map, and extracting a plurality of interesting regions ROI with the same size from the Anchor frames Anchor through a preset detection algorithm;
s300, constructing a chromosome karyotype analysis model Mask Scoring R-CNN according to the ROI and the characteristic information;
S400, inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN, and outputting a corresponding chromosome karyotype analysis result.
Specifically, the terminal device builds a feature pyramid network FPN, combines it with the feature extraction network Backbone, and fuses high-level and low-level features to build a feature extraction and fusion network; the terminal device inputs the feature map into this network to output a plurality of target feature maps. Most earlier target detection algorithms predict only from top-level features: low-level features carry less semantic information but locate targets more accurately, while high-level features are semantically rich but coarser in position. Fusing the high-level and low-level feature maps therefore takes both kinds of feature information into account and improves detection precision.
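The high-level/low-level fusion described above can be sketched in a few lines of NumPy. The 1 × 1 lateral convolution is modeled as a per-pixel channel mixing and the top-down step as nearest-neighbor 2 × up-sampling; this is a simplified stand-in for the actual FPN layers, with illustrative names:

```python
import numpy as np

def lateral_1x1(feat, w):
    # A 1x1 convolution is a per-pixel linear map over channels.
    # feat: (C, H, W); w: (C_out, C).
    c, h, wd = feat.shape
    return (w @ feat.reshape(c, -1)).reshape(w.shape[0], h, wd)

def upsample2x(feat):
    # Nearest-neighbor 2x up-sampling, as used in the FPN top-down pathway.
    return feat.repeat(2, axis=1).repeat(2, axis=2)

def fuse(low, high, w):
    # Merge step: lateral 1x1 on the low-level map plus the up-sampled high-level map.
    return lateral_1x1(low, w) + upsample2x(high)
```

The fused map keeps the low-level map's resolution (accurate positions) while inheriting the high-level map's semantics, which is the motivation given in the text.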
In one embodiment of the present invention, a method for karyotyping includes:
s110, extracting the characteristics of the chromosome microscopic image based on a characteristic extraction network Backbone to obtain a characteristic diagram;
S121, performing down-sampling processing at each stage of the feature extraction network Backbone, extracting the last layer of each target stage in the feature extraction network Backbone, and constructing a bottom-up first network of interest; the target stages are the network stages that extract deep features;
Specifically, the terminal device performs down-sampling at each stage of the feature extraction network Backbone and constructs a bottom-up network (i.e., the first network of interest of the present invention).
Taking ResNet101 as an example, the ResNet101 network structure is divided into 5 stages, each of which performs down-sampling once; the last layer of each target stage in ResNet, namely layers C2-C5, is extracted to construct the bottom-up network.
As shown in fig. 2, taking the ResNet101 network as an example, the ResNet-based configuration includes 5 stages: conv1, conv2, conv3, conv4 and conv5. The conv1 stage contains a convolution layer with 64 kernels of size 7 × 7 and stride s = 2, followed by a 3 × 3 maximum pooling layer with stride s = 2. The conv2 stage contains three blocks of identical structure, the convolution layers of each block being 1 × 1 × 64, 3 × 3 × 64 and 1 × 1 × 256 in sequence. Similarly, the network structures of conv3 to conv5 resemble that of conv2. A batch normalization (BN) layer and an activation layer are connected between the convolution layers in each block; for simplicity these are not shown in fig. 2. The outputs of the last block of each of conv2 to conv5 are denoted C2 to C5. The above structure is only one specific embodiment; changing merely the number of neural network layers or convolution kernels, or using a different Backbone, remains within the protection scope of the present invention.
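Under the standard ResNet padding conventions (an assumption, since the text does not state padding: pad 3 for the 7 × 7 conv1, pad 1 for the 3 × 3 pool), the spatial sizes of C2-C5 for a 224 × 224 input can be checked with the usual output-size formula:

```python
def conv_out(size, kernel, stride, pad):
    # Standard convolution/pooling output-size formula.
    return (size + 2 * pad - kernel) // stride + 1

def resnet_stage_sizes(size):
    # Spatial size after each stage, assuming standard ResNet padding:
    # conv1 (7x7, s=2) halves the input, the 3x3 max pool halves it again (C2),
    # and each of conv3..conv5 down-samples once more (C3..C5).
    s = conv_out(size, 7, 2, 3)
    c2 = conv_out(s, 3, 2, 1)
    c3, c4, c5 = c2 // 2, c2 // 4, c2 // 8
    return [c2, c3, c4, c5]
```

For a 224 × 224 image this gives the familiar 56/28/14/7 feature-map sizes for C2-C5, i.e. one halving per stage as stated above.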
S122, after convolution processing is carried out on the feature map of each layer in the first concerned network, down-sampling processing is carried out to obtain a second concerned network from top to bottom;
S123, performing up-sampling processing on the feature map obtained at the last layer of the second network of interest to output a target feature map, and fusing the feature map of each target layer in the second network of interest with the feature map of the layer above that target layer in the first network of interest to output a plurality of target feature maps; the target layers are the layers of the first network of interest other than the last layer.
Specifically, the terminal device performs down-sampling at each stage of the feature extraction network Backbone in the above manner and, after the bottom-up first network of interest has been constructed from the last layer of each target stage, processes the feature maps of the first network of interest from top to bottom according to the feature pyramid network FPN to construct a top-down network (i.e., the second network of interest of the present invention). The terminal device combines the FPN with the feature extraction network Backbone; after the first and second networks of interest are constructed, the feature map of the last layer in the second network of interest is up-sampled to output a corresponding target feature map, and the feature map of each target layer in the second network of interest is fused with the feature map of the corresponding layer in the first network of interest to output a plurality of target feature maps. The target feature map output by the last layer of the second network of interest, together with the fused outputs of the other target layers, forms a feature map pyramid.
Illustratively, as shown in fig. 3, C5 is passed through a 1 × 1 convolution to obtain M5, and M5 is down-sampled by a factor of 2 to obtain M6. M5 is then processed by a 3 × 3 convolution to yield P5. C4 is likewise passed through a 1 × 1 convolution and pixel-wise added to the 2 × up-sampled M5 to obtain M4, which is convolved by 3 × 3 to obtain the final feature fusion layer P4; the P2 and P3 layers are obtained in the same way. P2-P6 form a feature map pyramid that participates in the subsequent operations.
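Assuming the conventional FPN layout (P2-P5 at the C2-C5 resolutions, a fixed channel width such as the 256 channels customary in FPN, and P6 as a stride-2 subsample of the top level; all of these are assumptions where the text is ambiguous), the pyramid shapes can be sketched as:

```python
def build_pyramid_shapes(c_shapes, out_channels=256):
    # Spatial shapes of P2..P6 given the (H, W) shapes of C2..C5.
    # P2..P5 match C2..C5; P6 is a stride-2 subsample of the top level
    # (assumption: the text's M6 is read as a 2x-down-sampled copy of M5/P5).
    p = [(out_channels, h, w) for (h, w) in c_shapes]       # P2..P5
    h5, w5 = c_shapes[-1]
    p.append((out_channels, (h5 + 1) // 2, (w5 + 1) // 2))  # P6
    return p
```

All pyramid levels share one channel width, so the same RPN head (next section) can be applied to each of P2-P6.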
S200, generating a plurality of Anchor frames Anchor according to the target feature map, and extracting a plurality of interesting regions ROI with the same size from the Anchor frames Anchor through a preset detection algorithm;
s300, constructing a chromosome karyotype analysis model Mask Scoring R-CNN according to the ROI and the characteristic information;
S400, inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN, and outputting a corresponding chromosome karyotype analysis result.
Specifically, this embodiment is an optimization of the foregoing embodiment; for the parts it shares with the foregoing embodiment, refer to that embodiment, and they are not described again here. The terminal device builds a feature extraction network Backbone, comprising a plurality of convolution layers, batch normalization layers and pooling layers, to perform feature extraction on the chromosome microscopic image, obtains a plurality of feature maps at different down-sampling rates, and constructs a bottom-up network. Based on the feature pyramid network FPN, the terminal device then combines the FPN with the Backbone to construct a top-down network, fuses it with the bottom-up network, and outputs feature-fused target feature maps. By fusing high-level and low-level feature maps, the invention takes both kinds of feature information into account, improves the recognition and classification accuracy of the Mask Scoring R-CNN karyotype analysis model, and further improves the karyotype analysis and detection precision for the chromosomes in subsequent microscopic images to be identified.
In one embodiment of the present invention, a method for karyotyping includes:
s100, performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
S210, generating a plurality of Anchor frames Anchor according to the target feature map, and screening candidate suggestion frames Proposal from the Anchor frames Anchor;
S220, performing fixed-size processing on the candidate suggestion frames Proposal to obtain a plurality of regions of interest ROI with the same size;
s300, constructing a chromosome karyotype analysis model Mask Scoring R-CNN according to the ROI and the characteristic information;
S400, inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN, and outputting a corresponding chromosome karyotype analysis result.
Specifically, the terminal device generates a plurality of anchor frames Anchor on the target feature map through the constructed region proposal network (RPN) according to the anchor frame mechanism, and screens out candidate suggestion frames Proposal. The RPN is constructed by training it on each target feature map and its corresponding anchor frames Anchor to obtain the trained RPN. Because the candidate suggestion frames Proposal obtained above differ in size, while the subsequent construction of the karyotype analysis model requires regions of interest ROI of fixed size, the differently sized candidate suggestion frames Proposal are fixed to a consistent size through the ROI Align algorithm, yielding a plurality of regions of interest ROI of the same size.
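ROI Align proper samples bilinearly interpolated points inside sub-bins of each box. As a simplified illustration of the one property the text relies on, that proposals of different sizes all map to a fixed-size output, the sketch below samples a nearest-neighbor grid inside each box (an assumption-laden stand-in, not the patent's implementation):

```python
import numpy as np

def fixed_size_roi(feature, box, out=7):
    # Simplified stand-in for ROI Align: sample an out x out grid of
    # nearest-neighbor points inside box = (y0, x0, y1, x1) on a 2-D feature map.
    # (Real ROI Align uses bilinear interpolation; nearest sampling keeps this short.)
    y0, x0, y1, x1 = box
    ys = np.linspace(y0, y1, out)
    xs = np.linspace(x0, x1, out)
    yi = np.clip(np.round(ys).astype(int), 0, feature.shape[0] - 1)
    xi = np.clip(np.round(xs).astype(int), 0, feature.shape[1] - 1)
    return feature[np.ix_(yi, xi)]
```

Whatever the box size, the output grid is out × out, which is why the downstream classifier and mask heads can use fixed-size fully connected and convolution layers.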
In one embodiment of the present invention, a method for karyotyping includes:
s100, performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
S211, generating a plurality of Anchor frames Anchor on each target feature map according to preset parameters;
s212, scoring and position correction are carried out on each Anchor frame Anchor, and the Anchor frames Anchor exceeding the image boundary are removed;
s213, screening the Anchor frame Anchor with the confidence coefficient reaching a preset threshold value from the rest Anchor frames Anchor through a non-maximum suppression algorithm to obtain the candidate suggestion frame Proposal;
Specifically, based on each fused feature map and the corresponding anchor frames Anchor, the ratio IOU of the intersection to the union of an anchor frame Anchor and the real annotation frame is determined, i.e., the IOU is calculated between the anchor frame Anchor and the ground-truth box. Each anchor frame Anchor is then classified as a positive-sample or negative-sample anchor frame according to the size of the IOU: all anchor frames Anchor are matched against the labels in the feature information, and whether an anchor frame Anchor contains a sample is judged from the IOU. When the IOU is greater than or equal to 0.7, the anchor frame Anchor contains the sample and is classified as a positive-sample anchor frame Anchor; otherwise it does not contain the sample and is classified as a negative-sample anchor frame Anchor.
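The IOU computation and the 0.7 positive-sample rule described above can be sketched as follows (box layout (x0, y0, x1, y1) and helper names are illustrative):

```python
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x0, y0, x1, y1).
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def label_anchor(anchor, gt, pos_thresh=0.7):
    # Positive sample if the IOU with the ground-truth box reaches the
    # threshold (0.7 in the text), otherwise negative.
    return "positive" if iou(anchor, gt) >= pos_thresh else "negative"
```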
The specific process of screening candidate proposal boxes through the RPN is as follows. First, traverse the feature maps P2-P6 and, for each pixel of a feature map, generate m × n anchor boxes according to the m preset areas and n preset aspect ratios. Second, feed P2-P6 into the same RPN: each feature map passes through a 3 × 3 convolutional layer and then enters two 1 × 1 convolution branches, which output a score and a position offset for every anchor box; the score gives the probabilities that the current anchor box belongs to the background and to the foreground, respectively. The anchor positions are corrected for the first time using the position offsets, overlapping anchor boxes are deleted by the non-maximum suppression (NMS) algorithm, and the final candidate proposal boxes are obtained by screening.
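The anchor-generation step above — m areas × n aspect ratios centered on every feature-map cell — can be sketched as follows (the function name and the cell-centering convention are our assumptions, not the patent's code):

```python
import math

def make_anchors(feat_h, feat_w, stride, areas, ratios):
    """Generate len(areas) x len(ratios) anchors centered on every cell of a
    feat_h x feat_w feature map, in image coordinates (x1, y1, x2, y2).
    `stride` is the downsampling factor from image to feature map."""
    anchors = []
    for i in range(feat_h):
        for j in range(feat_w):
            cx, cy = (j + 0.5) * stride, (i + 0.5) * stride
            for area in areas:
                for r in ratios:            # r = width / height
                    w = math.sqrt(area * r)
                    h = math.sqrt(area / r)
                    anchors.append((cx - w / 2, cy - h / 2,
                                    cx + w / 2, cy + h / 2))
    return anchors
```

With m areas and n ratios this yields exactly m × n anchors per pixel, matching the count stated in the text.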
The non-maximum suppression algorithm mainly comprises the following steps:
Step 1, sort the anchor boxes from high to low by score;
Step 2, keep the anchor box with the highest score and take it as the current candidate proposal box;
Step 3, traverse the remaining anchor boxes, and delete any anchor box whose intersection over union (IoU) with the current box is greater than a set threshold;
Step 4, from the remaining unprocessed anchor boxes, again select the one with the highest score, and repeat Step 3 until all anchor boxes have been processed.
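The four NMS steps above can be sketched as one short greedy routine (a minimal plain-Python illustration with our own names; real pipelines run this per class and on GPU):

```python
def nms(boxes, scores, iou_thresh=0.7):
    """Greedy non-maximum suppression over (x1, y1, x2, y2) boxes.
    Returns the indices of the kept boxes, highest score first."""
    def iou(a, b):
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        ua = ((a[2] - a[0]) * (a[3] - a[1])
              + (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / ua if ua > 0 else 0.0

    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)  # Step 1
    keep = []
    while order:
        best = order.pop(0)        # Step 2: keep the highest-scoring box
        keep.append(best)
        # Step 3: delete boxes overlapping the current box beyond the threshold;
        # Step 4: the loop then repeats on whatever remains.
        order = [i for i in order if iou(boxes[best], boxes[i]) <= iou_thresh]
    return keep
```
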
S220, performing size-fixing processing on the candidate proposal boxes (Proposal) to obtain a plurality of equally sized regions of interest (ROI);
Specifically, the step of performing size-fixing processing on the candidate proposal boxes (Proposal) to obtain equally sized regions of interest (ROI) includes calculating the feature map level corresponding to each candidate proposal box according to the following formula.
Here f0 is a reference value representing the output of the P4 layer, w is the width of the candidate proposal box, and h is its height. The candidate proposal box can then be mapped onto the corresponding region of the feature map according to the downsampling factor from the original image to that feature map, and the region is fixed to a uniform size by the ROI Align algorithm. The ROI Align algorithm proceeds as follows: traverse each Proposal and divide a Proposal of size h × w into k × k cells; even if h or w is not evenly divisible by k, no rounding operation is performed. Each cell is then sampled at s × s points according to the sampling coefficient s: the cell is divided into s × s parts, the center point of each part is taken as a sampling point, and the value at each sampling point is obtained by bilinear interpolation. The maximum of the s × s sampled values is taken as the pooling result of the cell, finally yielding a region of interest of size k × k.
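The formula referenced above is not reproduced in this text; a common choice consistent with f0 denoting the P4 reference output is the standard FPN level-assignment rule k = ⌊k0 + log2(√(w·h)/224)⌋ with k0 = 4, so that a 224 × 224 proposal maps to P4. The sketch below assumes that rule (it is our assumption, not the patent's stated formula), clamped to the P2-P5 range mentioned later in the text:

```python
import math

def fpn_level(w, h, k0=4, canonical=224, k_min=2, k_max=5):
    """Map a proposal of size w x h to an FPN pyramid level P2..P5.
    Assumed rule: k = floor(k0 + log2(sqrt(w*h) / canonical)), k0 = 4
    making a 224x224 proposal land on the P4 reference level."""
    k = math.floor(k0 + math.log2(math.sqrt(w * h) / canonical))
    return max(k_min, min(k_max, k))
```

Larger proposals thus read from coarser levels (P5) and small ones from the finest retained level (P2), which matches the multi-scale motivation given elsewhere in the description.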
After the positive-sample and negative-sample anchor boxes are obtained in the above manner, the RPN is trained on them. The information predicted by the RPN is decoded together with the corresponding anchor boxes to generate a group of regions of interest. At this point the number of candidate proposal boxes is huge, so in order to improve the quality of the prediction boxes, the positive-sample and negative-sample anchor boxes are randomly sampled.
S300, constructing a chromosome karyotype analysis model Mask Scoring R-CNN according to the ROI and the characteristic information;
S400, inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN, and outputting a corresponding chromosome karyotype analysis result.
Specifically, this embodiment is an optimized version of the foregoing embodiment; for the parts identical to the foregoing embodiment, refer to that embodiment, and they are not described again here. The terminal device pre-constructs an RPN through the above process; the function of the RPN is to generate anchor boxes and screen out candidate proposal boxes. Specifically, a series of anchor boxes is generated on each feature map according to the parameter settings, and each anchor box is scored and given an initial position correction. The terminal device then removes the anchor boxes that exceed the image boundary, and the remaining anchor boxes are screened by the non-maximum suppression (NMS) algorithm to obtain a certain number of candidate proposal boxes. The candidate proposal boxes of different sizes are mapped into the corresponding feature maps, and the mapped regions are fixed by the ROI Align algorithm into regions of interest of the same size, so as to meet the input requirement of the subsequent network.
In one embodiment of the present invention, a method for karyotyping includes:
S100, performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
S200, generating a plurality of anchor boxes according to the target feature map, and extracting a plurality of equally sized regions of interest (ROI) from the anchor boxes through a preset detection algorithm;
S310, performing classification and position regression on the ROIs based on the classification-regression network Classifier, and building the corresponding fast classification network Faster R-CNN;
Specifically, the classification-regression network Classifier feeds each obtained ROI into two fully connected layer branches, which perform specific classification and regression of the box position, thereby obtaining the network's predicted class and confidence for the target represented by the current ROI, together with the accurate position of the target in the image. This completes the construction of the Faster R-CNN network.
Performing specific classification and position regression on the ROIs using the classification-regression network Classifier, so as to complete the construction of the Faster R-CNN network, specifically comprises: the fixed-size ROIs are fed into two fully connected layer branches for classification and position refinement. For each ROI, the classification branch outputs a vector of length (number of categories + 1), and the regression branch outputs a 4 × 1 vector; the resulting vectors are scored by softmax and decoded into coordinates, respectively, yielding the category and accurate position information corresponding to the ROI. The candidate proposal boxes exceeding the boundary are removed from those corresponding to all ROIs, and the final target detection result is obtained after processing by the non-maximum suppression (NMS) algorithm. This completes the construction of the Faster R-CNN network; its specific structure is shown in FIGS. 4 and 5, and the specific Classifier module is shown in the Classifier branch of FIG. 6.
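The softmax scoring of the classification branch described above can be illustrated with a minimal sketch (a hypothetical helper, not the patent's implementation); it takes the (number of categories + 1) logits, with index 0 standing for background, and returns the predicted class and its confidence:

```python
import math

def classify_roi(class_logits):
    """Numerically stable softmax over (num_classes + 1) logits
    (index 0 = background); returns (predicted_class, confidence)."""
    m = max(class_logits)
    exps = [math.exp(z - m) for z in class_logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return best, probs[best]
```

The confidence returned here is exactly the classification confidence that the later Mask IoU branch multiplies into the final mask score.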
S320, adding a mask prediction branch network Mask Head on the basis of the fast classification network Faster R-CNN, and building the pixel segmentation network Mask R-CNN;
Specifically, the region of interest corresponding to the final candidate proposal box is input into the Mask branch and passes through a number of convolutional and deconvolutional layers, which output an m × m × C mask for each ROI, where C is the number of categories. The i-th channel of the mask is taken out according to the predicted category Ci of the ROI, the predicted mask is resized to the size of the Proposal corresponding to the ROI, and the pixel-level segmentation result of the ROI produced by the Mask branch is obtained after binarizing the mask according to a certain threshold. The specific Mask branch structure is shown as the Mask Head branch in FIG. 6.
The mask prediction branch network Mask Head is a branch network added onto the built Faster R-CNN network. Its structure borrows the network design of the FCN, and this branch predicts a corresponding binary mask for each ROI. Specifically, in the inference stage, the classified and position-refined ROIs obtained in step 6 are mapped onto P2 to P5 of the FPN, fixed again to a × a feature maps by the ROI Align algorithm, and passed through a series of convolution and deconvolution operations to output a mask of size (2 × a) × (2 × a) × C, where C is the number of categories; that is, C mask predictions are output for each ROI, which reduces the class competition that arises when only a single channel is output. If the classification-regression network Classifier classifies the current ROI as category Ci, the channel corresponding to Ci is taken out of the mask, and this single-channel mask is enlarged to the size of the Proposal corresponding to the ROI, yielding the final mask segmentation result. This completes the construction of the Mask R-CNN network.
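The channel-selection and binarization step at the end of the mask branch can be sketched as follows (a hypothetical helper operating on plain nested lists; a real implementation would also resize the soft mask to the Proposal size before thresholding):

```python
def binarize_mask(mask_channels, predicted_class, threshold=0.5):
    """From a per-class stack of soft masks (C x m x m, nested lists),
    take the channel of the predicted class and threshold it into a
    binary 0/1 mask."""
    channel = mask_channels[predicted_class]
    return [[1 if p >= threshold else 0 for p in row] for row in channel]
```
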
S330, adding a segmentation-quality evaluation branch network Mask IoU on the basis of the pixel segmentation network Mask R-CNN, and building the chromosome karyotype analysis model Mask Scoring R-CNN;
Specifically, a Mask IoU branch network is constructed and added onto the Mask R-CNN network structure. It addresses the problem that, in Mask R-CNN, the predicted quality of a mask is unrelated to the Classifier's confidence in the target class, and thereby improves the mask prediction quality. This completes the construction of the Mask Scoring R-CNN network; the specific Mask IoU branch structure is shown as the MaskIoU Head branch in FIG. 6.
The output of the Mask branch is concatenated with its input and fed into the Mask IoU branch. Through the convolutional and fully connected layers in this branch, the IoU between the predicted mask and the real mask is predicted; this predicted Mask IoU is multiplied by the confidence of the target class to obtain the mask score, which is the value finally used to evaluate the quality of the mask prediction. Following the above steps, the structure of the Mask Scoring R-CNN network used in the present invention is shown in FIG. 6.
The Mask IoU branch network is a branch added onto the constructed Mask R-CNN network, and the whole constitutes the Mask Scoring R-CNN network. Unlike Mask R-CNN, which directly uses the classification confidence obtained by the classification-regression branch to evaluate the segmentation quality of the predicted mask, Mask Scoring R-CNN uses the Mask IoU branch to learn an IoU score between each predicted mask and the real mask: the larger this score, the closer the current predicted mask is to the real mask. This IoU score is multiplied by the Classifier's classification confidence to evaluate the segmentation quality of the predicted mask, so that the model better reflects the true quality of the mask prediction.
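The final scoring rule — predicted mask IoU multiplied by classification confidence — can be sketched as follows, together with a reference computation of the IoU between two binary masks (helper names are ours; in the network the mask IoU is predicted by the branch, not computed against a ground truth at inference time):

```python
def binary_mask_iou(mask_a, mask_b):
    """IoU between two equal-sized binary masks given as nested 0/1 lists."""
    inter = union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            inter += a & b
            union += a | b
    return inter / union if union else 0.0

def mask_score(predicted_mask_iou, class_confidence):
    """Mask Scoring R-CNN's final quality score: the predicted mask IoU
    multiplied by the Classifier's confidence in the target class."""
    return predicted_mask_iou * class_confidence
```

During training, `binary_mask_iou` against the ground-truth mask would supply the regression target for the Mask IoU branch; at inference, the branch's predicted IoU goes straight into `mask_score`.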
S400, inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN, and outputting a corresponding chromosome karyotype analysis result.
Specifically, this embodiment is an optimized version of the foregoing embodiment; for the parts identical to the foregoing embodiment, refer to that embodiment, and they are not described again here. The invention provides an algorithm based on Mask Scoring R-CNN to realize automatic segmentation and classification of chromosomes; it mainly comprises a Faster R-CNN branch, a Mask branch, and a Mask IoU branch. The Faster R-CNN and the Mask branch form the Mask R-CNN network, which is mainly responsible for target detection and segmentation on the chromosome image; the Mask R-CNN and the Mask IoU branch form the Mask Scoring R-CNN network, in which the Mask IoU branch re-scores each predicted mask, thereby correcting to a certain extent the problem that the quality of the mask prediction is unrelated to the Classifier module's confidence in the target class.
The algorithm model provided by the invention was trained and verified on a data set of 25552 metaphase chromosome images. The data set was divided approximately in a 6:2:2 ratio, with 15552 metaphase images as the training set for the neural network, 5000 images as the test set, and 5000 images as the validation set. The chromosomes in the data set were segmented and labeled in advance into 24 classes (22 autosome classes + X chromosome + Y chromosome) under the guidance of a professional physician. Based on the Mask Scoring R-CNN neural network model, the chromosome detection result is shown in FIG. 7; the detection accuracy reaches 94.5% and the recall rate reaches 99.2%.
The invention is used for classifying, pairing, and pixel-level segmentation of metaphase chromosomes in blood or bone marrow preparations. The method comprises: constructing a convolutional neural network based on Mask Scoring R-CNN, which mainly comprises four parts. The first part is the feature-extraction Backbone + FPN; the second part is Faster R-CNN; the third part is the mask prediction branch; the fourth part is the Mask IoU branch. In the first part, the FPN combines with the Backbone to construct a feature-map pyramid, fusing high-level feature maps containing strong semantic features with low-level feature maps containing rich details and position information, thereby improving the detection accuracy for targets at different scales. In the second part, Faster R-CNN is a classic two-stage target detection algorithm, mainly comprising convolutional layers (conv layers), a Region Proposal Network (RPN), and a classification-regression network (Classifier). The whole Faster R-CNN network integrates target feature extraction, candidate-box selection, box regression, and target classification into a single network structure, effectively improving detection precision and efficiency and truly realizing end-to-end target detection. The third part, the mask prediction branch, is added on the basis of Faster R-CNN to form the Mask R-CNN network for pixel-level segmentation, changing the original classification and regression tasks of Faster R-CNN into three tasks: classification, regression, and segmentation.
The fourth part, the Mask IoU branch, is newly added onto the Mask R-CNN network structure; it addresses the problems that the quality of the mask prediction is unrelated to that of the classification regression, and that the classification confidence conveys nothing about the quality and completeness of the predicted mask.
A lightweight feature pyramid is designed by combining the Backbone with the FPN; Faster R-CNN is built from convolutional layers (conv layers), the Region Proposal Network (RPN), and the classification-regression network (Classifier); and the mask prediction branch and the Mask IoU branch are added in turn on top of Faster R-CNN. The method skillfully utilizes the feature pyramid structure to eliminate category ambiguity in the assignment of positive samples and improves the model's sensitivity to targets at each scale, thereby improving detection precision. Applied to chromosome karyotype analysis, the invention improves chromosome segmentation precision and classification accuracy and realizes automatic segmentation, classification, and pairing of chromosomes.
In one embodiment of the present invention, a karyotype analysis system includes:
the fusion module is used for performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
the extraction module is used for generating a plurality of anchor boxes according to the target feature map and extracting a plurality of equally sized regions of interest (ROI) from the anchor boxes through a preset detection algorithm;
the construction module is used for constructing a chromosome karyotype analysis model Mask Scoring R-CNN according to the ROI and the characteristic information;
and the processing module is used for inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model Mask Scoring R-CNN and outputting a corresponding chromosome karyotype analysis result.
Specifically, this embodiment is a system embodiment corresponding to the above method embodiment, and specific effects refer to the above method embodiment, which is not described in detail herein.
Based on the foregoing embodiments, the building module includes:
the classification network creating unit is used for performing classification and position regression on the ROIs based on the classification-regression network Classifier, and building the corresponding fast classification network Faster R-CNN;
the segmentation network creating unit is used for adding a mask prediction branch network Mask Head on the basis of the fast classification network Faster R-CNN and building the pixel segmentation network Mask R-CNN;
and the karyotype network creating unit is used for adding a segmentation-quality evaluation branch network Mask IoU on the basis of the pixel segmentation network Mask R-CNN, and building the chromosome karyotype analysis model Mask Scoring R-CNN.
Specifically, this embodiment is a system embodiment corresponding to the above method embodiment, and specific effects refer to the above method embodiment, which is not described in detail herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of program modules is illustrated, and in practical applications, the above-described distribution of functions may be performed by different program modules, that is, the internal structure of the apparatus may be divided into different program units or modules to perform all or part of the above-described functions. Each program module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one processing unit, and the integrated unit may be implemented in a form of hardware, or may be implemented in a form of software program unit. In addition, the specific names of the program modules are only used for distinguishing the program modules from one another, and are not used for limiting the protection scope of the application.
In one embodiment of the invention, a terminal device comprises a processor and a memory, wherein the memory is used for storing a computer program; and the processor is used for executing the computer program stored on the memory to realize the karyotype analysis method in the corresponding method embodiment.
In an embodiment of the present invention, a storage medium stores at least one instruction, and the instruction is loaded and executed by a processor to implement the operations performed by the embodiments of the karyotype analysis method. For example, the storage medium may be a read-only memory (ROM), a Random Access Memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
They may be implemented in program code executable by a computing device, so that they can be executed by the computing device; alternatively, they may be executed separately, fabricated as individual integrated circuit modules, or a plurality of their modules or steps may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or recited in detail in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It should be noted that the above embodiments can be freely combined as necessary. The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.
Claims (10)
1. A method of karyotyping, comprising the steps of:
carrying out feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
generating a plurality of anchor frames according to the target feature map, and extracting a plurality of regions of interest with the same size from the anchor frames through a preset detection algorithm;
constructing a chromosome karyotype analysis model according to the region of interest and the characteristic information;
and inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model, and outputting a corresponding chromosome karyotype analysis result.
2. The karyotype analysis method according to claim 1, wherein the performing feature extraction and feature fusion on the chromosome microscopic image to output the target feature map comprises the steps of:
carrying out feature extraction on the chromosome microscopic image based on a feature extraction network to obtain a feature map;
and fusing the feature maps based on a feature pyramid algorithm to output a plurality of target feature maps.
3. The chromosome karyotyping analysis method according to claim 2, wherein said fusing the feature maps based on the feature pyramid algorithm to output a plurality of target feature maps includes the steps of:
performing downsampling processing on each stage of the feature extraction network, extracting the last layer of each target stage in the feature extraction network, and constructing and generating a first concerned network from bottom to top; the target stage is a network stage for extracting deep features;
after convolution processing is carried out on the feature map of each layer in the first concerned network, down-sampling processing is carried out to obtain a second concerned network from top to bottom;
performing up-sampling processing on the feature map obtained from the last layer in the second concerned network to output a target feature map, and fusing the feature map of the target layer in the second concerned network with the feature map of the previous layer in the first concerned network relative to the target layer to output a plurality of target feature maps;
wherein the target layer is the remaining layers in the first interest network that are not the last layer.
4. The karyotyping method according to claim 1, wherein the generating a plurality of anchor frames from the target feature map, and the extracting a plurality of regions of interest of the same size from the anchor frames by a predetermined detection algorithm comprises the steps of:
generating a plurality of anchor frames according to the target feature map, and screening candidate suggestion frames from the anchor frames;
and carrying out size fixing processing on the candidate suggestion frame to obtain a plurality of regions of interest with the same size.
5. The karyotyping method according to claim 4, wherein said generating a plurality of anchor frames from said target feature map, and wherein said screening of candidate proposed frames from said anchor frames comprises the steps of:
generating a plurality of anchor frames on each target feature map according to preset parameters;
scoring and position correcting each anchor frame, and removing the anchor frames exceeding the image boundary;
and screening the anchor frames with the confidence coefficient reaching a preset threshold value from the rest anchor frames through a non-maximum suppression algorithm to obtain the candidate suggestion frame.
6. The karyotyping method according to any one of claims 1 to 5, wherein said constructing a karyotyping model based on said region of interest and said characteristic information includes the steps of:
classifying and position regressing the region of interest based on a classification regression network, and building and generating a corresponding rapid classification network;
adding a mask prediction branch network on the basis of the rapid classification network, and constructing and generating a pixel segmentation network;
and adding a classification evaluation segmentation network on the basis of the pixel segmentation network, and constructing and generating the chromosome karyotype analysis model.
7. A karyotyping system, comprising:
the fusion module is used for performing feature extraction and feature fusion on the chromosome microscopic image to output a target feature map;
the extraction module is used for generating a plurality of anchor frames according to the target feature map and extracting a plurality of regions of interest with the same size from the anchor frames through a preset detection algorithm;
the construction module is used for constructing a chromosome karyotype analysis model according to the region of interest and the characteristic information;
and the processing module is used for inputting the chromosome microscopic image to be identified into the chromosome karyotype analysis model and outputting a corresponding chromosome karyotype analysis result.
8. The karyotyping system according to claim 7, wherein the construction module includes:
the classification network creating unit is used for classifying and position regressing the region of interest based on a classification regression network, and building and generating a corresponding rapid classification network;
the segmentation network creating unit is used for adding a mask prediction branch network on the basis of the rapid classification network and building and generating a pixel segmentation network;
and the karyotype network creating unit is used for adding a classification evaluation segmentation network on the basis of the pixel segmentation network and building and generating the chromosome karyotype analysis model.
9. A terminal device comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor is configured to execute the computer program stored in the memory to perform the operations performed by the karyotype analysis method according to any one of claims 1 to 6.
10. A storage medium having stored therein at least one instruction that is loaded and executed by a processor to perform operations performed by the karyotyping method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110600361.XA CN113223614A (en) | 2021-05-31 | 2021-05-31 | Chromosome karyotype analysis method, system, terminal device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110600361.XA CN113223614A (en) | 2021-05-31 | 2021-05-31 | Chromosome karyotype analysis method, system, terminal device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113223614A true CN113223614A (en) | 2021-08-06 |
Family
ID=77081976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110600361.XA Pending CN113223614A (en) | 2021-05-31 | 2021-05-31 | Chromosome karyotype analysis method, system, terminal device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113223614A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113658199A (en) * | 2021-09-02 | 2021-11-16 | 中国矿业大学 | Chromosome instance segmentation network based on regression correction |
CN113989502A (en) * | 2021-10-25 | 2022-01-28 | 湖南自兴智慧医疗科技有限公司 | Chromosome segmentation identification method and device based on graph convolution neural network and electronic equipment |
CN115063360A (en) * | 2022-06-09 | 2022-09-16 | 成都华西精准医学产业技术研究院有限公司 | Intelligent interpretation method and system based on virtual dyeing |
CN115188413A (en) * | 2022-06-17 | 2022-10-14 | 广州智睿医疗科技有限公司 | Chromosome karyotype analysis module |
CN118115996A (en) * | 2024-04-30 | 2024-05-31 | 四川大学华西第二医院 | Distributed chromosome karyotype data labeling method based on artificial intelligence algorithm assistance |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200167943A1 (en) * | 2018-11-28 | 2020-05-28 | Nvidia Corporation | 3d plane detection and reconstruction using a monocular image |
CN111401293A (en) * | 2020-03-25 | 2020-07-10 | 东华大学 | Gesture recognition method based on Head lightweight Mask scanning R-CNN |
CN111598030A (en) * | 2020-05-21 | 2020-08-28 | 山东大学 | Method and system for detecting and segmenting vehicle in aerial image |
CN112288706A (en) * | 2020-10-27 | 2021-01-29 | 武汉大学 | Automatic chromosome karyotype analysis and abnormality detection method |
- 2021-05-31: CN application CN202110600361.XA filed (patent/CN113223614A/en), legal status: Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200167943A1 (en) * | 2018-11-28 | 2020-05-28 | Nvidia Corporation | 3D plane detection and reconstruction using a monocular image |
CN111401293A (en) * | 2020-03-25 | 2020-07-10 | 东华大学 | Gesture recognition method based on head-lightweight Mask Scoring R-CNN |
CN111598030A (en) * | 2020-05-21 | 2020-08-28 | 山东大学 | Method and system for detecting and segmenting vehicle in aerial image |
CN112288706A (en) * | 2020-10-27 | 2021-01-29 | 武汉大学 | Automatic chromosome karyotype analysis and abnormality detection method |
Non-Patent Citations (1)
Title |
---|
Dong Hongyi (董洪义): "Deep Learning: Object Detection in Practice with PyTorch" (《深度学习之pytorch物体检测实战》), 31 March 2020 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113658199A (en) * | 2021-09-02 | 2021-11-16 | 中国矿业大学 | Chromosome instance segmentation network based on regression correction |
CN113658199B (en) * | 2021-09-02 | 2023-11-03 | 中国矿业大学 | Regression correction-based chromosome instance segmentation network |
CN113989502A (en) * | 2021-10-25 | 2022-01-28 | 湖南自兴智慧医疗科技有限公司 | Chromosome segmentation identification method and device based on graph convolution neural network and electronic equipment |
CN113989502B (en) * | 2021-10-25 | 2024-06-07 | 湖南自兴智慧医疗科技有限公司 | Chromosome segmentation recognition method and device based on graph convolution neural network and electronic equipment |
CN115063360A (en) * | 2022-06-09 | 2022-09-16 | 成都华西精准医学产业技术研究院有限公司 | Intelligent interpretation method and system based on virtual staining |
CN115188413A (en) * | 2022-06-17 | 2022-10-14 | 广州智睿医疗科技有限公司 | Chromosome karyotype analysis module |
WO2023240820A1 (en) * | 2022-06-17 | 2023-12-21 | 广州智睿医疗科技有限公司 | Chromosome karyotype analysis module |
CN118115996A (en) * | 2024-04-30 | 2024-05-31 | 四川大学华西第二医院 | Distributed chromosome karyotype data labeling method based on artificial intelligence algorithm assistance |
CN118115996B (en) * | 2024-04-30 | 2024-07-12 | 四川大学华西第二医院 | Distributed chromosome karyotype data labeling method based on artificial intelligence algorithm assistance |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111027493B (en) | Pedestrian detection method based on deep learning multi-network soft fusion | |
CN113223614A (en) | Chromosome karyotype analysis method, system, terminal device and storage medium | |
CN109583345B (en) | Road recognition method, device, computer device and computer readable storage medium | |
CN111046880A (en) | Infrared target image segmentation method and system, electronic device and storage medium | |
CN110379020B (en) | Laser point cloud coloring method and device based on generation countermeasure network | |
CN111814902A (en) | Target detection model training method, target identification method, device and medium | |
CN111860439A (en) | Unmanned aerial vehicle inspection image defect detection method, system and equipment | |
CN113221787A (en) | Pedestrian multi-target tracking method based on multivariate difference fusion | |
CN111524137A (en) | Cell identification counting method and device based on image identification and computer equipment | |
CN113822314A (en) | Image data processing method, apparatus, device and medium | |
CN114332473B (en) | Object detection method, device, computer apparatus, storage medium, and program product | |
CN112215217B (en) | Digital image recognition method and device for simulating doctor to read film | |
CN110852327A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN113052170A (en) | Small target license plate recognition method under unconstrained scene | |
CN110837809A (en) | Blood automatic analysis method, blood automatic analysis system, blood cell analyzer, and storage medium | |
CN112991280A (en) | Visual detection method and system and electronic equipment | |
CN116310688A (en) | Target detection model based on cascade fusion, and construction method, device and application thereof | |
CN113706562A (en) | Image segmentation method, device and system and cell segmentation method | |
CN115439456A (en) | Method and device for detecting and identifying object in pathological image | |
CN113177956B (en) | Semantic segmentation method for unmanned aerial vehicle remote sensing image | |
CN111797923A (en) | Training method of image classification model, and image classification method and device | |
CN110889418A (en) | Gas contour identification method | |
CN113963004A (en) | Sampling method and device and electronic equipment | |
CN114463574A (en) | Scene classification method and device for remote sensing image | |
CN112396620A (en) | Image semantic segmentation method and system based on multiple thresholds |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210806 |