CN109858428A - Automatic ANA fluorescent film identification method based on machine learning and deep learning - Google Patents

Automatic ANA fluorescent film identification method based on machine learning and deep learning

Info

Publication number
CN109858428A
Authority
CN
China
Prior art keywords
ANA
fluorescent film
karyotype
titer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910078013.3A
Other languages
Chinese (zh)
Other versions
CN109858428B (en)
Inventor
黄琪
魏骁勇
武永康
杨震群
盛爱林
钟奇林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan University
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Priority to CN201910078013.3A priority Critical patent/CN109858428B/en
Publication of CN109858428A publication Critical patent/CN109858428A/en
Application granted granted Critical
Publication of CN109858428B publication Critical patent/CN109858428B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an automatic ANA fluorescent film identification method based on machine learning and deep learning, belonging to the technical field of machine learning and deep learning. It solves the prior-art problems that manual interpretation of ANA fluorescent films places high demands on personnel and is prone to misjudgment. In the invention, a titer model is computed from a data set based on a machine learning model, and a karyotype model is computed from a data set based on a deep learning model. The ANA fluorescent film to be identified is read, i.e., feature extraction is performed on the picture to be identified to obtain multiple extracted feature values; the multiple feature values are input into the titer model to obtain a predicted titer; the picture to be identified is input into the karyotype model to obtain a predicted karyotype; and a recognition result is obtained from the predicted titer and the predicted karyotype. The invention is used to identify the titer and karyotype of ANA fluorescent films.

Description

Automatic ANA fluorescent film identification method based on machine learning and deep learning
Technical field
An automatic ANA fluorescent film identification method based on machine learning and deep learning, used to identify the titer and karyotype of ANA fluorescent films, belonging to the technical field of machine learning and deep learning.
Background technique
Antinuclear antibody (ANA) is the general term for antibodies against all nucleic acids and nucleoproteins, including components of the nucleus, cytoplasm and cytoskeleton and certain components produced during the cell division cycle. ANA can serve as a serological marker. Indirect immunofluorescence (IIF) on biological substrates is currently the recognized gold-standard method for detecting ANA both domestically and abroad, and it is highly sensitive and semi-quantitative.
Although IIF has irreplaceable advantages in ANA detection, its automation and standardization lag behind other immunoassay technologies. The current traditional method of reading ANA fluorescent films is manual interpretation. However, this method places high demands on the examiner, is time-consuming and laborious, and may produce deviations; different examiners may even give different results for the same picture.
Summary of the invention
In view of the above problems, the purpose of the present invention is to provide an automatic ANA fluorescent film identification method based on machine learning and deep learning, which solves the prior-art problems that manual interpretation of ANA fluorescent films places high demands on personnel and is prone to misjudgment.
In order to achieve the above object, the present invention adopts the following technical scheme:
An automatic ANA fluorescent film identification method based on machine learning and deep learning, characterized by the following steps:
S1. Based on a machine learning model, compute a titer model from a data set;
S2. Based on a deep learning model, compute a karyotype model from a data set;
S3. Read the ANA fluorescent film to be identified, i.e., perform feature extraction on the picture to be identified to obtain multiple extracted feature values;
S4. Input the multiple feature values into the titer model to obtain a predicted titer;
S5. Input the picture to be identified into the karyotype model to obtain a predicted karyotype;
S6. Obtain a recognition result from the predicted titer and the predicted karyotype.
Further, the specific steps of step S1 are as follows:
S1.1. Clean the raw data set composed of the collected ANA fluorescent films, and extract the titer annotation of each ANA fluorescent film in the cleaned data set;
S1.2. Extract the feature values of the titer-annotated ANA fluorescent films to form a data set;
S1.3. Train a support vector machine on the data set to obtain the titer model.
Further, the specific steps of step S1.2 are as follows:
S1.2.1. Convert each ANA fluorescent film to a grayscale picture; the grayscale picture is a two-dimensional matrix M of size X*Y. Compute the average of all elements of the two-dimensional matrix M to obtain the first feature value f1;
S1.2.2. Each element of the two-dimensional matrix M represents the gray value of a pixel, with values between 0 and 255. Count all elements of the matrix by gray-level interval, i.e., count the numbers of pixels with gray values in 0-10, 11-20, 21-30 and so on up to 241-250, and take these counts in order as feature values f2-f26; the number of pixels with gray values in 251-255 is taken as feature value f27;
S1.2.3. Take the texture mean, texture variance, texture smoothness, texture third moment, texture uniformity and texture entropy of each grayscale picture as feature values f28-f33, respectively;
S1.2.4. Take the obtained feature values f1-f33 as the data set.
Further, the calculation formulas in step S1.2.3 are as follows:
The texture mean is calculated as:
m = \sum_{k=0}^{L-1} z_k \, p(z_k)
where L is the total number of gray levels, z_k denotes the k-th gray level, and p(z_k) is the probability of gray level z_k in the normalized gray-level histogram;
The texture variance is calculated as:
\sigma^2 = \sum_{k=0}^{L-1} (z_k - m)^2 \, p(z_k)
The texture smoothness is calculated as:
R = 1 - \frac{1}{1 + \sigma^2}
The texture third moment is calculated as:
\mu_3 = \sum_{k=0}^{L-1} (z_k - m)^3 \, p(z_k)
The texture uniformity is calculated as:
U = \sum_{k=0}^{L-1} p(z_k)^2
The texture entropy is calculated as:
e = -\sum_{k=0}^{L-1} p(z_k) \log_2 p(z_k)
Further, the specific steps of step S1.3 are as follows:
S1.3.1. Extract the feature values f1-f33 in the data set as a two-dimensional matrix M2 of shape X*33, where X is the number of pictures in the data set and 33 is the number of feature values;
S1.3.2. Extract the titer annotations of all pictures as a matrix M3 of shape X*1;
S1.3.3. Train a support vector machine on the data of the two-dimensional matrix M2 and the matrix M3 to obtain the titer model.
Further, the specific steps of step S2 are as follows:
S2.1. Clean the raw data set composed of the collected ANA fluorescent films, i.e., delete the ANA fluorescent films in the raw data set that have no karyotype information;
S2.2. From the raw data set processed in step S2.1, extract the RGB three-dimensional matrix of each ANA fluorescent film, and scale the shape of the extracted RGB three-dimensional matrix up or down according to the input size requirement of the neural network;
S2.3. Use the three-dimensional matrices obtained after scaling as the training data set to train a deep neural network based on graph convolution, obtaining the karyotype model.
Further, the specific steps of deleting the ANA fluorescent films without karyotype information in step S2.1 are as follows:
First select 17 karyotypes, then judge whether the karyotype of each ANA fluorescent film in the cleaned raw data set contains one or more of the 17 karyotypes; if so, keep the film, otherwise delete it.
Further, the last layer of the graph-convolution-based deep neural network in step S2.3 uses the sigmoid function to output the probability that an ANA fluorescent film belongs to a given karyotype, with the formula:
S(x) = \frac{1}{1 + e^{-x}}
where the last layer of the deep neural network has 17 neurons, each representing one of the 17 karyotypes, x is the input of a neuron, and S is the output probability of the corresponding karyotype.
Further, an LSTM network is additionally used to assist the graph-convolution-based deep neural network in karyotype judgment.
Compared with the prior art, the present invention has the following advantages:
1. Based on machine learning and deep learning, the invention identifies ANA fluorescent films automatically, requires no special technical skill from personnel, gives a unique result for the same picture (ANA fluorescent film), and predicts quickly (generally one picture in 4-5 s) and stably;
2. Both the titer and karyotype predictions of the invention achieve high precision, and the karyotype prediction can recognize some rare types that are difficult to identify manually. In a test on 512 pictures, the probability of hitting at least one karyotype was 96.7%, and the precision of the titer test was 98.4%. Moreover, with only minor adjustment, the obtained karyotype model can be applied to the recognition of other similar cells; for example, it classified human protein pictures with mixed patterns in the Human Protein Atlas (HPA) database and achieved good results;
3. The karyotype prediction network structure (i.e., the karyotype model) obtained by the invention only needs a short retraining on a data set collected with other equipment to obtain a new model, i.e., it can be applied to karyotype prediction of ANA fluorescent films obtained with different equipment, so it adapts to a wide range of equipment.
Detailed description of the invention
Fig. 1 is a schematic diagram of the titer annotation of ANA fluorescent films in the present invention;
Fig. 2 is a schematic diagram of obtaining feature value f1 in the present invention;
Fig. 3 is a schematic diagram of obtaining feature values f4 to f27 in the present invention;
Fig. 4 is a schematic diagram of obtaining feature values f8 to f23 in the present invention;
Fig. 5 is a schematic diagram of the data set obtained in the present invention;
Fig. 6 is a schematic diagram after karyotype discrimination and annotation in the present invention;
Fig. 7 is a schematic diagram of the obtained RGB-channel three-dimensional matrix in the present invention;
Fig. 8 is a flow diagram of the present invention.
Specific embodiment
The invention will be further described below in conjunction with the drawings and specific embodiments.
The present invention is based on a stable and efficient machine learning model and a deep learning model: machine learning techniques are used to predict the titer of an ANA fluorescent film (the titer is divided into negative, 1:100, 1:320, 1:1000, 1:3200, 1:10000, etc.), and deep learning techniques are used to predict its karyotype (ANA has nearly 30 karyotypes, such as the homogeneous, speckled, nucleolar, nuclear membrane, centromere, spindle-pole and cytoplasmic patterns, and a single positive ANA fluorescent film may have one or more karyotypes). The method predicts quickly and stably and gives identical prediction results for identical pictures. The details are as follows:
An automatic ANA fluorescent film identification method based on machine learning and deep learning comprises the following steps:
S1. Based on a machine learning model, compute a titer model from a data set. The specific steps are as follows:
S1.1. Clean the raw data set composed of the collected ANA fluorescent films (in data mining, cleaning refers to the large amount of processing needed to turn raw data into data that can be fed into a model), and extract the titer annotation of each ANA fluorescent film in the cleaned data set. For example, the six manually annotated titer grades are mapped to the six numbers 0-5. The raw data set is a folder of ANA fluorescent films whose file names contain the manual interpretation result, in the standard format '+1:1000 speckled nucleolar_ANA_HEp2.jpg', where 1:1000 is the annotated titer of the picture and 'speckled' and 'nucleolar' are the karyotypes of the ANA fluorescent film. The training data set is constructed in this way, as shown in Fig. 1. The correspondence between the titer annotations and the numbers (the numbers are the titer labels, i.e., the six grades) is: 'negative' corresponds to 0, '1:100' to 1, '1:320' to 2, '1:1000' to 3, '1:3200' to 4, and '1:10000' to 5.
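To make the titer labeling concrete, the following is a minimal Python sketch of the mapping described above; the file-name parsing helper and its regular expression are illustrative assumptions, since the patent only specifies the six grades, their numbers, and the file-name format:

```python
import re

# Titer grades and their numeric labels, as listed in step S1.1.
TITER_LABELS = {
    "negative": 0, "1:100": 1, "1:320": 2,
    "1:1000": 3, "1:3200": 4, "1:10000": 5,
}

def titer_label_from_filename(name: str) -> int:
    """Extract the numeric titer label from a file name such as
    '+1:1000 speckled nucleolar_ANA_HEp2.jpg' (hypothetical parsing)."""
    if "negative" in name.lower():
        return TITER_LABELS["negative"]
    match = re.search(r"1:\d+", name)
    if match is None:
        raise ValueError(f"no titer annotation found in {name!r}")
    return TITER_LABELS[match.group(0)]
```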
S1.2. Extract the feature values of the titer-annotated ANA fluorescent films to form a data set. The specific steps are as follows:
S1.2.1. Convert each ANA fluorescent film to a grayscale picture; the grayscale picture is a two-dimensional matrix M of size X*Y. Compute the average of all elements of the two-dimensional matrix M to obtain the first feature value f1, as shown in Fig. 2;
S1.2.2. Each element of the two-dimensional matrix M represents the gray value of a pixel, with values between 0 and 255. Count all elements of the matrix by gray-level interval, i.e., count the numbers of pixels with gray values in 0-10, 11-20, 21-30 and so on up to 241-250, and take these counts in order as feature values f2-f26; the number of pixels with gray values in 251-255 is taken as feature value f27, as shown in Fig. 3;
S1.2.3. Take the texture mean, texture variance, texture smoothness, texture third moment, texture uniformity and texture entropy of each grayscale picture as feature values f28-f33, respectively, i.e., the texture mean is feature value f28, the texture variance is f29, the texture smoothness is f30, the texture third moment is f31, the texture uniformity is f32 and the texture entropy is f33. The calculation formulas are as follows:
The texture mean is calculated as:
m = \sum_{k=0}^{L-1} z_k \, p(z_k)
where L is the total number of gray levels, z_k denotes the k-th gray level, and p(z_k) is the probability of gray level z_k in the normalized gray-level histogram;
The texture variance is calculated as:
\sigma^2 = \sum_{k=0}^{L-1} (z_k - m)^2 \, p(z_k)
The texture smoothness is calculated as:
R = 1 - \frac{1}{1 + \sigma^2}
The texture third moment is calculated as:
\mu_3 = \sum_{k=0}^{L-1} (z_k - m)^3 \, p(z_k)
The texture uniformity is calculated as:
U = \sum_{k=0}^{L-1} p(z_k)^2
The texture entropy is calculated as:
e = -\sum_{k=0}^{L-1} p(z_k) \log_2 p(z_k)
S1.2.4. Take the obtained feature values f1-f33 as the data set, as shown in Fig. 5; the feature values obtained for one ANA fluorescent film are shown in Fig. 4.
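The following minimal Python sketch illustrates steps S1.2.1-S1.2.4; the library choices (Pillow, NumPy) and the texture formulas reconstructed above are the editor's assumptions rather than text of the patent:

```python
import numpy as np
from PIL import Image

def extract_features(path: str) -> np.ndarray:
    """Compute the 33 feature values f1-f33 described in step S1.2."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)  # X*Y matrix M

    f1 = gray.mean()                                      # f1: mean of all elements

    # f2-f26: pixel counts for gray ranges 0-10, 11-20, ..., 241-250;
    # f27: pixel count for gray range 251-255.
    edges = [0] + list(range(11, 252, 10)) + [256]
    hist_counts, _ = np.histogram(gray, bins=edges)       # 26 bins -> f2-f27

    # Normalized gray-level histogram p(z_k) used by the texture descriptors.
    p = np.bincount(gray.astype(np.uint8).ravel(), minlength=256) / gray.size
    z = np.arange(256, dtype=np.float64)
    m = (z * p).sum()                                     # f28: texture mean
    variance = (((z - m) ** 2) * p).sum()                 # f29: texture variance
    smoothness = 1.0 - 1.0 / (1.0 + variance)             # f30: texture smoothness
    third_moment = (((z - m) ** 3) * p).sum()             # f31: texture third moment
    uniformity = (p ** 2).sum()                           # f32: texture uniformity
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()       # f33: texture entropy

    return np.concatenate(([f1], hist_counts,
                           [m, variance, smoothness, third_moment, uniformity, entropy]))
```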
S1.3. Train a support vector machine on the data set to obtain the titer model. The specific steps are as follows:
S1.3.1. Extract the feature values f1-f33 in the data set as a two-dimensional matrix M2 of shape X*33, where X is the number of pictures in the data set and 33 is the number of feature values;
S1.3.2. Extract the titer annotations of all pictures as a matrix M3 of shape X*1, where M3 is the label matrix corresponding to the titer annotations of all the pictures;
S1.3.3. Train a support vector machine on the data of the two-dimensional matrix M2 and the matrix M3 to obtain the titer model.
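Under the assumption that a standard SVM implementation such as scikit-learn's SVC is used (the patent only says "support vector machine", without naming a library or hyper-parameters), step S1.3 could look like the following sketch; the file names are placeholders:

```python
import numpy as np
from sklearn.svm import SVC

# M2: feature matrix of shape (X, 33); M3: titer labels in {0,...,5}, shape (X, 1).
M2 = np.load("features_f1_f33.npy")   # hypothetical file name
M3 = np.load("titer_labels.npy")      # hypothetical file name

titer_model = SVC()                   # kernel and hyper-parameters are not specified in the patent
titer_model.fit(M2, M3.ravel())

# Predicting the titer label of a new 33-dimensional feature vector:
predicted_titer = titer_model.predict(M2[:1])
```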
S2. Based on a deep learning model, compute a karyotype model from a data set. The specific steps are as follows:
S2.1. Clean the raw data set composed of the collected ANA fluorescent films, i.e., delete the ANA fluorescent films in the raw data set that have no karyotype information. For example, a negative sample with a file name such as 'negative (-) ANA_HEp2.jpg' has no karyotype, so not every picture has a karyotype result. In addition, although the cleaned raw data set contains nearly 30 karyotypes, some karyotypes have too few samples to be discriminated effectively, so 17 common karyotypes, such as the speckled and nuclear membrane patterns, were finally selected. The karyotype of an ANA fluorescent film is one or more of these 17 karyotypes; each karyotype that is present is marked '1', otherwise '0'. For example, if an ANA fluorescent film corresponds to the speckled karyotype, that karyotype is marked '1' and the others '0', as shown in Fig. 6 (a sketch of this 0/1 encoding is given after the deletion procedure below).
The specific steps of deleting the ANA fluorescent films without karyotype information are as follows:
First select 17 karyotypes, then judge whether the karyotype of each ANA fluorescent film in the cleaned raw data set contains one or more of the 17 karyotypes; if so, keep the film, otherwise delete it.
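A minimal Python sketch of the 0/1 multi-label encoding follows; the pattern names listed are placeholders, since the patent selects 17 karyotypes but does not reproduce the full list here:

```python
# Placeholder list standing in for the 17 selected karyotypes (not all are named here).
KARYOTYPES = ["speckled", "nuclear membrane", "homogeneous", "nucleolar"]  # ..., 17 in total

def karyotype_vector(annotated):
    """Binary multi-label vector: 1 where the karyotype is present, otherwise 0."""
    return [1 if k in annotated else 0 for k in KARYOTYPES]

# A film annotated 'speckled nuclear membrane':
print(karyotype_vector({"speckled", "nuclear membrane"}))  # -> [1, 1, 0, 0]
```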
S2.2. Deep neural network learning does not need hand-crafted feature values; it only needs the three-dimensional matrix of the RGB channels of each picture, as shown in Fig. 7. The input size of the deep neural network is fixed, for example 256*256*3, so the original picture must first be scaled to that size and the RGB matrix then extracted. Therefore, from the raw data set processed in step S2.1, extract the RGB three-dimensional matrix of each ANA fluorescent film and scale it up or down according to the input size requirement of the neural network, obtaining the scaled RGB matrix;
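A minimal sketch of step S2.2, assuming the 256*256*3 input size given as an example in the text; the use of Pillow and NumPy is the editor's assumption:

```python
import numpy as np
from PIL import Image

INPUT_SIZE = (256, 256)   # example input size from the description

def to_rgb_matrix(path: str) -> np.ndarray:
    """Scale the picture to the network input size and return its RGB matrix."""
    img = Image.open(path).convert("RGB").resize(INPUT_SIZE)
    return np.asarray(img)            # shape (256, 256, 3)
```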
S2.3. Use the three-dimensional matrices obtained after scaling as the training data set to train a deep neural network based on graph convolution (a custom neural network model), obtaining the karyotype model. Training this graph-convolution-based deep neural network is what recognition research calls "multi-label learning", i.e., a single input ANA fluorescent film may yield zero, one or multiple karyotype results. Therefore, the last layer of the graph-convolution-based deep neural network uses the sigmoid function to output the probability that the ANA fluorescent film belongs to a given karyotype, with the formula:
S(x) = \frac{1}{1 + e^{-x}}
where the last layer of the deep neural network has 17 neurons, each representing one of the 17 karyotypes, x is the input of a neuron, and S is the output probability of the corresponding karyotype; whether the ANA sample picture belongs to a given karyotype can be judged from the magnitude of S.
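The sketch below illustrates only the 17-unit sigmoid output layer and its use for multi-label prediction; the backbone is a placeholder, since the patent's graph-convolution network is custom and its details are not given here, and the PyTorch framing is the editor's assumption:

```python
import torch
import torch.nn as nn

NUM_KARYOTYPES = 17

class KaryotypeHead(nn.Module):
    """Multi-label output layer described in S2.3; the backbone is a stand-in
    for the patent's custom graph-convolution network."""
    def __init__(self, backbone: nn.Module, feat_dim: int):
        super().__init__()
        self.backbone = backbone                   # maps an image to a feat_dim vector
        self.fc = nn.Linear(feat_dim, NUM_KARYOTYPES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.backbone(x)                # (batch, feat_dim)
        return torch.sigmoid(self.fc(features))    # per-karyotype probability S

# Training would use a per-karyotype binary cross-entropy loss (e.g. nn.BCELoss());
# at inference, probabilities above a threshold (e.g. 0.5) mark the film as that karyotype.
```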
The present invention additionally uses an LSTM network to assist the graph-convolution-based deep neural network in karyotype judgment. LSTM networks are usually applied to samples with sequential correlation, such as video and speech, and are rarely used for the kind of picture recognition addressed by the invention. However, some karyotypes of ANA fluorescent films are very similar or subtle and hard to discriminate, and the LSTM can make full use of the image information surrounding the cell pattern to further improve recognition performance.
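The patent does not specify how the LSTM consumes the image. One plausible reading, sketched below purely under that assumption, is an auxiliary branch that runs an LSTM over a sequence of flattened patches surrounding a cell and outputs its own karyotype probabilities:

```python
import torch
import torch.nn as nn

class PatchSequenceLSTM(nn.Module):
    """Assumed auxiliary branch: an LSTM over a sequence of flattened image
    patches around a cell (one interpretation of 'surrounding image
    information'; not specified in the patent)."""
    def __init__(self, patch_dim: int, hidden_dim: int = 128, num_karyotypes: int = 17):
        super().__init__()
        self.lstm = nn.LSTM(patch_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_karyotypes)

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, sequence_length, patch_dim)
        _, (h_n, _) = self.lstm(patches)
        return torch.sigmoid(self.fc(h_n[-1]))     # auxiliary karyotype probabilities
```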
S3. Read the ANA fluorescent film to be identified, i.e., perform feature extraction on the picture to be identified to obtain multiple extracted feature values; the feature values are extracted in the same way as when the data set was built;
S4. Input the multiple feature values into the titer model to obtain the predicted titer;
S5. Input the picture to be identified into the karyotype model to obtain the predicted karyotype;
S6. Obtain a recognition result from the predicted titer and the predicted karyotype. For example, for each ANA fluorescent film to be predicted, step S4 gives its predicted titer, i.e., one of negative, 1:100, 1:320, 1:1000, 1:3200 and 1:10000, and step S5 gives its karyotype, such as 'speckled nuclear membrane' or 'chromosomal'. For a sample predicted to be negative, no karyotype result is needed and 'negative' is returned directly. For the other titers, the titer and the karyotype result are concatenated and returned to the user together, for example '1:320 chromosomal' or '1:1000 speckled nuclear membrane'.
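A minimal sketch of the result assembly in step S6 (the function name is illustrative):

```python
def assemble_report(titer: str, karyotypes: list) -> str:
    """Combine the predicted titer and karyotypes as described in step S6."""
    if titer == "negative":
        return "negative"                 # negative samples need no karyotype result
    return f"{titer} " + " ".join(karyotypes)

print(assemble_report("1:1000", ["speckled", "nuclear membrane"]))
# -> '1:1000 speckled nuclear membrane'
```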
Compared with the prior art, in which "the coincidence rate of single-karyotype fluorescence patterns is 93.46% and the coincidence rate of the main karyotype of compound-karyotype fluorescence patterns is 91.78%; the coincidence rate of the antibody titer of single-karyotype fluorescence patterns is 90.95% and the coincidence rate of the antibody titer of the main karyotype of compound-karyotype fluorescence patterns is 94.03%", the precision of the present invention is higher, and the combination of high-precision titer and karyotype prediction provides more comprehensive information. In addition, in karyotype prediction, this set of models can also be used for teaching and similar purposes.
The above is only a representative embodiment within the numerous specific application scopes of the present invention and does not constitute any limitation on the protection scope of the present invention. All technical solutions formed by transformation or equivalent replacement fall within the protection scope of the present invention.

Claims (9)

1. An automatic ANA fluorescent film identification method based on machine learning and deep learning, characterized by the following steps:
S1. Based on a machine learning model, compute a titer model from a data set;
S2. Based on a deep learning model, compute a karyotype model from a data set;
S3. Read the ANA fluorescent film to be identified, i.e., perform feature extraction on the picture to be identified to obtain multiple extracted feature values;
S4. Input the multiple feature values into the titer model to obtain a predicted titer;
S5. Input the picture to be identified into the karyotype model to obtain a predicted karyotype;
S6. Obtain a recognition result from the predicted titer and the predicted karyotype.
2. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 1, characterized in that the specific steps of step S1 are as follows:
S1.1. Clean the raw data set composed of the collected ANA fluorescent films, and extract the titer annotation of each ANA fluorescent film in the cleaned data set;
S1.2. Extract the feature values of the titer-annotated ANA fluorescent films to form a data set;
S1.3. Train a support vector machine on the data set to obtain the titer model.
3. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 2, characterized in that the specific steps of step S1.2 are as follows:
S1.2.1. Convert each ANA fluorescent film to a grayscale picture; the grayscale picture is a two-dimensional matrix M of size X*Y. Compute the average of all elements of the two-dimensional matrix M to obtain the first feature value f1;
S1.2.2. Each element of the two-dimensional matrix M represents the gray value of a pixel, with values between 0 and 255. Count all elements of the matrix by gray-level interval, i.e., count the numbers of pixels with gray values in 0-10, 11-20, 21-30 and so on up to 241-250, and take these counts in order as feature values f2-f26; the number of pixels with gray values in 251-255 is taken as feature value f27;
S1.2.3. Take the texture mean, texture variance, texture smoothness, texture third moment, texture uniformity and texture entropy of each grayscale picture as feature values f28-f33, respectively;
S1.2.4. Take the obtained feature values f1-f33 as the data set.
4. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 3, characterized in that the calculation formulas in step S1.2.3 are as follows:
The texture mean is calculated as:
m = \sum_{k=0}^{L-1} z_k \, p(z_k)
where L is the total number of gray levels, z_k denotes the k-th gray level, and p(z_k) is the probability of gray level z_k in the normalized gray-level histogram;
The texture variance is calculated as:
\sigma^2 = \sum_{k=0}^{L-1} (z_k - m)^2 \, p(z_k)
The texture smoothness is calculated as:
R = 1 - \frac{1}{1 + \sigma^2}
The texture third moment is calculated as:
\mu_3 = \sum_{k=0}^{L-1} (z_k - m)^3 \, p(z_k)
The texture uniformity is calculated as:
U = \sum_{k=0}^{L-1} p(z_k)^2
The texture entropy is calculated as:
e = -\sum_{k=0}^{L-1} p(z_k) \log_2 p(z_k)
5. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 4, characterized in that the specific steps of step S1.3 are as follows:
S1.3.1. Extract the feature values f1-f33 in the data set as a two-dimensional matrix M2 of shape X*33, where X is the number of pictures in the data set and 33 is the number of feature values;
S1.3.2. Extract the titer annotations of all pictures as a matrix M3 of shape X*1;
S1.3.3. Train a support vector machine on the data of the two-dimensional matrix M2 and the matrix M3 to obtain the titer model.
6. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 1, characterized in that the specific steps of step S2 are as follows:
S2.1. Clean the raw data set composed of the collected ANA fluorescent films, i.e., delete the ANA fluorescent films in the raw data set that have no karyotype information;
S2.2. From the raw data set processed in step S2.1, extract the RGB three-dimensional matrix of each ANA fluorescent film, and scale the shape of the extracted RGB three-dimensional matrix up or down according to the input size requirement of the neural network;
S2.3. Use the three-dimensional matrices obtained after scaling as the training data set to train a deep neural network based on graph convolution, obtaining the karyotype model.
7. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 6, characterized in that the specific steps of deleting the ANA fluorescent films without karyotype information in step S2.1 are as follows:
First select 17 karyotypes, then judge whether the karyotype of each ANA fluorescent film in the cleaned raw data set contains one or more of the 17 karyotypes; if so, keep the film, otherwise delete it.
8. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 6 or 7, characterized in that the last layer of the graph-convolution-based deep neural network in step S2.3 uses the sigmoid function to output the probability that the ANA fluorescent film belongs to a given karyotype, with the formula:
S(x) = \frac{1}{1 + e^{-x}}
where the last layer of the deep neural network has 17 neurons, each representing one of the 17 karyotypes, x is the input of a neuron, and S is the output probability of the corresponding karyotype.
9. The automatic ANA fluorescent film identification method based on machine learning and deep learning according to claim 8, characterized in that an LSTM network is additionally used to assist the graph-convolution-based deep neural network in karyotype judgment.
CN201910078013.3A 2019-01-28 2019-01-28 Automatic ANA fluorescent film identification method based on machine learning and deep learning Active CN109858428B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910078013.3A CN109858428B (en) 2019-01-28 2019-01-28 Automatic ANA fluorescent film identification method based on machine learning and deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910078013.3A CN109858428B (en) 2019-01-28 2019-01-28 Automatic ANA fluorescent film identification method based on machine learning and deep learning

Publications (2)

Publication Number Publication Date
CN109858428A true CN109858428A (en) 2019-06-07
CN109858428B CN109858428B (en) 2021-08-17

Family

ID=66896306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910078013.3A Active CN109858428B (en) 2019-01-28 2019-01-28 Automatic ANA fluorescent film identification method based on machine learning and deep learning

Country Status (1)

Country Link
CN (1) CN109858428B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111403004A (en) * 2020-02-26 2020-07-10 广州和硕信息技术有限公司 Artificial intelligence ANA detection picture and text report system
CN113837255A (en) * 2021-09-15 2021-12-24 中国科学院心理研究所 Methods, devices and media for predicting karyotype classes of cell-based antibodies
CN113870280A (en) * 2021-09-15 2021-12-31 中国科学院心理研究所 Methods, devices and media for predicting karyotype classes of cell-based antibodies
CN114018789A (en) * 2021-10-08 2022-02-08 武汉大学 Acute leukemia typing method based on imaging flow cytometry detection and machine learning
CN116883325A (en) * 2023-06-21 2023-10-13 杭州医策科技有限公司 Immunofluorescence image analysis method and device
CN117575993A (en) * 2023-10-20 2024-02-20 中国医学科学院皮肤病医院(中国医学科学院皮肤病研究所) Processing method and system for titer values based on deep learning

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102122356A (en) * 2011-03-16 2011-07-13 中国人民解放军第二军医大学 Computer-aided method for distinguishing ultrasound endoscope image of pancreatic cancer
CN102667483A (en) * 2010-02-22 2012-09-12 Medipan有限公司 Method and system for disease diagnosis via simultaneous detection of antibodies bound to synthetic and cellular substrates
WO2014177700A1 (en) * 2013-05-02 2014-11-06 Universite D'aix-Marseille Indirect immunofluorescence method for detecting antinuclear autoantibodies.
CN107403201A (en) * 2017-08-11 2017-11-28 强深智能医疗科技(昆山)有限公司 Tumour radiotherapy target area and jeopardize that organ is intelligent, automation delineation method
WO2018025130A2 (en) * 2016-08-02 2018-02-08 Universita' Del Piemonte Orientale Method for inducing and differentiating pluripotent stem cells and uses thereof
CN107727869A (en) * 2017-10-12 2018-02-23 上海川至生物技术有限公司 Kit of antinuclear antibodies and preparation method thereof in a kind of measure serum
CN109117701A (en) * 2018-06-05 2019-01-01 东南大学 Pedestrian's intension recognizing method based on picture scroll product

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102667483A (en) * 2010-02-22 2012-09-12 Medipan有限公司 Method and system for disease diagnosis via simultaneous detection of antibodies bound to synthetic and cellular substrates
CN102122356A (en) * 2011-03-16 2011-07-13 中国人民解放军第二军医大学 Computer-aided method for distinguishing ultrasound endoscope image of pancreatic cancer
WO2014177700A1 (en) * 2013-05-02 2014-11-06 Universite D'aix-Marseille Indirect immunofluorescence method for detecting antinuclear autoantibodies.
WO2018025130A2 (en) * 2016-08-02 2018-02-08 Universita' Del Piemonte Orientale Method for inducing and differentiating pluripotent stem cells and uses thereof
CN107403201A (en) * 2017-08-11 2017-11-28 强深智能医疗科技(昆山)有限公司 Tumour radiotherapy target area and jeopardize that organ is intelligent, automation delineation method
CN107727869A (en) * 2017-10-12 2018-02-23 上海川至生物技术有限公司 Kit of antinuclear antibodies and preparation method thereof in a kind of measure serum
CN109117701A (en) * 2018-06-05 2019-01-01 东南大学 Pedestrian's intension recognizing method based on picture scroll product

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DONATO CASCIO 等: "An Automatic HEp-2 Specimen Analysis System", 《MDPI》 *
PAOLO SODA 等: "Aggregation of Classifiers for Staining Pattern Recognition in Antinuclear Autoantibodies Analysis", 《IEEE TRANSACTIONS ON INFORMATION TECHNOLOGY IN BIOMEDICINE,》 *
ZHIMIN GAO 等: "HEp-2 Cell Image classification With Deep Convolutional Neural Networks", 《IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS》 *
马荣 等: "ANA滴度、抗ds-DNA滴度检测与自身免疫性疾病相关性及临床应用价值探讨", 《大连医科大学学报》 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111403004A (en) * 2020-02-26 2020-07-10 广州和硕信息技术有限公司 Artificial intelligence ANA detection picture and text report system
CN113837255A (en) * 2021-09-15 2021-12-24 中国科学院心理研究所 Methods, devices and media for predicting karyotype classes of cell-based antibodies
CN113870280A (en) * 2021-09-15 2021-12-31 中国科学院心理研究所 Methods, devices and media for predicting karyotype classes of cell-based antibodies
CN113837255B (en) * 2021-09-15 2023-06-13 中国科学院心理研究所 Method, apparatus and medium for predicting cell-based antibody karyotype class
CN114018789A (en) * 2021-10-08 2022-02-08 武汉大学 Acute leukemia typing method based on imaging flow cytometry detection and machine learning
CN116883325A (en) * 2023-06-21 2023-10-13 杭州医策科技有限公司 Immunofluorescence image analysis method and device
CN116883325B (en) * 2023-06-21 2024-04-30 杭州医策科技有限公司 Immunofluorescence image analysis method and device
CN117575993A (en) * 2023-10-20 2024-02-20 中国医学科学院皮肤病医院(中国医学科学院皮肤病研究所) Processing method and system for titer values based on deep learning
CN117575993B (en) * 2023-10-20 2024-05-21 中国医学科学院皮肤病医院(中国医学科学院皮肤病研究所) Processing method and system for titer values based on deep learning

Also Published As

Publication number Publication date
CN109858428B (en) 2021-08-17

Similar Documents

Publication Publication Date Title
CN109858428A (en) ANA flourescent sheet automatic identifying method based on machine learning and deep learning
CN109948425A (en) A kind of perception of structure is from paying attention to and online example polymerize matched pedestrian's searching method and device
CN103345617B (en) Chinese medicine knows method for distinguishing and system thereof
CN109064462A (en) A kind of detection method of surface flaw of steel rail based on deep learning
CN109685141A (en) A kind of robotic article sorting visible detection method based on deep neural network
CN110211108A (en) A kind of novel abnormal cervical cells automatic identifying method based on Feulgen colouring method
CN113128335B (en) Method, system and application for detecting, classifying and finding micro-living ancient fossil image
CN109190643A (en) Based on the recognition methods of convolutional neural networks Chinese medicine and electronic equipment
CN108416774A (en) A kind of fabric types recognition methods based on fine granularity neural network
CN109063649A (en) Pedestrian's recognition methods again of residual error network is aligned based on twin pedestrian
CN109117885A (en) A kind of stamp recognition methods based on deep learning
CN103870816A (en) Plant identification method and device with high identification rate
CN114140665A (en) Dense small target detection method based on improved YOLOv5
CN101655909A (en) Device and method for calculating matching degree
CN107992783A (en) Face image processing process and device
CN110008828A (en) Pairs of constraint ingredient assay measures optimization method based on difference regularization
CN106326914B (en) A kind of more classification methods of pearl based on SVM
CN104484679B (en) Non- standard rifle shooting warhead mark image automatic identifying method
CN113240640B (en) Colony counting method, apparatus and computer readable storage medium
CN103617417A (en) Automatic plant identification method and system
CN109741351A (en) A kind of classification responsive type edge detection method based on deep learning
CN109389170A (en) A kind of gradation type operating condition method for early warning based on 3D convolutional neural networks
CN115937492B (en) Feature recognition-based infrared image recognition method for power transformation equipment
CN104318267B (en) A kind of automatic identification system of Tibetan mastiff pup purity
CN110147840A (en) The weak structure object fine grit classification method divided based on the unsupervised component of conspicuousness

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant