CN114092740A - AI-assisted analysis method for immune lateral flow sensing - Google Patents

AI-assisted analysis method for immune lateral flow sensing

Info

Publication number
CN114092740A
Authority
CN
China
Prior art keywords
training
training data
network
image
fluorescence
Prior art date
Legal status
Pending
Application number
CN202111335260.0A
Other languages
Chinese (zh)
Inventor
王威
郭劲宏
马星
Current Assignee
Chengdu Yunxin Medical Technology Co ltd
Original Assignee
Chengdu Yunxin Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Yunxin Medical Technology Co ltd
Priority to CN202111335260.0A
Publication of CN114092740A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N 21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N 21/64 Fluorescence; Phosphorescence
    • G01N 21/6428 Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Abstract

The invention discloses an AI-assisted analysis method for immune lateral flow sensing. The method comprises collecting fluorescence images, dividing the data in the fluorescence image dataset into a training dataset and a validation dataset, amplifying the training data through feature engineering, obtaining a trained image classification network, and finally obtaining the classification result of the data to be detected through the trained image classification network. The method requires no complicated image preprocessing, which greatly improves the robustness and simplicity of the detection method. The proposed feature engineering enhances the quantitative information represented by the upconversion fluorescence images while enriching the diversity of the upconversion fluorescence image dataset, improving the generalization capability and accuracy of the artificial intelligence model in predicting actual detection results. Unlike traditional image processing for quantitative detection, the artificial-intelligence-based method has high tolerance to environmental noise and enables real-time detection.

Description

AI-assisted analysis method for immune lateral flow sensing
Technical Field
The invention relates to the technical field of biomedical detection and artificial intelligence, and in particular to an AI-assisted analysis method for immune lateral flow sensing.
Background
Upconversion luminescent nanomaterials (UCNPs) have been widely applied in point-of-care testing (POCT), food detection, environmental hazard detection and other fields by virtue of their unique optical characteristics. They convert near-infrared excitation light into higher-energy visible or ultraviolet light through an anti-Stokes process; the near-infrared excitation effectively avoids interference from background fluorescence and scattered light and does not directly excite the energy acceptor. Owing to the energy-level structure of the doped lanthanides, UCNPs also possess superior optical and chemical properties that other materials cannot match, such as good optical and chemical stability, tunable emission and low cytotoxicity. Based on these unique advantages, upconversion fluorescence resonance energy transfer (UC-FRET) technology with UCNPs as the energy donor has great application prospects in POCT, biosensing, medical diagnosis and other fields. However, factors such as low luminous efficiency and environmental noise interference remain a major challenge to its widespread application: a general UCNPs quantitative detection strategy that is both highly accurate and suitable for field detection environments is still lacking.
The existing patent CN109447185A (a microscopic fluorescence image classification method based on deep learning) has the following problems:
1. A deep-learning-based network requires a large amount of manually labeled training data in the early stage, which consumes considerable manpower; the accuracy of the quantitative detection result is directly proportional to the amount of manually labeled training data; and the traditional deep learning method is prone to overfitting, so the generalization capability of the trained network is weak;
2. A deep-learning-based network requires a computing unit with high computing power both for training and for inference on test data, which means the designed equipment is not portable.
Transfer learning is an important method in artificial intelligence that improves generalization performance by reusing empirical parameters learned in one domain in another domain; for example, pre-trained models are used as the starting points of newly proposed models in computer vision tasks. Building these pre-trained neural networks usually consumes enormous time and computational resources, and the resulting models have learned rich feature representations from a large number of images; transfer learning transfers these learned capabilities to related problems, and training a fine-tuned network by transfer learning is faster and simpler than training a network from scratch with randomly initialized weights. Unlike other fields, a research difficulty in the biomedical field is that sufficient effective medical data cannot be acquired, and introducing transfer learning into the biomedical field can well resolve the contradiction between the need for a large amount of data and the availability of only a small number of labels. Therefore, the invention provides an AI-assisted analysis method for immune lateral flow sensing to solve the problems in the prior art.
Disclosure of Invention
In view of the above problems, the present invention aims to provide an AI-assisted analysis method for immune lateral flow sensing. A transfer-learning-based artificial intelligence model can more easily be deployed on local Internet-of-Things devices under an edge computing architecture, improves the accuracy with which the quantitative detection result of an immune lateral flow sensor is predicted, and addresses key requirements placed on sensors in the roadside detection and point-of-care testing fields, such as real-time local response, reliable service and data privacy.
In order to achieve the above purpose, the invention is realized through the following technical scheme: an AI-assisted analysis method for immune lateral flow sensing, comprising the following steps:
step one
Collecting, with a fluorescence detection device, a fluorescence image of the region around the control line after the upconversion fluorescent nanomaterial is excited by near-infrared light, to obtain a three-channel image of given width and height around the control line;
step two
Collecting, with the fluorescence detection device, a fluorescence image of the region around the test line after the upconversion fluorescent nanomaterial is excited by near-infrared light, to obtain a three-channel image of given width and height around the test line;
step three
Splicing together the fluorescent lines excited at the control line of step one and the test line of step two on the same immune lateral flow test strip, to obtain a three-channel image of the spliced fluorescent lines with given width and height;
step four
Collecting an image sequence with the fluorescence detection device and repeating steps one to three frame by frame to obtain a fluorescence image dataset for training, and dividing the data in the fluorescence image dataset into a training dataset and a validation dataset;
step five
Amplifying the training data obtained in step four through feature engineering of the training data;
step six
Retraining a pre-trained image classification network by transfer learning, based on the training data amplified by feature engineering, to obtain a trained image classification network;
step seven
Deploying the network obtained by transfer learning on a local detection device, and obtaining the classification result of the data to be detected through the trained image classification network based on the test data obtained in step four.
The further improvement lies in that: In step four, 80% of the images in the fluorescence image dataset are used as training data and the remaining 20% as test data.
The further improvement lies in that: In step five, the feature engineering method for the training data comprises the following steps:
A1: sequentially applying Gaussian noise, polar coordinate transformation, horizontal flipping, Gaussian smoothing, 15° counter-clockwise rotation and 15° clockwise rotation to the training images of the training data to generate amplified training images and obtain amplified training data;
A2: during training, applying an overfitting-prevention method to the training data to enhance the generalization capability of the model and avoid overfitting;
A3: inputting the training data obtained in step A2 into a pre-trained image classification network.
The further improvement lies in that: The overfitting-prevention method includes randomly shrinking and enlarging (rescaling) the training images and randomly cropping them.
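Purely as an illustration, the random rescaling and random cropping described above could be applied as on-the-fly transforms, for example with torchvision; the scale range, padding and image size below are assumed values rather than ones specified by the patent.

```python
# Hypothetical sketch of the overfitting-prevention transforms (random
# rescaling plus random-position cropping); scale range and padding are assumptions.
from torchvision import transforms

train_transforms = transforms.Compose([
    transforms.Resize((224, 224)),                                    # network input size
    transforms.RandomAffine(degrees=0, scale=(0.8, 1.2)),             # randomly shrink/enlarge
    transforms.RandomCrop(224, padding=16, padding_mode="reflect"),   # crop at a random position
    transforms.ToTensor(),
])
```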
The further improvement lies in that: In step six, the specific method for retraining the pre-trained image classification network comprises the following steps:
B1: loading a pre-trained network;
B2: replacing the final layer;
B3: freezing the initial layers by setting the learning rate of the shallower network layers to zero so that their weights are fixed;
B4: training the network and saving the weights after training;
B5: deploying the trained image classification network.
The further improvement lies in that: In the process of retraining the pre-trained image classification network, the absolute error is used as the loss function of the model, with the expression

L = \frac{1}{m}\sum_{i=1}^{m}\left| y_i - \hat{y}_i \right|

where L is the loss function, y_i is the label of the i-th sample, \hat{y}_i is the predicted value of the model, and m is the number of samples.
The further improvement lies in that: In step seven, an actual verification fluorescence image is acquired with the capture device of the detection equipment, the trained image classification network performs real-time inference on the result, the classification accuracy is calculated, and the result is finally displayed on the device's display screen.
The invention has the beneficial effects that:
1. No complicated image preprocessing is needed, which greatly improves the robustness and simplicity of the detection method;
2. The proposed feature engineering enhances the quantitative information represented by the upconversion fluorescence images while enriching the diversity of the upconversion fluorescence image dataset, improving the generalization capability and accuracy of the artificial intelligence model in predicting actual detection results;
3. The contradiction that combining the biomedical field with traditional machine learning requires a large amount of labeled data is resolved: with transfer learning, an artificial intelligence model for quantitative detection with extremely high accuracy can be established from only a small dataset;
4. The contradiction that combining the biomedical field with traditional machine learning requires huge computational cost is resolved, making model training on ordinary equipment possible;
5. Because the transfer-learning-based artificial intelligence model differs from traditional image processing for quantitative detection, the AI-based method has high tolerance to environmental noise and can perform real-time detection.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a flow diagram of a method of feature engineering of training data in an embodiment of the present invention;
FIG. 3 is a flow diagram for retraining a pre-trained image classification network in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the VGG16 pre-training network in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," "third," "fourth," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Referring to FIG. 1, the present embodiment provides an AI-assisted analysis method for immune lateral flow sensing, which includes the following steps:
step one
Collecting, with a fluorescence detection device, fluorescence images of the region around the control line after the upconversion fluorescent nanomaterial is excited by near-infrared light, to obtain three-channel images around the control line with a width of 112 px and a height of 224 px;
step two
Collecting, with the fluorescence detection device, fluorescence images of the region around the test line after the upconversion fluorescent nanomaterial is excited by near-infrared light, to obtain three-channel images around the test line with a width of 112 px and a height of 224 px;
step three
Splicing together the fluorescent lines excited at the control line of step one and the test line of step two on the same immune lateral flow test strip, to obtain a three-channel image of the spliced fluorescent lines with a width of 224 px and a height of 224 px;
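For illustration, steps one to three of this embodiment could be sketched with OpenCV and NumPy as below; the ROI coordinates x_ctrl, x_test and y_top are hypothetical placeholders, since the patent does not specify how the line regions are located within the captured frame.

```python
# Illustrative sketch of steps 1-3: crop 112x224 ROIs around the control line and
# the test line from one captured frame and splice them into a 224x224 image.
# The ROI coordinates below are hypothetical placeholders.
import cv2
import numpy as np

def build_spliced_image(frame, x_ctrl, x_test, y_top, roi_w=112, roi_h=224):
    ctrl_roi = frame[y_top:y_top + roi_h, x_ctrl:x_ctrl + roi_w]   # step 1: control-line region
    test_roi = frame[y_top:y_top + roi_h, x_test:x_test + roi_w]   # step 2: test-line region
    spliced = np.hstack([ctrl_roi, test_roi])                      # step 3: 224x224, 3 channels
    assert spliced.shape == (roi_h, 2 * roi_w, 3)
    return spliced

frame = cv2.imread("captured_frame.png")                  # one frame from the fluorescence reader
spliced = build_spliced_image(frame, x_ctrl=40, x_test=260, y_top=10)
cv2.imwrite("spliced_fluorescence_lines.png", spliced)
```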
step four
Collecting an image sequence with the fluorescence detection device and repeating steps one to three frame by frame to obtain a fluorescence image dataset for training; dividing the data in the fluorescence image dataset into a training dataset and a validation dataset, with 80% of the images used as training data and the remaining 20% as test data. Taking quantitative detection of methamphetamine ("ice") at the four concentrations 0.1 ng/ml, 1 ng/ml, 10 ng/ml and 100 ng/ml as a concrete example, the constructed dataset comprises 40 pictures, of which 32 are training data and 8 are test data;
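A minimal sketch of building such a dataset and performing the 80%/20% split (32 training and 8 test images for the 40-image example) is shown below; the one-folder-per-concentration layout, file names and batch size are assumptions not given in the patent.

```python
# Hypothetical dataset construction and 80/20 split (32 training / 8 test images
# for the 40-image, four-concentration example); the folder layout is an assumption.
import torch
from torchvision import datasets, transforms

base_transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# e.g. dataset_root/0.1ng_ml, dataset_root/1ng_ml, dataset_root/10ng_ml, dataset_root/100ng_ml
full_set = datasets.ImageFolder("dataset_root", transform=base_transform)

n_train = int(0.8 * len(full_set))                    # 32 of 40
train_set, test_set = torch.utils.data.random_split(
    full_set, [n_train, len(full_set) - n_train],
    generator=torch.Generator().manual_seed(0),       # reproducible split
)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=8, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=8)
```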
step five
Amplifying the 32 training images obtained in step four through feature engineering of the training data;
Referring to FIG. 2, the feature engineering method for the training data includes the following steps (an illustrative sketch of these operations follows the list):
A1: sequentially applying Gaussian noise, polar coordinate transformation, horizontal flipping, Gaussian smoothing, 15° counter-clockwise rotation and 15° clockwise rotation to the 32 training images to generate amplified training images, yielding 192 amplified training data items;
A2: during training, to enhance the generalization capability of the model and avoid overfitting, an overfitting-prevention step is also applied: the 192 training data items are randomly subjected to two operations, random shrinking/enlarging and random-position cropping;
A3: inputting the training data obtained in step A2 into a pre-trained image classification network.
Step six
Retraining a pre-trained image classification network by transfer learning, based on the training data amplified by feature engineering, to obtain a trained image classification network;
referring to fig. 3, the specific method for retraining the pre-trained image classification network includes the following steps:
b1: loading a pre-training network, referring to fig. 4, taking a VGG16 pre-training network as an example, the selection of the pre-training network structure in this step is not limited to one network such as VGG16, google lenet, ResNet50, MobileNet V2, and the like, and all the networks belong to the pre-training network as long as the deep learning network has learned rich feature representation based on a large number of images, and the detection result of the present invention can be achieved by selecting any one pre-training network;
b2: replacing the final layer, wherein the convolutional layer of the network will extract the image features used by the last learnable layer and the final classification layer to classify the input image, and take VGG16 as an example, randomly initialize the weights of three fully-connected (FC) layers (FC 6, FC7, and FC8 in fig. 4), and finally replace the last fully-connected layer softmax with a new fully-connected layer, wherein the output number is equal to the number of classes in the new data set, which is 4 in this embodiment example;
b3: freezing an initial layer, setting the learning rate of a shallower network layer to zero, freezing the weight of the shallower network layer, taking VGG16 as an example, loading the weights of pre-trained VGG16 networks of "conv 1", "conv 2", "conv 3", "conv 4" and "conv 5", and fixing the weights of the shallower network layers;
b4: training the network, and storing the weight after network training;
b5: deploying the trained image classification network;
in the embodiment, the absolute error is used as the loss function of the model, and the expression is as follows:
Figure BDA0003350275870000091
wherein L is a loss function, y is a label of the sample,
Figure BDA0003350275870000101
m is the predicted value of the model and the number of samples;
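Continuing that sketch, step B4 with the absolute-error loss defined above might be written as follows; it reuses model and a train_loader such as the one built earlier (in practice it would iterate over the 192 augmented images), treats the labels as one-hot vectors so that |y − ŷ| is defined per class (one plausible reading of the formula), and the learning rate and epoch count are assumptions.

```python
# Hypothetical sketch of B4: fine-tune the unfrozen layers with a mean absolute
# error loss applied to one-hot labels, then save the weights for deployment (B5).
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

criterion = torch.nn.L1Loss()                                        # mean absolute error
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)     # only the unfrozen FC layers

for epoch in range(30):                                              # epoch count is an assumption
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        targets = F.one_hot(labels, num_classes=4).float()           # y as one-hot vectors
        preds = torch.softmax(model(images), dim=1)                  # y_hat in [0, 1]
        loss = criterion(preds, targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), "ucnp_vgg16_finetuned.pth")           # weights kept for deployment
```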
step seven
Deploying the network obtained by transfer learning on a local detection device and obtaining the classification result of the data to be detected through the trained image classification network, based on the test data obtained in step four; alternatively, an actual verification fluorescence image is acquired with the capture device of the detection equipment, the trained image classification network performs real-time inference on the result, the classification accuracy is calculated, and the result is finally displayed on the device's display screen.
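A corresponding step-seven sketch on the local detection device, again reusing model, test_loader and full_set from the earlier sketches, might be as follows; the weight file name and the assumption of a CPU-only device are illustrative.

```python
# Hypothetical step-7 sketch: load the saved weights on the local device, report
# accuracy on the held-out test set, and run real-time inference on one freshly
# captured, spliced fluorescence image.
import cv2
import torch
from torchvision import transforms

device = torch.device("cpu")                          # assume the local device has no GPU
model.load_state_dict(torch.load("ucnp_vgg16_finetuned.pth", map_location=device))
model.to(device).eval()

with torch.no_grad():
    correct = total = 0
    for images, labels in test_loader:                # the held-out 20% (8 images)
        predicted = model(images).argmax(dim=1)
        correct += (predicted == labels).sum().item()
        total += labels.numel()
    print(f"classification accuracy: {correct / total:.2%}")

    frame = cv2.imread("spliced_fluorescence_lines.png")       # image from the reader's capture device
    rgb = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    tensor = transforms.ToTensor()(rgb).unsqueeze(0)
    class_idx = model(tensor).argmax(dim=1).item()
    print("predicted concentration class:", full_set.classes[class_idx])
```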
The invention constructs feature engineering (comprising a plurality of manually selected feature dimensions) tailored to immune lateral flow sensing, which differs from the traditional machine learning training process.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. An AI-assisted analysis method for immune lateral flow sensing, characterized in that the method comprises the following steps:
step one
Collecting, with a fluorescence detection device, a fluorescence image of the region around the control line after the upconversion fluorescent nanomaterial is excited by near-infrared light, to obtain a three-channel image of given width and height around the control line;
step two
Collecting, with the fluorescence detection device, a fluorescence image of the region around the test line after the upconversion fluorescent nanomaterial is excited by near-infrared light, to obtain a three-channel image of given width and height around the test line;
step three
Splicing together the fluorescent lines excited at the control line of step one and the test line of step two on the same immune lateral flow test strip, to obtain a three-channel image of the spliced fluorescent lines with given width and height;
step four
Collecting an image sequence with the fluorescence detection device and repeating steps one to three frame by frame to obtain a fluorescence image dataset for training, and dividing the data in the fluorescence image dataset into a training dataset and a validation dataset;
step five
Amplifying the training data obtained in step four through feature engineering of the training data;
step six
Retraining a pre-trained image classification network by transfer learning, based on the training data amplified by feature engineering, to obtain a trained image classification network;
step seven
Deploying the network obtained by transfer learning on a local detection device, and obtaining the classification result of the data to be detected through the trained image classification network based on the test data obtained in step four.
2. The AI-assisted analysis method for immune lateral flow sensing according to claim 1, wherein: in step four, 80% of the images in the fluorescence image dataset are used as training data and the remaining 20% as test data.
3. The AI-assisted analysis method for immune lateral flow sensing according to claim 1, wherein: in step five, the feature engineering method for the training data comprises the following steps:
A1: sequentially applying Gaussian noise, polar coordinate transformation, horizontal flipping, Gaussian smoothing, 15° counter-clockwise rotation and 15° clockwise rotation to the training images of the training data to generate amplified training images and obtain amplified training data;
A2: during training, applying an overfitting-prevention method to the training data to enhance the generalization capability of the model and avoid overfitting;
A3: inputting the training data obtained in step A2 into a pre-trained image classification network.
4. The AI-assisted analysis method for immune lateral flow sensing according to claim 3, wherein: the overfitting-prevention method includes randomly shrinking and enlarging (rescaling) the training images and randomly cropping them.
5. The AI-assisted analysis method for immune lateral flow sensing according to claim 1, wherein: in step six, the specific method for retraining the pre-trained image classification network comprises the following steps:
B1: loading a pre-trained network;
B2: replacing the final layer;
B3: freezing the initial layers by setting the learning rate of the shallower network layers to zero so that their weights are fixed;
B4: training the network and saving the weights after training;
B5: deploying the trained image classification network.
6. The AI-assisted analysis method for immune lateral flow sensing according to claim 5, wherein: in the process of retraining the pre-trained image classification network, the absolute error is used as the loss function of the model, with the expression

L = \frac{1}{m}\sum_{i=1}^{m}\left| y_i - \hat{y}_i \right|

where L is the loss function, y_i is the label of the i-th sample, \hat{y}_i is the predicted value of the model, and m is the number of samples.
7. The AI-assisted analysis method for immune lateral flow sensing according to claim 1, wherein: in step seven, an actual verification fluorescence image is acquired with the capture device of the detection equipment, the trained image classification network performs real-time inference on the result, the classification accuracy is calculated, and the result is finally displayed on the device's display screen.
CN202111335260.0A 2021-11-11 2021-11-11 AI-assisted analysis method for immune lateral flow sensing Pending CN114092740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111335260.0A CN114092740A (en) 2021-11-11 2021-11-11 AI-assisted analysis method for immune lateral flow sensing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111335260.0A CN114092740A (en) 2021-11-11 2021-11-11 AI-assisted analysis method for immune lateral flow sensing

Publications (1)

Publication Number Publication Date
CN114092740A true CN114092740A (en) 2022-02-25

Family

ID=80300052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111335260.0A Pending CN114092740A (en) 2021-11-11 2021-11-11 AI-assisted analysis method for immune lateral flow sensing

Country Status (1)

Country Link
CN (1) CN114092740A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110378252A (en) * 2019-06-28 2019-10-25 浙江大学 A kind of distress in concrete recognition methods based on depth migration study
CN110458043A (en) * 2019-07-20 2019-11-15 中国船舶重工集团公司第七二四研究所 A kind of SAR target identification method based on transfer learning and the output of full articulamentum
CN110596368A (en) * 2019-08-15 2019-12-20 深圳市亿立方生物技术有限公司 Fluorescence immunoassay appearance
CN111476283A (en) * 2020-03-31 2020-07-31 上海海事大学 Glaucoma fundus image identification method based on transfer learning
CN112071423A (en) * 2020-09-07 2020-12-11 上海交通大学 Machine learning-based immunochromatography concentration detection method and system
CN112287839A (en) * 2020-10-29 2021-01-29 广西科技大学 SSD infrared image pedestrian detection method based on transfer learning
CN112560079A (en) * 2020-11-03 2021-03-26 浙江工业大学 Hidden false data injection attack detection method based on deep belief network and transfer learning
CN112819076A (en) * 2021-02-03 2021-05-18 中南大学 Deep migration learning-based medical image classification model training method and device
CN112819093A (en) * 2021-02-24 2021-05-18 浙江工业大学 Man-machine asynchronous recognition method based on small data set and convolutional neural network
CN113313109A (en) * 2021-05-13 2021-08-27 中国计量大学 Semi-quantitative analysis method of fluorescence immunochromatographic test paper

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination