CN114170518A - Tea tree freezing injury assessment method and system based on computer vision - Google Patents
Tea tree freezing injury assessment method and system based on computer vision
- Publication number
- CN114170518A (application CN202111555624.6A)
- Authority
- CN
- China
- Prior art keywords
- tea tree
- injury
- tea
- freezing
- freezing injury
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Artificial Intelligence (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Life Sciences & Earth Sciences (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a tea tree freezing injury assessment method and system based on computer vision. The assessment method comprises the following steps: tea tree leaf freezing injury images are collected with an RGB camera and preprocessed; the freeze-injured tea tree leaves are identified and classified by a double-layer algorithm, in which a Faster RCNN network in the first layer segments and classifies the freeze-injured leaves for the first time, and the segmented pictures are input into a second-layer SVM for secondary classification; the freeze-injured leaves of each grade are then scored, and the overall freezing injury degree of the tea tree is evaluated from the scores. The method can identify freezing injury of tea tree leaves and grade the freezing injury degree of tea trees; it solves the problems of strong subjectivity and long time consumption in manual observation and evaluation of tea garden freezing injury, improves model precision, and helps improve tea garden management efficiency.
Description
Technical Field
The invention belongs to the field of computer vision, and particularly relates to a tea tree freezing injury assessment method and system based on computer vision.
Background
Tea tree (Camellia sinensis L.) is one of the most important commercial crops in China, India, Sri Lanka, Kenya and other countries. However, with ongoing global climate change and the vulnerability of tea trees to low-temperature disasters, large areas of tea trees suffer freezing, greatly reducing yield and limiting cultivation in cold regions. It is therefore necessary to understand the freezing injury condition of a tea garden and analyze its causes in order to propose preventive measures, and then to carry out post-disaster recovery so as to minimize the loss of tea production.
Freezing injury of tea trees usually manifests on the leaves first; only under extreme cold do the roots become damaged and die. Identification and classification of leaf freezing injury is therefore of great significance for the evaluation and prevention of tea garden freezing injury. At present, identification of the degree of tea garden freezing injury mainly relies on manual observation and counting, which is highly subjective and time-consuming. In recent years, however, object detection algorithms based on deep learning have developed rapidly, and a series of frameworks have appeared that provide a basis for accurate object localization and detection. For example, R-CNN, which combines traditional machine learning with deep learning, was the forerunner of deep learning object detection and was followed by optimized detection networks such as SPP-Net, Fast R-CNN, YOLO and SSD.
Disclosure of Invention
The invention provides a tea tree freezing injury assessment method and system based on computer vision. In the method, freeze-injured tea leaves are first detected and classified with a Faster RCNN detection algorithm, then classified a second time with an SVM algorithm to further improve classification accuracy; the freeze-injured leaves of each grade are then scored, and finally the overall freezing injury degree of the tea tree is evaluated according to the scores.
In order to achieve the above purpose, the invention adopts the following technical scheme:
the invention provides a tea tree freezing injury assessment method based on computer vision, which comprises the following steps:
s1: collecting a tea tree leaf freeze injury image;
s2: preprocessing the tea tree leaf freeze injury image in the step S1;
s3: segmenting and grading tea tree frozen injury leaves for the first time by combining the preprocessed frozen injury images;
s4: carrying out secondary grading on the segmented freezing injury image;
s5: and combining the grading results of the steps S3 and S4 to obtain the grading conditions of the freeze injury leaves of the tea trees in different grades, and evaluating the whole freeze injury degree of the tea trees according to the grading conditions.
Further, the step S2 of preprocessing the tea tree leaf freeze injury image specifically includes:
s21: grading the freezing damage degree of the tea trees by using the tea tree leaf freezing damage image;
s22: adjusting the resolution of the tea tree leaf freeze injury image;
s23: utilizing MATLAB software to mark the adjusted freezing injury image to obtain a training set;
s24: and storing the marked tag data in a mat format, converting the tag data into a table data set format, and inputting the table data set format into the neural network.
Further, in step S22, the resolution of the tea tree leaf freeze injury image is adjusted to 1800 × 1800.
Further, the step S3 is to divide and grade the tea tree frozen leaves for the first time by using the Faster RCNN network, and the specific steps are as follows:
s31: dividing the frozen leaves of the tea trees into three grades to prepare an input data set;
s32: dividing an input data set into a training set and a test set according to a ratio of 9: 1;
s33: inputting the training set into different neural networks, and extracting the freeze injury characteristics of the tea tree freeze injury leaves;
s34: comparing results of different neural networks under the same learning rate, selecting an optimal feature extraction network and inputting the optimal feature extraction network into a fast RCNN network;
s35: inputting the training set in the step S32 into the optimal Faster RCNN network in the step S34, and training at different learning rates;
s36: according to step S35, the best learning rate is found, and then the test set in step S32 is input into the Faster RCNN network to verify the classification result.
Further, the neural network includes AlexNet, VGG19, and Resnet50, which have total training times of 5 hours, 7 hours, and 23 hours, respectively.
Further, the maximum iteration, the learning rate, and the learning rate reduction factor selected in step S36 are 20, 0.0001, and 0.001, respectively.
Further, the specific step of step S4 is:
s41: extracting the freeze injury characteristics of the segmented freeze injury image through HOG;
s42: and (4) putting the images with the extracted features into an SVM network for one-to-one classification.
Further, in step S5, a first-grade freeze-injured leaf scores 0.5, a second-grade freeze-injured leaf scores 0.3, and a third-grade freeze-injured leaf scores 0.2; the final score is obtained by multiplying the number of leaves in each grade by the corresponding score and summing the products.
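For example, under this weighting a tea plant with 10 first-grade, 5 second-grade and 2 third-grade freeze-injured leaves (counts chosen only for illustration, not data from the patent) would receive a final score of 10 × 0.5 + 5 × 0.3 + 2 × 0.2 = 6.9, which is then mapped to an overall freezing injury grade.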
The invention further provides a tea tree freezing injury evaluation system based on computer vision, comprising:
the acquisition system comprises an RGB camera and is used for acquiring a tea tree leaf freeze injury image in winter;
a processing system to perform the following operations: inputting the collected freezing damage images into a neural network for deep learning and machine learning, identifying and grading tea tree leaves, and automatically grading the whole tea tree according to the freezing damage leaves;
and the analysis system is used for performing overall evaluation on the freezing injury condition of the tea trees according to the score of the processing system and guiding the overwintering and pruning management of the tea trees after the overwintering.
Compared with the prior art, the invention has the advantages and beneficial effects that:
1. the method realizes the integral evaluation of the tea tree freezing injury condition through deep learning and machine learning, solves the problems of strong subjectivity and long time consumption of manual observation and evaluation of the tea garden freezing injury, realizes the automatic analysis and evaluation of computer vision, and has important significance for guiding the overwintering and the pruning management of the tea garden after the overwintering.
2. The freeze-injured tea tree leaves are identified and segmented by the double-layer neural network; compared with a single-layer algorithm the model accuracy is improved, and the approach provides a reference for image analysis in the field of computer vision.
Drawings
FIG. 1 is a flow chart of a tea tree freezing injury assessment method based on computer vision according to the invention.
FIG. 2 is a labeled tea tree leaf freeze injury image.
FIG. 3 is the general framework of the first-layer algorithm, Faster RCNN.
FIG. 4 is a schematic diagram of the principle of the second-layer algorithm, SVM.
FIG. 5 shows the identification and classification of freeze-injured tea tree leaves.
FIG. 6 shows the overall freezing injury assessment of a tea plant (grade four).
Detailed Description
The technical solution of the present invention will be further described in detail with reference to the following specific examples.
Example 1
A tea tree freezing injury assessment method based on computer vision, whose flow chart is shown in FIG. 1, comprises the following steps:
Firstly, tea tree freezing injury image acquisition:
In December 2020, images were collected at the Tea Research Institute in Rizhao City, Shandong Province (35°40′ N, 119°33′ E, altitude 23 m). Tea canopy images were acquired with a Canon EOS 6D camera and stored in JPEG format at a resolution of 3024 × 4032; the shooting angle and distance were random.
Secondly, the obtained tea tree freezing injury images are preprocessed.
First, the freezing injury degree of the tea trees is graded as mild, moderate or severe; Table 1 gives the grading of the degree of freezing injury. The resolution of the training images is then adjusted to 1800 × 1800, and 640 enhanced photographs are labeled with the Image Labeler (version 9.2) in MATLAB software to obtain a training set; the first, second and third grades are marked in red, orange and blue, respectively, and FIG. 2 shows a labeled freeze injury image. Finally, the label data are stored in .mat format, converted into a table data set format, and input into the neural network.
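By way of illustration only, this preprocessing step could be sketched in MATLAB as follows; the folder layout, file names and variable names are assumptions, and the ground-truth conversion uses the standard objectDetectorTrainingData helper rather than any script from the patent:

```matlab
% Resize the raw canopy photos to the 1800 x 1800 training resolution.
rawDir = 'raw_images';                 % assumed folder of 3024 x 4032 JPEG photos
outDir = 'resized_images';             % assumed output folder
files  = dir(fullfile(rawDir, '*.jpg'));
for k = 1:numel(files)
    I = imread(fullfile(rawDir, files(k).name));
    I = imresize(I, [1800 1800]);      % target resolution given in the text
    imwrite(I, fullfile(outDir, files(k).name));
end

% Convert ground truth exported from the Image Labeler app (saved as a .mat file)
% into the table format expected by the detector training function.
S = load('freezeInjuryLabels.mat');                   % assumed file name
trainingData = objectDetectorTrainingData(S.gTruth);  % table: image file + boxes per grade
```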
Table 1: grading condition of degree of frozen injury of tea tree
Thirdly, the freeze-injured tea tree leaves are segmented and graded for the first time by a Faster RCNN network:
(1) the leaves of tea tree frozen injury are divided into three grades to prepare an input data set.
(2) The input data set is divided into a training set and a test set according to a 9:1 ratio.
(3) And inputting the training set into three different neural networks AlexNet, VGG19 and Resnet50, and extracting the freeze injury characteristics of the tea leaves. Table 2 lists these three feature extractors, including their number of layers and performance.
Table 2: characteristics of depth feature extractor and its performance in ImageNet challenges
(4) The recognition results of the three neural networks are compared under the same learning rate, and the best-performing one is selected as the feature extraction network for Faster RCNN. The results are shown in Table 3: the Resnet50 network recognizes better than AlexNet and VGG19. FIG. 5 shows the grading effect of the Resnet50-based network.
Table 3: accuracy based on three different network fast RCNN models
(5) The training set is input into the optimal Faster RCNN and trained at different learning rates. FIG. 3 shows the overall framework of Faster RCNN.
(6) The finally selected maximum number of iterations, learning rate and learning rate reduction factor are 20, 0.0001 and 0.001, respectively. The total training time for AlexNet, Resnet50 and VGG19 was 5, 7 and 23 hours, respectively. The test set is then input into the Faster RCNN to verify the classification results. The accuracy of the model is evaluated by precision and recall, given by equations (1) and (2):
the method comprises the steps of determining the number of positive samples in a positive sample, determining the number of negative samples in a negative sample, and determining the number of positive samples in a negative sample.
Step four, the segmented pictures are input into an SVM network for secondary grading:
inputting the image captured by the first layer algorithm into a second layer algorithm SVM network, specifically, extracting the freeze injury characteristics of the image through HOG, and then putting the image after the characteristics are extracted into the SVM network for one-to-one classification. Fig. 4 shows a principle diagram of one-to-one classification by SVM. The results are shown in table 4, the SVM algorithm can better classify the frozen leaves, the accuracy is 93.10%, and the total accuracy is 97.52%.
Table 4: SVM neural network model
Step five, the overall freezing injury degree of the tea tree is evaluated:
and according to the third step and the fourth step, obtaining the grading condition score of the frozen leaves of the tea trees. And then inputting the image of the whole pearl tea tree into a model obtained by a double-layer algorithm, and grading the freezing injury degree of the whole pearl tea tree. The classification method is shown in formula (3):
wherein X1 is the number of the first stage freeze injury leaves, X2 is the number of the second stage freeze injury leaves, and X3 is the number of the third stage freeze injury leaves.
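A short MATLAB sketch of this scoring step follows, with illustrative leaf counts; the mapping from score to overall grade depends on the Table 5 thresholds, which are not reproduced here, so placeholder boundaries are used:

```matlab
% Counts of freeze-injured leaves per grade returned by the double-layer model
% (illustrative values, not data from the patent).
x1 = 12;   % first-grade freeze-injured leaves
x2 = 7;    % second-grade freeze-injured leaves
x3 = 3;    % third-grade freeze-injured leaves

% Formula (3): weighted sum of the per-grade leaf counts.
score = 0.5 * x1 + 0.3 * x2 + 0.2 * x3;    % 6.0 + 2.1 + 0.6 = 8.7
fprintf('Whole-plant freeze injury score: %.1f\n', score);

% Compare the score with the Table 5 grade boundaries (placeholder values here)
% to assign an overall freezing injury grade to the tea plant.
thresholds = [5 10 20];                    % assumed boundaries, not from the patent
grade = find(score <= [thresholds inf], 1);
fprintf('Assigned overall grade (with placeholder thresholds): %d\n', grade);
```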
Table 5 shows the freeze injury grading criteria for tea trees. FIG. 6 is a freeze injury image of a whole tea plant.
Table 5: tea tree freezing injury grading standard
The invention relates to a method for evaluating the freezing injury degree of tea trees in a tea garden based on computer vision and for guiding overwintering protection and post-winter pruning of the tea trees. Traditional evaluation of the freezing injury degree relies on manual inspection and experience, so misjudgment occurs easily and detection efficiency is low. The method therefore combines RGB images with deep learning and machine learning for tea garden freezing injury evaluation, so as to improve tea garden management efficiency.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions.
Claims (9)
1. A tea tree freezing injury assessment method based on computer vision is characterized by comprising the following steps:
s1: collecting a tea tree leaf freeze injury image;
s2: preprocessing the tea tree leaf freeze injury image in the step S1;
s3: segmenting and grading tea tree frozen injury leaves for the first time by combining the preprocessed frozen injury images;
s4: carrying out secondary grading on the segmented freezing injury image;
s5: and combining the grading results of the steps S3 and S4 to obtain the grading conditions of the freeze injury leaves of the tea trees in different grades, and evaluating the whole freeze injury degree of the tea trees according to the grading conditions.
2. The tea tree freezing injury assessment method based on computer vision according to claim 1, wherein the specific steps of preprocessing the tea tree leaf freezing injury image in the step S2 are as follows:
s21: grading the freezing damage degree of the tea trees by using the tea tree leaf freezing damage image;
s22: adjusting the resolution of the tea tree leaf freeze injury image;
s23: utilizing MATLAB software to mark the adjusted freezing injury image to obtain a training set;
s24: and storing the marked tag data in a mat format, converting the tag data into a table data set format, and inputting the table data set format into the neural network.
3. The tea tree freezing injury assessment method based on computer vision according to claim 1, wherein in step S22 the resolution of the tea tree leaf freezing injury image is adjusted to 1800 × 1800.
4. The tea tree freezing injury assessment method based on computer vision according to claim 1, wherein in step S3 the freeze-injured tea tree leaves are segmented and graded for the first time by a Faster RCNN network, comprising the following steps:
s31: dividing the frozen leaves of the tea trees into three grades to prepare an input data set;
s32: dividing an input data set into a training set and a test set according to a ratio of 9: 1;
s33: inputting the training set into different neural networks, and extracting the freeze injury characteristics of the tea tree freeze injury leaves;
s34: comparing results of different neural networks under the same learning rate, selecting an optimal feature extraction network and inputting the optimal feature extraction network into a fast RCNN network;
s35: inputting the training set in the step S32 into the optimal Faster RCNN network in the step S34, and training at different learning rates;
s36: according to step S35, the best learning rate is found, and then the test set in step S32 is input into the Faster RCNN network to verify the classification result.
5. The computer vision-based tea tree freezing injury assessment method according to claim 4, wherein said neural network comprises AlexNet, VGG19 and Resnet50, which have total training time of 5 hours, 7 hours and 23 hours, respectively.
6. The computer vision based tea tree freezing injury assessment method according to claim 4, wherein the maximum iteration, learning rate and learning rate reduction factors selected in the step S36 are 20, 0.0001 and 0.001 respectively.
7. The tea tree freezing injury assessment method based on computer vision as claimed in claim 1, wherein the specific steps of step S4 are:
s41: extracting the freeze injury characteristics of the segmented freeze injury image through HOG;
s42: and (4) putting the images with the extracted features into an SVM network for one-to-one classification.
8. The computer vision based tea tree freezing injury assessment method according to claim 1, wherein in step S5 a first-grade freeze-injured leaf scores 0.5, a second-grade freeze-injured leaf scores 0.3 and a third-grade freeze-injured leaf scores 0.2, and the final score is obtained by multiplying the number of leaves in each grade by the corresponding score and summing the products.
9. A tea tree freeze injury evaluation system based on computer vision is characterized by comprising:
the acquisition system comprises an RGB camera and is used for acquiring a tea tree leaf freeze injury image in winter;
a processing system to perform the following operations: inputting the collected freezing damage images into a neural network for deep learning and machine learning, identifying and grading tea tree leaves, and automatically grading the whole tea tree according to the freezing damage leaves;
and the analysis system is used for performing overall evaluation on the freezing injury condition of the tea trees according to the score of the processing system and guiding the overwintering and pruning management of the tea trees after the overwintering.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111555624.6A CN114170518A (en) | 2021-12-17 | 2021-12-17 | Tea tree freezing injury assessment method and system based on computer vision |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111555624.6A CN114170518A (en) | 2021-12-17 | 2021-12-17 | Tea tree freezing injury assessment method and system based on computer vision |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114170518A true CN114170518A (en) | 2022-03-11 |
Family
ID=80487332
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111555624.6A Pending CN114170518A (en) | 2021-12-17 | 2021-12-17 | Tea tree freezing injury assessment method and system based on computer vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114170518A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115061407A (en) * | 2022-07-01 | 2022-09-16 | 西北农林科技大学 | Frost early warning prevention and control system and method based on Internet of things |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106203506A (en) * | 2016-07-11 | 2016-12-07 | 上海凌科智能科技有限公司 | A kind of pedestrian detection method based on degree of depth learning art |
CN108364006A (en) * | 2018-01-17 | 2018-08-03 | 超凡影像科技股份有限公司 | Medical Images Classification device and its construction method based on multi-mode deep learning |
CN109801256A (en) * | 2018-12-15 | 2019-05-24 | 华南理工大学 | A kind of image aesthetic quality appraisal procedure based on area-of-interest and global characteristics |
CN110232406A (en) * | 2019-05-28 | 2019-09-13 | 厦门大学 | A kind of liquid crystal display panel CF image identification method based on statistical learning |
CN111583198A (en) * | 2020-04-23 | 2020-08-25 | 浙江大学 | Insulator picture defect detection method combining FasterR-CNN + ResNet101+ FPN |
CN112633212A (en) * | 2020-12-29 | 2021-04-09 | 长沙湘丰智能装备股份有限公司 | Tea tender shoot grade identification and classification method based on computer vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||