CN113343886A - Tea leaf identification grading method based on improved capsule network - Google Patents

Tea leaf identification grading method based on improved capsule network

Info

Publication number
CN113343886A
Authority
CN
China
Prior art keywords
tea
capsule network
capsule
method based
improved
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110695923.3A
Other languages
Chinese (zh)
Inventor
黄海松
陈星燃
胡鹏飞
范青松
韩正功
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guizhou University
Original Assignee
Guizhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guizhou University filed Critical Guizhou University
Priority to CN202110695923.3A
Publication of CN113343886A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Abstract

The invention discloses a tea leaf identification and grading method based on an improved capsule network. The method comprises: dividing tea leaves of known varieties and grades into a plurality of units, placing the units in turn on a white bottom plate, and acquiring n tea leaf images of each category with a camera; applying Otsu-based binarization and several data enhancement operations to the tea image data, and establishing a tea image database according to the tea grade label; deepening the original capsule network structure and combining it with residual blocks to construct an improved capsule network model suited to tea grading; training the improved capsule network model on the data in the tea image database to obtain a tea grading model; and spreading the tea leaves to be graded on the white bottom plate, capturing image data to be recognized with the camera, and inputting it into the trained tea grading model to obtain the tea grade recognition result. The method improves the model's ability to handle small-sample data sets and reduces risks such as overfitting.

Description

Tea leaf identification grading method based on improved capsule network
Technical Field
The invention relates to the technical field of image recognition, in particular to a tea leaf recognition grading method based on an improved capsule network.
Background
At present, tea quality grading relies mainly on manual sensory judgment of the color and texture of the tea, so the grading is easily influenced by factors such as the grader's physiological condition, working experience and environmental conditions. Manual grading is also inefficient and its results are unstable, making it unsuited to today's mechanized, large-scale tea production.
In recent years, with the development of computer and deep learning technology, researchers at home and abroad have carried out extensive research on identifying and grading tea with deep learning.
Most previous studies perform tea image recognition with convolutional neural networks and suffer from the following problems: (1) a convolutional network is translation-invariant and can ignore the spatial relationships among features when processing a tea image, so feature information is lost; (2) to train the translation invariance of a convolutional neural network, different filters must learn each viewing angle of the tea image, which requires a large amount of data. Tea image acquisition, however, is demanding and the data amount is often insufficient, so traditional convolutional neural networks are not robust on small data and are prone to phenomena such as overfitting.
Disclosure of Invention
This section is for the purpose of summarizing some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. In this section, as well as in the abstract and the title of the invention of this application, simplifications or omissions may be made to avoid obscuring the purpose of the section, the abstract and the title, and such simplifications or omissions are not intended to limit the scope of the invention.
The present invention has been made in view of the above-mentioned conventional problems.
Therefore, the technical problems solved by the invention are as follows: a traditional convolutional neural network transfers features in scalar form, training its translation invariance requires a large data set, and the model is not robust on small data; the size of an ordinary convolutional layer is fixed, so receptive fields of different sizes cannot be read simultaneously; and traditional convolutional neural networks rarely concentrate learning on hard-to-classify features.
In order to solve the above technical problems, the invention provides the following technical scheme: dividing tea leaves of known varieties and grades into a plurality of units, placing the units in turn on a white bottom plate, and acquiring n tea leaf images of each category with a camera; applying Otsu-based binarization and several data enhancement operations to the tea image data, and establishing a tea image database according to the tea grade label; deepening the original capsule network structure and combining it with residual blocks to construct an improved capsule network model suited to tea grading; training the improved capsule network model on the data in the tea image database to obtain a tea grading model; and spreading the tea leaves to be graded on the white bottom plate, capturing image data to be recognized with the camera, and inputting it into the trained tea grading model to obtain the tea grade recognition result.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: when tea sample images are collected, A4 paper is used as the white bottom plate, a 3 W LED light source is selected, the distance between the camera lens and the tea sample is kept at 17 cm, and the tea sample occupies 3/4 of the camera frame during shooting.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: the binarization processing comprises converting the collected tea image data from an RGB image into a grayscale image, binarizing the tea image by the Otsu method, and segmenting the tea from the bottom plate by square cropping.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: the multiple data enhancement processing comprises applying, in sequence, random rotation, random cropping, random horizontal flipping and feature standardization to the segmented tea images.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: the method comprises the steps of building a single residual block network structure, a multi-size feature extraction convolution layer and an original capsule network structure.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: the residual block, the multi-size feature extraction convolution layer, two ordinary convolution layers and three pooling layers replace the single feature-extraction convolution layer in the original capsule network structure to obtain the improved capsule network.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: features of the tea image are extracted through the combination of three groups of residual blocks and several groups of convolution and pooling layers and passed into the primary capsule layer; the feature shape is adjusted by processing the feature channels with a two-dimensional convolution, converting the features from scalar form into vector-form capsules so that the local position information and the attribute information of the features are retained simultaneously; after this processing, the features are passed into the digital capsule layer.
As a preferred embodiment of the tea identification grading method based on the improved capsule network, the tea identification grading method of the invention comprises the following steps: in the digital capsule layer, each child capsule is multiplied by its corresponding weight matrix and the weighted sum is processed by a compression function to obtain the predicted value of the parent capsule; after the coupling coefficients are determined by a dynamic routing mechanism, the actual output of the parent capsule is obtained from the predicted values based on the coupling coefficients; the length of a capsule represents the probability that a certain category exists, and each dimension represents a certain characteristic of that category.
The invention has the beneficial effects that: by exploiting the capsule network's ability to extract and store features in vector form, the model possesses both translation invariance and translational equivariance, so it can extract more features than a convolutional neural network from a data set of the same size, which improves its ability to handle small-sample data sets and reduces risks such as overfitting; a residual block is introduced, whose cross-layer skip connection superimposes a nonlinear transformation of the input onto the output, so the flow of the gradient is not hindered and the model's resistance to vanishing gradients is improved; a combination of multilayer convolution and pooling together with multi-size feature extraction replaces the original single feature-extraction convolution layer in front of the primary capsule layer, deepening the network so the model can extract multi-size features from the tea image; and by introducing the Focal Loss function, gradient back-propagation lets the model concentrate its training on hard-to-classify samples.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise. Wherein:
fig. 1 is a schematic flow chart of a tea leaf identification grading method based on an improved capsule network according to a first embodiment of the invention;
fig. 2 is a schematic view of a gray scale process of a tea identification grading method based on an improved capsule network according to a first embodiment of the present invention;
fig. 3 is a schematic diagram of a binarization process of the tea identification grading method based on the improved capsule network according to the first embodiment of the invention;
fig. 4 is a schematic diagram of a square cut image of a tea leaf identification and classification method based on an improved capsule network according to a first embodiment of the present invention;
fig. 5 is a schematic diagram of the improved capsule network model of the tea leaf identification grading method based on the improved capsule network according to the first embodiment of the invention;
fig. 6 is a schematic diagram of a residual block of a tea identification grading method based on an improved capsule network according to a first embodiment of the invention;
fig. 7 is a schematic diagram of a multi-dimensional feature extraction convolution layer of a tea leaf identification and classification method based on an improved capsule network according to a first embodiment of the invention;
fig. 8 is a schematic diagram of the accuracy of the improved capsule network model of the tea leaf identification grading method based on the improved capsule network according to the second embodiment of the present invention;
fig. 9 is a schematic diagram of the loss value of the improved capsule network model of the tea identification classification method based on the improved capsule network according to the second embodiment of the present invention;
fig. 10 is a diagram comparing the accuracy of the improved capsule network model of the tea leaf identification grading method based on the improved capsule network according to the second embodiment of the present invention with that of the conventional methods;
fig. 11 is a graph showing a loss value comparison between an improved capsule network model and a conventional method of a tea leaf identification and classification method based on an improved capsule network according to a second embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, specific embodiments accompanied with figures are described in detail below, and it is apparent that the described embodiments are a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present invention, shall fall within the protection scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those specifically described and will be readily apparent to those of ordinary skill in the art without departing from the spirit of the present invention, and therefore the present invention is not limited to the specific embodiments disclosed below.
Furthermore, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
The present invention will be described in detail with reference to the drawings, wherein the cross-sectional views illustrating the structure of the device are not enlarged partially in general scale for convenience of illustration, and the drawings are only exemplary and should not be construed as limiting the scope of the present invention. In addition, the three-dimensional dimensions of length, width and depth should be included in the actual fabrication.
Meanwhile, in the description of the present invention, it should be noted that the terms "upper, lower, inner and outer" and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation and operate, and thus, cannot be construed as limiting the present invention. Furthermore, the terms first, second, or third are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The terms "mounted, connected and connected" in the present invention are to be understood broadly, unless otherwise explicitly specified or limited, for example: can be fixedly connected, detachably connected or integrally connected; they may be mechanically, electrically, or directly connected, or indirectly connected through intervening media, or may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Example 1
Referring to fig. 1 to 7, for a first embodiment of the present invention, there is provided a tea leaf identification grading method based on an improved capsule network, including:
s1: the tea leaves of known varieties and grades are divided into a plurality of units, the units are sequentially placed on a white bottom plate, and n pieces of tea leaf image data of each category are acquired through a camera. Wherein, it is required to be noted that:
when tea sample images are collected, A4 paper is used as the white bottom plate, a 3 W LED light source is selected, the distance between the camera lens and the tea sample is kept at 17 cm, and the tea sample occupies 3/4 of the camera frame during shooting.
S2: and performing binarization processing and various data enhancement processing on the tea image data based on the Otsu method, and establishing a tea image database according to the tea grade label. The steps to be explained are as follows:
the binarization processing comprises the steps of converting collected tea image data from an RBG image into a gray image, binarizing the tea image by adopting an Otsu method, and cutting the tea and the bottom plate by square cutting;
the multiple data enhancement processing comprises the steps of carrying out random rotation, random cutting, random horizontal turning and characteristic standardization operation on the segmented tea images in sequence.
S3: deepening the original capsule network structure and combining with the residual block to construct an improved capsule network model suitable for tea grading. Among them, it is also to be noted that:
building a single residual block network structure, a multi-size feature extraction convolution layer and an original capsule network structure;
the improved capsule network is obtained by replacing the single convolution layer for feature extraction in the original capsule network structure with a residual block, a multi-size feature extraction convolution layer, two common convolution layers and three pooling layers.
S4: importing data from the tea image database into the improved capsule network model for training to obtain a tea grading model. What should be further explained in this step is:
extracting features of the tea image through the combination of three groups of residual blocks and several groups of convolution and pooling layers, and passing the features into the primary capsule layer;
adjusting the feature shape by processing the feature channels with a two-dimensional convolution, converting the features from scalar form into vector-form capsules so as to retain both the local position information and the attribute information of the features;
after this processing, passing the features into the digital capsule layer.
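The scalar-to-vector conversion and the compression function can be sketched as follows (NumPy; the 8-dimensional capsule size follows the original capsule network paper and is an assumption here, as the patent does not state the dimension):

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    """Capsule squash nonlinearity:
        squash(s) = (|s|^2 / (1 + |s|^2)) * s / |s|
    Shrinks short vectors toward 0 and long ones toward unit length,
    so a capsule's length can be read as a class-existence probability."""
    sq_norm = np.sum(v ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm) / np.sqrt(sq_norm + eps)
    return scale * v

def to_primary_capsules(feature_map, capsule_dim=8):
    """Reshape an (N, C, H, W) scalar feature map into (N, num_caps,
    capsule_dim) vector capsules, then squash -- a sketch of the
    primary-capsule step."""
    n, c, h, w = feature_map.shape
    caps = feature_map.reshape(n, -1, capsule_dim)
    return squash(caps)
```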
Specifically, the digital capsule layer includes:
each layer of child capsules is multiplied by its corresponding weight matrix and the weighted sum is processed by a compression function to obtain the predicted value of the parent capsule;
after the coupling coefficients are determined by the dynamic routing mechanism, the actual output of the parent capsule is obtained from the predicted values based on the coupling coefficients;
wherein the length of a capsule represents the probability that a certain category exists, and each dimension represents a certain characteristic of that category.
The routing mechanism is as follows:

c_ij = softmax(b_ij) = exp(b_ij) / Σ_k exp(b_ik)

s_j = Σ_i c_ij · û_(j|i),  v_j = Squash(s_j)

b_ij ← b_ij + a_ij,  a_ij = û_(j|i) · v_j

wherein a_ij is the agreement (scalar product) between the prediction û_(j|i), i.e. the output that child capsule i predicts for parent capsule j, and the actual output v_j of parent capsule j; the coupling coefficients c_ij, the scores the capsule gives each prediction, are determined from the routing logits b_ij by a softmax function; and Squash is the compression function, which plays a normalizing role.
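The routing update can be sketched end-to-end (a NumPy illustration; shapes are arbitrary, and the 3 iterations match the routing setting used later in the test environment):

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def squash(s, axis=-1, eps=1e-8):
    n2 = np.sum(s ** 2, axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * s / np.sqrt(n2 + eps)

def dynamic_routing(u_hat, iters=3):
    """Routing-by-agreement between child predictions and parent capsules.

    u_hat: (num_child, num_parent, dim) predictions u_hat_(j|i).
    Returns parent outputs v of shape (num_parent, dim).
    """
    b = np.zeros(u_hat.shape[:2])               # routing logits b_ij
    for _ in range(iters):
        c = softmax(b, axis=1)                  # coupling coefficients c_ij
        s = (c[..., None] * u_hat).sum(axis=0)  # weighted sum over children
        v = squash(s)                           # parent outputs v_j
        b = b + (u_hat * v[None]).sum(-1)       # agreement a_ij = u_hat . v
    return v
```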
Updating the model parameters with the Focal Loss function and an Adam optimizer;
the Focal Loss function is expressed as follows:

FL(p_t) = -(1 - p_t)^γ · log(p_t)

wherein p_t is the predicted probability that the sample belongs to its true class, γ is the focusing parameter, and (1 - p_t)^γ is called the modulating factor; during gradient back-propagation it reduces the weight given to easily classified samples so that learning concentrates on the hard-to-classify samples;
and finishing training when the Focal Loss value and the accuracy of the improved capsule network model tend to be stable.
S5: spreading the tea leaves to be graded on the white bottom plate, capturing image data to be recognized with the camera, and inputting it into the trained tea grading model to obtain the tea grade recognition result.
Preferably, it should also be noted for this embodiment that a traditional convolutional neural network transfers features in scalar form, so training its translation invariance requires a large data set and its robustness on small data is often poor; the size of an ordinary convolutional layer is fixed, so receptive fields of different sizes cannot be read simultaneously; and traditional convolutional neural networks rarely concentrate learning on hard-to-classify features.
The capsule network transmits features in vector form, giving the network both translation invariance and translational equivariance; on a data set of the same size it therefore extracts more features and is better at handling small samples. Incorporating the residual idea, the cross-layer skip connections improve the model's resistance to vanishing gradients. A multi-size feature extraction convolution layer is also introduced.
Tea features of different grades differ only slightly, so performing multi-scale feature extraction on the tea image fuses features from receptive fields of different sizes and improves the model's ability to distinguish tea of different grades. By introducing the Focal Loss function, the model reduces the weight of easily classified samples during gradient back-propagation and concentrates its learning on the hard-to-classify samples.
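The Focal Loss described above can be sketched in a few lines (a NumPy illustration; the function name and the default γ = 2.0 are assumptions, since the patent does not state the γ it uses):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, eps=1e-8):
    """Multi-class focal loss FL(p_t) = -(1 - p_t)^gamma * log(p_t).

    probs: (N, num_classes) predicted class probabilities,
    labels: (N,) integer class labels.
    The modulating factor (1 - p_t)^gamma vanishes for confident,
    easily classified samples, so the loss concentrates on hard ones.
    """
    p_t = probs[np.arange(len(labels)), labels]
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t + eps)))
```

With γ = 0 the modulating factor is 1 and the expression reduces to the ordinary cross-entropy.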
Example 2
Referring to fig. 8-11, a second embodiment of the present invention is shown, which is an experimental result comparison of the improved capsule network and other conventional techniques selected for the tea leaf image test set.
The convolutional neural network is the method traditionally adopted in deep learning for tea identification and grading. It transmits features through the network in scalar form, and training the translation invariance required for classification needs a large data set; since tea images are difficult to acquire and collect, convolutional neural networks show low classification accuracy in tea identification and grading, and model overfitting is severe.
Compared with the traditional methods, the present method is more resistant to model overfitting and better at processing small-sample data sets. In this example, the identification and grading performance of the VGG16, VGG19 and ResNet50 traditional convolutional neural networks on the tea image data set is compared with that of the present method.
Test environment: all networks are built with Python 3.6 and implemented in the PyTorch deep learning framework, accelerated on an NVIDIA GeForce RTX 3060 graphics card. In the training stage, epoch is uniformly set to 100, the number of routing iterations to 3, batch_size to 30, the learning rate to 0.001 and the momentum to 0.9, and a stochastic gradient descent optimizer is used; in the testing stage, epoch is set to 20 and the other conditions are kept unchanged.
A data set is established from the collected tea images, the tea images are graded with the present method and with the three groups of traditional convolutional neural networks respectively, and the accuracy and Focal Loss values obtained in the test stage are recorded and plotted as the corresponding curves.
In the testing stage, the accuracy of the present method rises rapidly in the initial phase and reaches a high classification accuracy sooner, and its accuracy curve lies clearly above the curves of the other convolutional neural network methods, which shows that the method is more resistant to model overfitting and better at handling small-sample data sets.
Over the whole test, the method reaches an accuracy of 98.3% with a loss value of 0.03392, the highest classification accuracy and the best model robustness among all the models, demonstrating the performance and soundness of the method.
It should be noted that the above-mentioned embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (8)

1. A tea leaf identification grading method based on an improved capsule network, characterized by comprising the following steps:
dividing tea leaves of known varieties and grades into a plurality of units, placing the units in turn on a white bottom plate, and acquiring n tea leaf images of each category through a camera;
performing Otsu-based binarization processing and several data enhancement operations on the tea image data, and establishing a tea image database according to the tea grade label;
deepening the original capsule network structure and combining it with residual blocks to construct an improved capsule network model suitable for tea grading;
importing data from the tea image database into the improved capsule network model for training to obtain a tea grading model;
and spreading the tea leaves to be graded on the white bottom plate, capturing image data to be recognized with the camera, and inputting it into the trained tea grading model to obtain the tea grade recognition result.
2. The tea identification grading method based on improved capsule network as claimed in claim 1, wherein: when tea sample images are collected, A4 paper is used as the white bottom plate, a 3 W LED light source is selected, the distance between the camera lens and the tea sample is kept at 17 cm, and the tea sample occupies 3/4 of the camera frame during shooting.
3. The tea identification grading method based on improved capsule network according to claim 1 or 2, characterized in that: the binarization processing comprises converting the collected tea image data from an RGB image into a grayscale image, binarizing the tea image by the Otsu method, and segmenting the tea from the bottom plate by square cropping.
4. The tea identification grading method based on improved capsule network according to claim 3, characterized in that: the multiple data enhancement processing comprises applying, in sequence, random rotation, random cropping, random horizontal flipping and feature standardization to the segmented tea images.
5. The tea identification grading method based on improved capsule network according to claim 4, characterized in that: the method comprises the steps of building a single residual block network structure, a multi-size feature extraction convolution layer and an original capsule network structure.
6. The tea identification grading method based on improved capsule network according to claim 5, characterized in that: the residual block, the multi-size feature extraction convolution layer, two ordinary convolution layers and three pooling layers replace the single feature-extraction convolution layer in the original capsule network structure to obtain the improved capsule network.
7. The tea identification and grading method based on an improved capsule network according to claim 6, characterized by comprising the steps of:
extracting features from the tea image through the combination of the three groups of residual blocks and several groups of convolution and pooling layers, and passing the extracted features into the primary capsule layer;
adjusting the feature shape through the channels of a two-dimensional convolution so as to convert the features from scalar form into vector-form capsules, thereby simultaneously preserving the local position information and the attribute information of the features;
and after this processing is completed, passing the features into the digital capsule layer.
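The scalar-to-vector conversion in the primary capsule layer of claim 7 amounts to regrouping the channel dimension of the convolutional feature maps into short vectors. A minimal NumPy sketch, assuming a channels-first layout and an 8-dimensional capsule (both illustrative choices):

```python
import numpy as np

def to_primary_capsules(feat: np.ndarray, capsule_dim: int = 8) -> np.ndarray:
    """Reshape scalar feature maps of shape (C, H, W) into vector-form
    capsules of length `capsule_dim`, so each capsule keeps both its
    spatial position (which map location it came from) and its attribute
    values (the vector components)."""
    c, h, w = feat.shape
    assert c % capsule_dim == 0, "channels must divide evenly into capsules"
    # (C, H, W) -> (C//D, D, H, W) -> (C//D, H, W, D) -> (C//D * H * W, D)
    caps = feat.reshape(c // capsule_dim, capsule_dim, h, w)
    return caps.transpose(0, 2, 3, 1).reshape(-1, capsule_dim)
```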
8. The tea identification and grading method based on an improved capsule network as claimed in claim 7, wherein the processing in the digital capsule layer comprises:
weighting and summing each layer of child capsules with the corresponding weight matrices, and processing the result with a compression (squashing) function to obtain the predicted value of the parent capsule;
after the coupling coefficients are determined by the dynamic routing mechanism, obtaining the actual output of the parent capsule from the predicted values based on the coupling coefficients;
wherein the length of a capsule represents the probability that a certain category exists, and each dimension of the capsule represents a certain feature of that category.
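The squashing function and dynamic routing described in claim 8 follow the standard capsule-network formulation. A self-contained NumPy sketch of that mechanism (the patent gives no code; shapes, the three-iteration count, and the softmax-over-parents convention are assumptions taken from the usual formulation):

```python
import numpy as np

def squash(v: np.ndarray, axis: int = -1) -> np.ndarray:
    """Non-linear compression: keeps a capsule's direction but maps its
    length into [0, 1), so length can act as an existence probability."""
    n2 = (v ** 2).sum(axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * v / np.sqrt(n2 + 1e-9)

def dynamic_routing(u_hat: np.ndarray, iters: int = 3) -> np.ndarray:
    """u_hat: (num_child, num_parent, dim) prediction vectors, i.e. each
    child capsule already multiplied by its weight matrix.
    Returns the actual parent outputs of shape (num_parent, dim)."""
    b = np.zeros(u_hat.shape[:2])                             # routing logits
    for _ in range(iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)                # coefficient-weighted sum
        v = squash(s)                                         # compressed parent output
        b = b + (u_hat * v[None]).sum(axis=-1)                # agreement update
    return v
```

Because every output passes through `squash`, each parent capsule's length stays below 1, matching the claim's reading of length as the likelihood that a category exists.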
CN202110695923.3A 2021-06-23 2021-06-23 Tea leaf identification grading method based on improved capsule network Pending CN113343886A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110695923.3A CN113343886A (en) 2021-06-23 2021-06-23 Tea leaf identification grading method based on improved capsule network


Publications (1)

Publication Number Publication Date
CN113343886A true CN113343886A (en) 2021-09-03

Family

ID=77478021


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114283303A (en) * 2021-12-14 2022-04-05 贵州大学 Tea leaf classification method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110009097A (en) * 2019-04-17 2019-07-12 电子科技大学 The image classification method of capsule residual error neural network, capsule residual error neural network
CN110533004A (en) * 2019-09-07 2019-12-03 哈尔滨理工大学 A kind of complex scene face identification system based on deep learning
CN111145174A (en) * 2020-01-02 2020-05-12 南京邮电大学 3D target detection method for point cloud screening based on image semantic features
CN111241958A (en) * 2020-01-06 2020-06-05 电子科技大学 Video image identification method based on residual error-capsule network
CN111414971A (en) * 2020-03-27 2020-07-14 南京工业大学 Finished product tea type and grade identification method based on convolutional neural network
CN112651025A (en) * 2021-01-20 2021-04-13 广东工业大学 Webshell detection method based on character-level embedded code
CN112869711A (en) * 2021-01-19 2021-06-01 华南理工大学 Automatic sleep staging and migration method based on deep neural network



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination