CN110070069A - A tea classification method based on automatic feature extraction with convolutional neural networks - Google Patents

A tea classification method based on automatic feature extraction with convolutional neural networks

Info

Publication number
CN110070069A
CN110070069A CN201910362329.5A
Authority
CN
China
Prior art keywords
classification
convolutional neural
neural networks
tea
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910362329.5A
Other languages
Chinese (zh)
Inventor
仲元红
张靖怡
张顺
马心怡
张钊源
周昭坤
成欣雨
张静
黄关
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University filed Critical Chongqing University
Priority to CN201910362329.5A priority Critical patent/CN110070069A/en
Publication of CN110070069A publication Critical patent/CN110070069A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24137Distances to cluster centroïds
    • G06F18/2414Smoothing the distance, e.g. radial basis function networks [RBFN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/48Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/68Food, e.g. fruit or vegetables

Abstract

The invention discloses a tea classification method based on automatic feature extraction with convolutional neural networks, comprising the following steps: obtaining the tea data to be classified acquired by an electronic tongue sensor; converting the tea data to be classified into a time-frequency feature map; and inputting the time-frequency feature map into a convolutional neural network to obtain the tea classification result. By converting the tea data into time-frequency feature maps, the invention makes full use of the strength of convolutional neural networks in image processing and replaces manual feature selection with network parameter training, improving the efficiency of tea classification. Moreover, in contrast with the prior art, which treats feature extraction and classification as two independent structures, the invention integrates feature extraction and classification into the same framework, making the classification more accurate.

Description

A tea classification method based on automatic feature extraction with convolutional neural networks
Technical field
The present invention relates to the field of electronic tongues, and more particularly to a tea classification method based on automatic feature extraction with convolutional neural networks.
Background technique
Tea is one of the most popular beverages in the world, and tea tasting has a long history in China. Tea leaves contain ingredients such as theanine, choline, inositol and folic acid, which can help regulate human health, and their mellow, fragrant taste stimulates the taste buds. Many external factors influence the taste of tea: the amount of tea leaves, the brewing time, the water quality of the brewing equipment, the way the leaves are stored, and so on; the joint effect of these factors means that tea has no single fixed flavor. The composition of tea is extremely complex, the most important components being tea polyphenols, amino acids, alkaloids and other aromatic substances, so organic compounds of different chemical structures and concentrations also have a significant impact on tea quality. Tea classification has a wide range of applications: it plays an important role in assessing quality attributes such as the age and authenticity of tea leaves. In addition, the identification and classification of tea provide a feasible method for analyzing trace element content in tea, offer a safeguard for healthy tea drinking, and may well serve as a reference for the design of future intelligent tea kettles. Traditionally, tea quality is evaluated with different analytical instruments, such as high-performance liquid chromatography, gas chromatography and plasma atomic emission spectrometry. However, these methods require large numbers of technical staff, raw materials and sufficient financial support, which makes the assessment extremely inefficient. Electronic tongues based on sensor technology offer high precision, easy operation and fast detection, and their emergence has greatly improved the efficiency of tea quality testing. With the invention of electronic sensors such as taste sensors and electronic noses for gas analysis, devices such as electronic tongues have come to be widely used for the analysis of complex liquid samples. As a modern intelligent sensing instrument, an electronic tongue can be used to monitor the production cycle of beverages; it is simple, quick and inexpensive, and has great potential in beverage quality assessment.
Feature extraction is a key component of an electronic tongue system. Existing feature extraction methods are all set manually; not only are the features difficult to select, their stability is also poor. The lack of autonomous, efficient and accurate feature extraction methods has greatly limited the application and development of electronic tongue systems. To extract features more effectively, many researchers have tried to compress the sampled data with various mathematical transforms in order to obtain effective features. Some use a discrete wavelet transform (DWT) with a sliding window to extract the energy in different frequency bands as features; some extract features with the discrete cosine transform (DCT) and select certain coefficients as the characteristic values for tea classification; others convert the sampled data into a matrix, decompose it by singular value decomposition (SVD), and select several singular values as features for tea classification. Although these mathematical transform methods have brought some improvement in feature selection, the selected characteristics still have to be set manually, which keeps the efficiency low.
Therefore, how to improve the efficiency of tea classification with electronic tongues has become a problem that those skilled in the art urgently need to solve.
Summary of the invention
Accordingly, the problem to be solved by the present invention is how to improve the efficiency of tea classification with electronic tongues.
To solve the above technical problem, the present invention adopts the following technical solution:
A tea classification method based on automatic feature extraction with convolutional neural networks, comprising the following steps:
S1, obtaining the tea data to be classified acquired by the electronic tongue sensor;
S2, converting the tea data to be classified into a time-frequency feature map;
S3, inputting the time-frequency feature map into a convolutional neural network to obtain the tea classification result.
Preferably, the convolutional neural network comprises a feature extraction layer and a feature mapping and classification layer, and step S3 comprises:
S301, inputting the time-frequency feature map into the feature extraction layer to obtain the time-frequency map feature values;
S302, inputting the time-frequency map feature values into the feature mapping and classification layer to obtain the tea classification result.
Preferably, the tea data to be classified are converted into the time-frequency feature map using a short-time Fourier transform.
Preferably, the conversion is based on the formula:
Spectrogram{X(m, n)} = |X(m, n)|²
to convert the tea data to be classified into the time-frequency feature map.
Preferably, in converting the tea data to be classified into the time-frequency feature map with the short-time Fourier transform, a sliding window of length Ws is used, and consecutive windows partly overlap.
Preferably, the activation function of the convolutional neural network is a rectified linear function, and the pooling layer uses a max-pooling function.
In conclusion the invention discloses a kind of Classification of Tea method based on convolutional neural networks Automatic Feature Extraction, Include the following steps: the tealeaves data to be sorted for obtaining electronic tongue sensor acquisition;Tealeaves data to be sorted are converted into time-frequency Characteristic pattern;Time-frequency characteristics figure input convolutional neural networks are obtained into Classification of Tea information.The present invention turns tealeaves data to be sorted Be changed to time-frequency characteristics figure, to make full use of advantage of the convolutional neural networks in image procossing, with network parameter training instead of Manual feature selecting improves the efficiency of Classification of Tea;Also, in the prior art using feature extraction and Classification and Identification as two A independent structure is compared, and feature extraction and Classification and Identification are integrated into same framework by the present invention, keeps classifying quality more quasi- Really.
Brief description of the drawings
To make the purposes, technical solutions and advantages of the invention clearer, the present invention is described in further detail below with reference to the accompanying drawings, in which:
Fig. 1 is a flow chart of a specific embodiment of the tea classification method based on automatic feature extraction with convolutional neural networks disclosed by the invention;
Fig. 2 is a schematic diagram of the general structure of a voltammetric electronic tongue system;
Fig. 3 is a structural diagram of the method flow of the invention;
Fig. 4(a) shows the scanning voltages at different frequencies, and Fig. 4(b) shows typical response sample data;
Fig. 5 to Fig. 7 show the average classification accuracy of the Hanning, Hamming and Gaussian window functions after 100 iterations with 5-fold cross-validation during testing;
Fig. 8 to Fig. 13 respectively show the two-dimensional feature distributions of the original response, peak inflection points, the discrete wavelet transform, the discrete cosine transform, singular value decomposition, and the method of the invention.
Specific embodiments
The present invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the invention discloses a tea classification method based on automatic feature extraction with convolutional neural networks, comprising the following steps:
S1, obtaining the tea data to be classified acquired by the electronic tongue sensor;
S2, converting the tea data to be classified into a time-frequency feature map;
S3, inputting the time-frequency feature map into a convolutional neural network to obtain the tea classification result.
In general, electronic tongue systems can be divided into several types according to the data measurement method, such as potentiometry, voltammetry and amperometry. Different types of electronic tongues can use different types of working electrodes, such as bare electrodes, modified electrodes and biosensors. Typically, bare electrodes are made of gold, silver or palladium; modified electrodes use carbon paste treated with phthalate compounds or conductive polymers; and biosensors are carbon biocomposite or graphene electrodes containing enzymes and various metal catalysts. Although different types of electronic tongue systems collect data in different ways, their feature extraction and pattern recognition methods are similar. To verify the validity of the proposed feature extraction and classification method, the present invention collects data with the widely used voltammetric electronic tongue system.
Fig. 2 shows the general structure of a voltammetric electronic tongue system, which includes a controller, a multi-frequency pulse voltage generating circuit, a potentiostat circuit, a three-electrode module, a micro-current sensing circuit and a low-pass filter. The three-electrode module consists of a working electrode (WE), a reference electrode (RE) and a counter electrode (CE); it is the hardware core of the electronic tongue system. Different electrode materials have different response characteristics; Table 1 details the electrode sensor array used in the model. The core controller is an STM32 microcontroller, which drives an external circuit through DA conversion to generate different voltage waveforms such as multi-frequency pulse voltage waves. A micro-current sensing circuit connected to the WE converts the current into a voltage signal, after which a filter circuit removes clutter from the signal. Finally, the amplified voltage is digitized by AD conversion, so that the response signals from the different WEs can be sampled.
Table 1
In an electronic tongue system, feature extraction is a particularly important and relatively difficult part. Because the time-series signal corresponding to each sensor (the tea data to be classified) consists of a large number of measured values, evaluating tea quality with an electronic tongue system is a difficult task. We therefore first need to extract important and relevant features from the large data set, and then send them to a pattern classifier for classification.
Empirically, the features hidden in a time series include time-domain and transform-domain features. Time-domain features are relatively intuitive, for example the magnitude of the response signal and the positions of mathematical characteristic points. Transform-domain features are more complicated, such as frequency-domain features or features obtained after matrix decomposition. However, traditional methods can only extract features with manually selected filters, and the problem is that they adapt poorly to changing scenarios and have poor stability.
The key idea of the invention is to convert the time series into a time-frequency feature map, so as to make full use of the advantages of convolutional neural networks in image feature extraction. The structure of the method flow of the invention is shown in Fig. 3. First, the sensor measurements (the tea data to be classified) are converted into a time-frequency feature map; then a convolutional neural network automatically learns the details hidden in the time-frequency feature map.
In the present invention, the neural network uses a multi-layer filter structure, and the parameters of each filter are determined during training; in effect, each set of parameters can be interpreted as a feature extraction method. This automatic feature extraction can therefore be understood as a combination of multiple representative or unknown feature extraction methods, which allows the classification task to be completed better. More importantly, the invention unifies the feature extraction and classification steps under a single architecture, ensuring excellent performance in both. During training, a shallow convolutional neural network is obtained by setting kernels of different sizes, with the sampled signals of the different tea categories as input, and the network is trained with the back-propagation algorithm.
In conclusion tealeaves data to be sorted are converted to time-frequency characteristics figure by the present invention, to make full use of convolutional Neural net Advantage of the network in image procossing, is omitted manual feature selecting and network parameter training process, improves the effect of Classification of Tea Rate;Also, in the prior art using feature extraction and Classification and Identification as two independent structures compared with, the present invention proposes feature It takes and is integrated into same framework with Classification and Identification, keep classifying quality more accurate.
In a specific implementation, the convolutional neural network comprises a feature extraction layer and a feature mapping and classification layer, and step S3 comprises:
S301, inputting the time-frequency feature map into the feature extraction layer to obtain the time-frequency map feature values;
S302, inputting the time-frequency map feature values into the feature mapping and classification layer to obtain the tea classification result.
A convolutional neural network is a deep feed-forward neural network for analyzing visual images, and it can extract features automatically. Compared with a traditional artificial neural network, the neurons of a convolutional neural network are not fully connected to the neurons of the next layer but only locally connected. Meanwhile, the parameters of a convolution kernel are shared as weights. Benefiting from local connectivity and weight sharing, convolutional neural networks are highly scalable and powerful. In general, CNNs have the following advantages in image processing: 1) shared convolution kernels, so high-dimensional data are processed quickly; 2) no manual feature selection during training, since the trained weights extract the features; 3) good classification performance.
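As a quick illustration of the local connectivity and weight sharing described above (an explanatory sketch, not part of the original disclosure; the layer sizes are chosen arbitrarily), a 3×3 convolution over a 64×64 input needs far fewer parameters than a fully connected layer with the same input and output sizes:

```python
import torch.nn as nn

conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
dense = nn.Linear(64 * 64, 8 * 64 * 64)   # same input/output sizes, fully connected

n_conv = sum(p.numel() for p in conv.parameters())    # 8*1*3*3 + 8 = 80
n_dense = sum(p.numel() for p in dense.parameters())  # about 134 million
print(n_conv, n_dense)
```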
Depending on the application, the structure of the convolutional neural network varies slightly, as in AlexNet, VGG, GoogLeNet and ResNet. In general, a convolutional neural network comprises a feature extraction layer and a feature mapping and classification layer. The feature extraction layer includes several convolutional layers, activation layers and fully connected layers. The purpose of convolution is to extract different features of the input; the activation layer enhances the nonlinearity of the decision function and of the entire neural network; and the pooling layer keeps reducing the size of the data space so as to control overfitting. At the end of the network, in the feature mapping and classification layer, a loss function is used to compare the difference between the predicted and actual results. In the present invention, in order to improve the generalization ability of the network and prevent overfitting, a dropout layer is used and a regularization constraint is added.
In a specific implementation, the tea data to be classified are converted into the time-frequency feature map using a short-time Fourier transform.
In the present invention, the tea data to be classified need to be replaced with a time-frequency feature map in a suitable way. Common time-frequency representations of non-stationary signals are the Wigner-Ville distribution (WVD), the short-time Fourier transform (STFT) and the Gabor transform. Given the transient, abruptly changing response signals of an electronic tongue system, the invention uses the STFT to extract the time-frequency characteristics.
In a specific implementation, the conversion is based on the formulas:
X(m, n) = Σ_k s(k) w(k − m) e^(−j2πnk/N)
Spectrogram{X(m, n)} = |X(m, n)|²
by which the tea data to be classified are converted into the time-frequency feature map.
In the above formulas, X(m, n) denotes the short-time Fourier transform of the time series computed with a window function of size m, s(k) denotes the electrode sampling signal acquired by the electronic tongue sensor, w(k − m) denotes the window function translated by m, N denotes the length of the time series, n denotes the frequency index, and Spectrogram{X(m, n)} denotes the squared magnitude of the spectrum.
In this way, the one-dimensional electrode sampling signal (the tea data to be classified) is transformed into a two-dimensional signal containing both time and frequency characteristics, in particular the instantaneous frequency information of the sequence. Since the result is a two-dimensional complex quantity, the logarithm of its squared magnitude is used to convert it into a two-dimensional feature image (the time-frequency feature map).
In a specific implementation, in the process of converting the tea data to be classified into the time-frequency feature map with the short-time Fourier transform, a sliding window of length Ws is used, and consecutive windows partly overlap.
In the present invention, a sliding window of length Ws is used during the STFT. To balance computational complexity against consistency between the signals of adjacent windows, consecutive windows are made to overlap. As shown in Fig. 3, each window moves by a distance L, where L is usually equal to Ws/2.
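For illustration only (not part of the original disclosure), this STFT conversion could be sketched in Python with SciPy as follows; the sampling rate and the default window length Ws = 256 are assumed values:

```python
import numpy as np
from scipy.signal import stft

def to_time_frequency_map(s: np.ndarray, fs: float = 1000.0, ws: int = 256) -> np.ndarray:
    """Convert a 1-D electrode signal s(n) into the 2-D map log|X(m, n)|^2.

    A Hanning window of length Ws slides over the signal, and consecutive
    windows overlap by L = Ws/2, as described above.
    """
    _, _, X = stft(s, fs=fs, window="hann", nperseg=ws, noverlap=ws // 2)
    return np.log(np.abs(X) ** 2 + 1e-12)  # squared magnitude on a log scale
```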
In a specific implementation, the activation function of the convolutional neural network is a rectified linear function, and the pooling layer uses a max-pooling function.
Common activation functions include the sigmoid function, the tanh function and the ReLU (rectified linear) function. The gradients of sigmoid and tanh are very flat in their saturation zones, close to 0, which easily causes vanishing gradients and slows down convergence. The gradient of ReLU, by contrast, is constant in most cases, which helps deep networks converge. Common pooling functions are the mean function and the maximization function: mean pooling retains more background information, while max pooling retains more texture information, which is why the maximization function was selected.
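The saturation behavior described above is easy to reproduce (an illustrative sketch, not part of the original disclosure):

```python
import torch

x = torch.tensor([6.0], requires_grad=True)
torch.sigmoid(x).backward()
print(x.grad)   # ~0.0025: the sigmoid gradient nearly vanishes in the saturation zone

y = torch.tensor([6.0], requires_grad=True)
torch.relu(y).backward()
print(y.grad)   # 1.0: the ReLU gradient stays constant for positive inputs
```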
The following is an example comparing the tea classification method based on automatic feature extraction with convolutional neural networks disclosed by the invention against other prior-art tea classification methods:
Five teas picked from different provinces were used: fragrant solomonseal rhizome tea (Chongqing, China), elegant-bud tea (Chongqing, China), extra-strong tea (Quanzhou, China), Biluochun tea (Suzhou, China) and West Lake Longjing (Dragon Well) tea (Hangzhou, China). For each tea, 1 g was weighed on a high-precision electronic scale and brewed with 100 ml of boiling water for 10 minutes, and the original tea liquid was obtained by filtering out the leaves with a screen.
In the experiment, the original tea liquid was mixed with water to obtain three tea concentrations: 100%, 50% and 25%. To ensure the reliability of the experiment, 340 groups of sampled data were collected for each tea-concentration combination, giving 5100 groups of sampled data per working electrode (5 teas × 3 concentrations × 340 groups per combination). As described above, three working electrodes (gold, silver, palladium) were used, so 5100 × 3 groups of sample data were obtained in total. To accelerate the convergence of the classifier, the 5100 sample data groups under the same working electrode were normalized to the range (0, 1).
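For illustration only (an assumed implementation of the normalization step, not part of the original disclosure), the min-max scaling could look like:

```python
import numpy as np

def normalize_electrode(samples: np.ndarray) -> np.ndarray:
    """Min-max scale all sample groups recorded with one working electrode
    into the (0, 1) range; `samples` has shape (n_groups, n_points),
    here (5100, n_points)."""
    lo, hi = samples.min(), samples.max()
    return (samples - lo) / (hi - lo)
```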
The electronic tongue system consists of three working electrodes, one reference electrode and one counter electrode. Before the first measurement, the electrodes were polished with cloth and polishing powder. Between any two measurements, the working electrodes were placed in distilled water, cleaned electrochemically for 1 minute and wiped dry with a cloth, while the counter electrode and the reference electrode were rinsed with distilled water and dried with filter paper.
The multi-frequency, multi-scale scanning signal was generated with multi-frequency large-amplitude pulse voltammetry (MLAPV). Specifically, three frequency signals were used, namely 1 Hz, 2 Hz and 2.5 Hz; at each frequency, 10 pulses were generated at voltages of 1.0 V, 0.8 V, 0.6 V, 0.4 V, 0.2 V, −0.2 V, −0.4 V, −0.6 V, −0.8 V and −1.0 V. To avoid interference between signals of different frequencies during the reaction, the scanning signal was paused for 5 seconds after the scan at each frequency was completed. In the sampling step, the sample rate was set to 1 kHz to capture more detail. Accordingly, 10000 points were sampled at the 1 Hz scan frequency, 5000 points at the 2 Hz scan frequency, and 4000 points at the 2.5 Hz scan frequency.
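A rough NumPy sketch of this scan signal (not part of the original disclosure; the exact pulse shape is an assumption, chosen here so that each level is held for half a period) reproduces the stated point counts of 10000, 5000 and 4000 samples:

```python
import numpy as np

FS = 1000                                                         # 1 kHz sample rate
LEVELS = [1.0, 0.8, 0.6, 0.4, 0.2, -0.2, -0.4, -0.6, -0.8, -1.0]  # volts

def mlapv_scan(freq_hz: float) -> np.ndarray:
    """10 pulses at one scan frequency: each level is held for half a
    period and returned to 0 V for the other half (assumed pulse shape)."""
    half = int(FS / (2 * freq_hz))            # samples per half period
    segments = []
    for v in LEVELS:
        segments.append(np.full(half, v))
        segments.append(np.zeros(half))
    return np.concatenate(segments)

pause = np.zeros(5 * FS)                      # 5 s pause between frequencies
scan = np.concatenate([mlapv_scan(1.0), pause, mlapv_scan(2.0), pause, mlapv_scan(2.5)])
# len(mlapv_scan(1.0)) == 10000, len(mlapv_scan(2.0)) == 5000, len(mlapv_scan(2.5)) == 4000
```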
Fig. 4 illustrates the scanning voltage and the corresponding typical response voltage: Fig. 4(a) shows the scanning voltages at different frequencies, and Fig. 4(b) shows typical response sample data. Fig. 4(a) contains three segments of waveforms, with a horizontal line between segments indicating that the scanning signal pauses for 5 seconds. From left to right, the three segments in Fig. 4(a) are the 1 Hz, 2 Hz and 2.5 Hz scanning signals, and from left to right in Fig. 4(b) are three segments of response signals corresponding to the 1 Hz, 2 Hz and 2.5 Hz scanning signals, respectively. The feature extraction and classification experiments of this comparison were completed on the same server, equipped with 32 GB of RAM, an NVIDIA GeForce GTX Titan GPU and a 64-bit Linux operating system. The model was built with the PyTorch framework, and the programming language was Python. All data graphics were drawn with MATLAB.
In general, sufficient network depth is needed to extract features automatically and comprehensively. However, the sampled data of the electronic tongue system are two-dimensional, which means the hidden time-frequency features are not as rich as in real images, so the structure of the convolutional neural network in the present invention does not need to be very complex. The experimental results confirm that a deep network structure not only fails to improve performance but also brings more time overhead. We therefore selected a network structure comprising three convolutional layers. Convolutional layer 1 uses a 3-channel input and a 32-channel output with a kernel size of 3×3; convolutional layer 2 uses a 32-channel input and a 64-channel output with a kernel size of 3×3; convolutional layer 3 uses a 64-channel input and a 64-channel output with a kernel size of 3×3. The activation function and the pooling function are the rectified linear function (ReLU) and the maximization function, respectively. Finally, a fully connected layer maps n×64 features (where n is determined by the window size) to 64 features for feature mapping.
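A PyTorch sketch of this network (not part of the original disclosure; the pooling placement, the padding and the final 5-class head are assumptions) could look like:

```python
import torch
import torch.nn as nn

class TeaCNN(nn.Module):
    """Three 3x3 convolutional layers (3->32, 32->64, 64->64 channels) with
    ReLU and max pooling, then dropout and a fully connected mapping from
    n*64 features to 64 before the class output."""
    def __init__(self, n: int, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.5),
            nn.Linear(n * 64, 64),   # n is determined by the window size
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))
```

Here the three input channels would correspond to the time-frequency maps of the three working electrodes (gold, silver, palladium).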
To find a suitable window function type and window size, the convolutional neural network parameters were set to default values (dropout = 0.5; L2 regularization; batch size = 64; softmax loss function), and window functions of different types and sizes were used in the first part of the model verification. Fig. 5, Fig. 6 and Fig. 7 show the average classification accuracy of different window functions and window sizes after 100 iterations with 5-fold cross-validation during testing. Three typical window functions were chosen, namely the Hanning, Hamming and Gaussian windows, and four sizes were used for each: 64, 128, 256 and 512. In Fig. 5, the Hanning window achieves its best average classification accuracy, close to 99.5%, at a window size of 256. In Fig. 6, the Hamming window achieves its best average classification accuracy, close to 99.8%, at a window size of 128. In Fig. 7, the Gaussian window achieves its best average classification accuracy, close to 98%, at a window size of 256.
To better verify the stability of the network, comparative experiments were carried out on the relevant network parameters. After the first-stage experimental verification, a Hanning window function of size 256 was selected to extract the time-frequency feature maps. The choices of network batch size, regularization and loss function were then tested.
As shown in Table 2, the performance of the convolutional neural network was evaluated with different training and test batch sizes, different loss functions, and with or without the L2 regularization term. The batch sizes were 16, 32 and 64, and the loss functions were the cross-entropy (softmax) loss and the multi-class hinge (SVM) loss. In the column for L2 regularization, "Yes" indicates regularization and "No" indicates no regularization. The experimental results show that, regardless of batch size and loss function, the test accuracy is above 99.5% whenever the regularization term is used. With a batch size of 32 and the softmax loss function, the highest accuracy of 99.9% is reached. Without regularization, the test accuracy is relatively low.
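For illustration only (not part of the original disclosure; the optimizer, learning rate, weight-decay value and the dummy data are assumptions), the best configuration in Table 2 could be set up as:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

xs = torch.randn(128, 3, 129, 64)                 # stand-in spectrogram tensors
ys = torch.randint(0, 5, (128,))                  # 5 tea classes
loader = DataLoader(TensorDataset(xs, ys), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 129 * 64, 5))  # stand-in model
criterion = nn.CrossEntropyLoss()                 # the softmax (cross-entropy) loss
# criterion = nn.MultiMarginLoss()                # the multi-class hinge (SVM) loss
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
# weight_decay adds the L2 regularization term to every parameter update

for x, y in loader:                               # one epoch of backpropagation
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```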
Table 2
Comparative experiments were carried out on the data set of 5100 samples of the five teas. Compared with conventional methods, the proposed model achieves the best results. Other techniques usually treat feature extraction and classification as two independent structures. For feature extraction, techniques such as original response features, peak inflection points, the discrete wavelet transform (DWT), the discrete cosine transform (DCT) and singular value decomposition (SVD) have been proposed; for pattern recognition, several classifiers such as support vector machines (SVM), k-nearest neighbors (K-NN) and random forests (RF) have been used to classify tea. Our method is an approximately end-to-end feature extraction and pattern recognition model that integrates feature extraction and classification into the same framework.
Considering that all the feature extraction methods produce high-dimensional feature data, we compressed the features with principal component analysis (PCA) in order to better understand their distribution. Red represents fragrant solomonseal rhizome tea, blue represents elegant-bud tea, cyan represents Biluochun tea, black represents West Lake Longjing (Dragon Well) tea, and green represents extra-strong tea. Fig. 8 to Fig. 13 respectively show the two-dimensional feature distributions of the original response, peak inflection points, the discrete wavelet transform, the discrete cosine transform, singular value decomposition, and the method of the invention. In the visual comparison of the features, PC1 (69.6%) and PC2 (16.34%) come from the original response; PC1 (50.56%) and PC2 (34.99%) from the peak inflection points; PC1 (71.18%) and PC2 (24.02%) from DWT; PC1 (54.2%) and PC2 (27.66%) from SVD; and PC1 (84.48%) and PC2 (7.87%) from DCT, while the PCA two-dimensional plot of the proposed method has PC1 (75.93%) and PC2 (17.96%). The proposed method achieves a significantly better feature distribution, thereby improving the classification accuracy. Meanwhile, classifiers such as SVM, KNN and RF were used for comparison. After parameter optimization, the best classification accuracies of the comparison algorithms were obtained, as shown in Table 3. Among the reference methods, the SVD feature extraction method performs best with the SVM classifier, at about 98.83%, while the original response features perform worst, at about 80%. As discussed in the section on the classification performance of the model, the average recognition rate of the CNN-AFE method reaches up to 99.9%, and when the L2 regularization term is added to the loss function, the test accuracy is above 99.5% regardless of batch size and loss function.
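For illustration only (not part of the original disclosure), the PCA compression behind Figs. 8 to 13 could be sketched with scikit-learn as:

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_2d(features: np.ndarray):
    """Compress high-dimensional features to two principal components and
    return the 2-D scores together with the explained-variance ratios
    (the PC1/PC2 percentages quoted above)."""
    pca = PCA(n_components=2)
    scores = pca.fit_transform(features)          # shape (n_samples, 2)
    return scores, pca.explained_variance_ratio_
```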
Table 3
In conclusion a kind of Classification of Tea side based on convolutional neural networks Automatic Feature Extraction proposed by the invention Method keeps classifying quality more accurate.It is analyzed according to 5100 groups of data from the sample survey, classification accuracy is up to 99.9%.With existing ginseng The technology of examining is compared, and model proposed by the present invention has several advantages that (1) large sample collection (5 5100 samples grown tea) Nicety of grading is 99.9%;(2) feature extracting method is fast and convenient, without manually selecting the approximate feature end to end of feature (3) It extracts and taxonomic structure makes system more precise and high efficiency, be conducive to as widely application in the future.
Finally, it is noted that the above embodiments are only used to illustrate the technical solution of the present invention and are not limiting. Although the invention has been described with reference to its preferred embodiments, those of ordinary skill in the art should appreciate that various changes may be made in form and detail without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (6)

1. A tea classification method based on automatic feature extraction with convolutional neural networks, characterized by comprising the following steps:
S1, obtaining the tea data to be classified acquired by the electronic tongue sensor;
S2, converting the tea data to be classified into a time-frequency feature map;
S3, inputting the time-frequency feature map into a convolutional neural network to obtain the tea classification result.
2. The tea classification method based on automatic feature extraction with convolutional neural networks according to claim 1, characterized in that the convolutional neural network comprises a feature extraction layer and a feature mapping and classification layer, and step S3 comprises:
S301, inputting the time-frequency feature map into the feature extraction layer to obtain the time-frequency map feature values;
S302, inputting the time-frequency map feature values into the feature mapping and classification layer to obtain the tea classification result.
3. The tea classification method based on automatic feature extraction with convolutional neural networks according to claim 1, characterized in that the tea data to be classified are converted into the time-frequency feature map using a short-time Fourier transform.
4. The tea classification method based on automatic feature extraction with convolutional neural networks according to claim 3, characterized in that the tea data to be classified are converted into the time-frequency feature map based on the formula:
Spectrogram{X(m, n)} = |X(m, n)|²
5. The tea classification method based on automatic feature extraction with convolutional neural networks according to claim 3, characterized in that, in the process of converting the tea data to be classified into the time-frequency feature map with the short-time Fourier transform, a sliding window of length Ws is used and consecutive windows partly overlap.
6. The tea classification method based on automatic feature extraction with convolutional neural networks according to claim 1, characterized in that the activation function of the convolutional neural network is a rectified linear function, and the pooling layer uses a max-pooling function.
CN201910362329.5A 2019-04-30 2019-04-30 A tea classification method based on automatic feature extraction with convolutional neural networks Pending CN110070069A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910362329.5A CN110070069A (en) 2019-04-30 2019-04-30 A tea classification method based on automatic feature extraction with convolutional neural networks

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910362329.5A CN110070069A (en) 2019-04-30 2019-04-30 A tea classification method based on automatic feature extraction with convolutional neural networks

Publications (1)

Publication Number Publication Date
CN110070069A true CN110070069A (en) 2019-07-30

Family

ID=67369910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910362329.5A Pending CN110070069A (en) A tea classification method based on automatic feature extraction with convolutional neural networks

Country Status (1)

Country Link
CN (1) CN110070069A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101071117A * 2007-03-23 2007-11-14 浙江大学 Electrochemical electronic tongue based on wide-band pulse voltammetry
CN101311711A (en) * 2007-05-25 2008-11-26 浙江工商大学 Intelligent chemical analysis system for liquid sample
CN101957342A (en) * 2009-07-20 2011-01-26 杭州晟迈智能科技有限公司 Volt-ampere electronic tongue
CN103412030A (en) * 2013-03-27 2013-11-27 河南工业大学 Grease detection method based on volt-ampere type electronic tongue
CN103760217A (en) * 2014-01-09 2014-04-30 杭州电子科技大学 Three-electrode-based high-precision blood alcohol concentration test circuit
CN107818366A (en) * 2017-10-25 2018-03-20 成都力创昆仑网络科技有限公司 A kind of lungs sound sorting technique, system and purposes based on convolutional neural networks
CN108564005A (en) * 2018-03-26 2018-09-21 电子科技大学 A kind of human body tumble discrimination method based on convolutional neural networks

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Mahuya Bhattacharyya Banerjee et al.: "Black tea classification employing feature fusion of E-Nose and E-Tongue responses", Journal of Food Engineering *
曾雪琼 et al.: "Research on time-frequency image recognition based on convolutional neural networks" (基于卷积神经网络的时频图像识别研究), Machinery & Electronics (《机械与电子》) *
王中明: Signals and Systems (《信号与系统》), 31 January 2019 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110555487A (en) * 2019-09-14 2019-12-10 贵州省茶叶研究所 fresh tea leaf identification and classification method and system based on convolutional neural network
CN110850020A (en) * 2019-11-11 2020-02-28 中国药科大学 Traditional Chinese medicine identification method based on artificial intelligence
CN110850020B (en) * 2019-11-11 2022-03-29 中国药科大学 Traditional Chinese medicine identification method based on artificial intelligence
CN111967507A (en) * 2020-07-31 2020-11-20 复旦大学 Discrete cosine transform and U-Net based time sequence anomaly detection method
CN114636736A (en) * 2021-11-08 2022-06-17 滁州怡然传感技术研究院有限公司 Electronic tongue white spirit detection method based on AIF-1DCNN

Similar Documents

Publication Publication Date Title
CN110070069A (en) A tea classification method based on automatic feature extraction with convolutional neural networks
Chen et al. Classification of different varieties of Oolong tea using novel artificial sensing tools and data fusion
Lu et al. Classification of Camellia (Theaceae) species using leaf architecture variations and pattern recognition techniques
CN105956624B Motor imagery EEG classification method based on sparse representation of spatio-temporal-frequency optimized features
CN109784242A EEG signal denoising method based on a one-dimensional residual convolutional neural network
CN106909784A Epileptic electroencephalogram (EEG) recognition method based on two-dimensional time-frequency images and a deep convolutional neural network
CN109934089A Automatic multistage epileptic EEG signal identification method based on a supervised gradient booster
CN110309811A A hyperspectral image classification method based on capsule networks
CN110059565A A P300 EEG signal identification method based on an improved convolutional neural network
CN114533086B Motor imagery EEG decoding method based on time-frequency transformation of spatial-domain features
Ren et al. Estimation of Congou black tea quality by an electronic tongue technology combined with multivariate analysis
CN106568907A Non-destructive freshness detection method for Chinese mitten crab based on semi-supervised discriminant projection
CN108256579A (en) A kind of multi-modal sense of national identity quantization measuring method based on priori
CN103235087A (en) Identification method of origin of oolong tea based on multi-sensor information fusion
Chen et al. Classification of vinegar with different marked ages using olfactory sensors and gustatory sensors
CN108042132A EEG feature extraction method based on DWT and EMD fused with CSP
CN112270255B (en) Electroencephalogram signal identification method and device, electronic equipment and storage medium
CN102998350B (en) Method for distinguishing edible oil from swill-cooked dirty oil by electrochemical fingerprints
CN113033066A (en) Method for establishing near infrared spectrum identification model of sargassum fusiforme production area, strain and cultivation mode and identification method
CN103488868B A method for establishing an intelligent olfactory discrimination model for honey quality differences
Liu et al. Application of stable isotopic and elemental composition combined with random forest algorithm for the botanical classification of Chinese honey
CN114578963A (en) Electroencephalogram identity recognition method based on feature visualization and multi-mode fusion
CN105760872B (en) A kind of recognition methods and system based on robust image feature extraction
Shi et al. Object-dependent sparse representation for extracellular spike detection
CN106923825A (en) Brain electricity allowance recognition methods and device based on frequency domain and phase space

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination