CN116933174A - Relay protection device defect grading method based on cyclic neural network - Google Patents
Relay protection device defect grading method based on cyclic neural network
- Publication number
- CN116933174A (application number CN202310691781.2A)
- Authority
- CN
- China
- Prior art keywords
- neural network
- defect
- relay protection
- grading
- cyclic neural
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/237—Lexical tools
- G06F40/242—Dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y04—INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
- Y04S—SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
- Y04S10/00—Systems supporting electrical power generation, transmission or distribution
- Y04S10/50—Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications
Abstract
The application relates to the technical field of relay protection device defect analysis, and provides a relay protection device defect grading method based on a cyclic neural network (recurrent neural network, RNN). The method comprises the following steps: preprocessing the relay protection operation defect records of a regional power grid to obtain a defect grading data set; capturing long-term dependencies in the sequence through cyclic neurons, using a gating mechanism to overcome the gradient vanishing and gradient explosion problems, and obtaining a prediction result for the vectorized text; based on the selected prediction parameters, inputting the test set into the target cyclic neural network model to obtain the defect grading recognition result; and finally evaluating the model with accuracy, recall and F1 score. The application improves the classification accuracy of the grading prediction results for defect record texts.
Description
Technical Field
The application relates to the technical field of relay protection device defect analysis, in particular to a relay protection device defect grading method based on a cyclic neural network.
Background
During the long-term operation of relay protection equipment, a large amount of defect text data is recorded and accumulated through inspection, testing and other means. Once stored in the system, these texts are usually treated only as archival records, so the large amount of valuable information they contain is never mined. Meanwhile, a great deal of equipment defect grading work must be completed manually, which is inefficient and labor-intensive, and certain highly ambiguous sub-health defects are difficult to judge accurately, which affects classification accuracy.
With the rapid development of Chinese text processing technology, it has become possible to automatically classify large numbers of defect texts in relay protection defect handling by using deep-learning-based Chinese text classification. Applying this technology not only takes stock of the power system's defect text information resources and lays a foundation for their further use, but also reduces or even eliminates the workload of manual classification, improving classification accuracy while ensuring that defects are processed and reported in a timely manner.
Disclosure of Invention
The application aims to provide a method that jointly considers operation and maintenance assistance for actual protection devices and the classification and grading of relay protection device defect texts. The method vectorizes the relay protection defect text, captures long-term dependencies in the sequence with cyclic neurons, and trains the neural network on data to obtain a defect classification model suited to practical application, so that the grading of protection defect data can be completed quickly by computer.
Therefore, the application provides a relay protection device defect grading method based on a cyclic neural network, which is characterized by comprising the following steps:
S1, data preprocessing: preprocessing the relay protection operation defect records of a regional power grid to obtain a defect grading data set, wherein the defect grading data set comprises a training set, a test set and a verification set;
S2, establishing a cyclic neural network model, wherein the cyclic neural network model comprises an input layer, a cyclic hidden layer and an output layer. Inputting the vectorized text data from S1 into the cyclic neural network model, and iterating the model a preset number of times until convergence to obtain the target cyclic neural network model.
S3, based on the trained model, inputting the test set from S1 into the cyclic neural network model to obtain the defect grading recognition result, and evaluating the model performance.
2. The relay protection device defect grading method according to claim 1, wherein S1 comprises:
S11, based on the relay protection operation defect records of a regional power grid, performing word segmentation on the text data with jieba, decomposing the text into word- or character-level elements, and then, combined with a relay protection defect dictionary, removing stop words and irrelevant words and establishing a vocabulary. Here jieba is a Chinese word-segmentation package for the Python programming language.
S12, vectorizing the text data from the previous step, converting it into a numerical representation, and performing sequence padding so that the constructed data sets have consistent lengths.
3. The relay protection device defect grading method according to claim 1, wherein the S2 includes:
S21, constructing the input layer of the cyclic neural network, which is responsible for passing the preprocessed vectorized text data to the hidden layer.
S22, constructing the cyclic hidden layer of the cyclic neural network, which comprises cyclic neurons capable of capturing long-term dependencies in the sequence, and adopts an LSTM structure whose gating mechanism overcomes the gradient vanishing and gradient explosion problems.
S23, constructing the output layer of the cyclic neural network, wherein the output layer uses neurons with a Softmax activation function to output the defect grading classification:

$$p_i = \frac{e^{z_i}}{\sum_{c=1}^{C} e^{z_c}}$$

where $z_i$ is the output value of the i-th node and C is the number of output nodes, i.e., the number of classification categories.
4. The method for grading defects of a relay protection device according to claim 3, wherein each element of the probability distribution over the classes to which the vectorized text may belong lies in [0,1], and the elements sum to 1.
5. The method for grading defects of a relay protection device according to claim 3, wherein the cross entropy loss function is calculated as:

$$L = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{M} y_{ic}\,\log(p_{ic})$$

where M represents the number of categories, $y_{ic}$ is an indicator that takes 1 if the true class of sample i equals c and 0 otherwise, and $p_{ic}$ is the predicted probability that sample i belongs to category c.
6. The relay protection device defect grading method according to claim 1, wherein the S3 includes:
S31, adopting accuracy, recall and F1 score to characterize the prediction precision of the target cyclic neural network model;
S32, inputting the test set into the target cyclic neural network model to obtain the defect grading recognition result.
7. The method for grading defects of a relay protection device according to claim 6, wherein the model evaluation indices are obtained with the following calculation formulas:

$$P = \frac{TP}{TP+FP}, \qquad R = \frac{TP}{TP+FN}, \qquad F1 = \frac{2PR}{P+R}$$

where TP represents the number of defect texts graded correctly, FP represents the number of defect texts graded incorrectly, and FN represents the number of texts not detected.
Compared with the prior art, the application has the following beneficial effects:
1. The method uses the softmax function to output the probability distribution over the classes to which the vectorized text belongs, improving the classification accuracy of the grading prediction results for defect record texts.
2. The application captures long-term dependencies in the text sequence through cyclic neurons, while the gating mechanism effectively overcomes the gradient vanishing and gradient explosion problems.
3. By selecting the model evaluation indices accuracy, recall and F1 score, the application ensures the prediction precision of the target cyclic neural network model and ensures that defects are processed and reported in a timely manner.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of a relay protection device defect grading method based on a cyclic neural network.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The defect grading method of the relay protection device based on the cyclic neural network provided by the application is described in detail below with reference to the accompanying drawings.
FIG. 1 is a flow chart of a relay protection device defect grading method based on a cyclic neural network.
As shown in fig. 1, the relay protection device defect grading method includes:
S1, data preprocessing: preprocessing the relay protection operation defect records of a regional power grid to obtain a defect grading data set, wherein the defect grading data set comprises a training set, a test set and a verification set;
S11, based on the relay protection operation defect records of a regional power grid, performing word segmentation on the text data with jieba, decomposing the text into word- or character-level elements, and then, combined with a relay protection defect dictionary, removing stop words and irrelevant words and establishing a vocabulary. Here jieba is a Chinese word-segmentation package for the Python programming language.
S12, vectorizing the text data from the previous step, converting it into a numerical representation, and performing sequence padding so that the constructed data sets have consistent lengths.
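The preprocessing in S11–S12 can be sketched as follows. This is a minimal Python sketch, not the patent's implementation: character-level splitting stands in for jieba segmentation, and the stop-word set is a hypothetical stand-in for the stop-word list and relay protection defect dictionary.

```python
STOP_WORDS = {"的", "了"}  # hypothetical stop words; a real list would be larger

def tokenize(text):
    # Stand-in for jieba.lcut(text): character-level elements, stop words removed.
    return [ch for ch in text if ch not in STOP_WORDS and not ch.isspace()]

def build_vocab(corpus):
    # Map each token to an integer index; index 0 is reserved for padding.
    vocab = {"<PAD>": 0}
    for text in corpus:
        for tok in tokenize(text):
            vocab.setdefault(tok, len(vocab))
    return vocab

def vectorize(corpus, vocab, max_len):
    # Convert each text to index sequences, then pad/truncate to a common length (S12).
    rows = []
    for text in corpus:
        ids = [vocab[tok] for tok in tokenize(text) if tok in vocab][:max_len]
        rows.append(ids + [0] * (max_len - len(ids)))
    return rows
```

The padded, equal-length integer sequences are what the input layer of S21 receives.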
The data set after preprocessing is shown in Table 1.
Table 1. Defect grading dataset

| Data set | Training set | Test set | Verification set | Total |
|---|---|---|---|---|
| Critical (Label 0) | 972 | 301 | 290 | 1563 |
| Serious (Label 1) | 705 | 238 | 188 | 1131 |
| General (Label 2) | 723 | 239 | 242 | 1204 |
| Total | 2400 | 778 | 720 | 3898 |
S2, establishing a cyclic neural network model, wherein the cyclic neural network model comprises an input layer, a cyclic hidden layer and an output layer. Inputting the vectorized text data from S1 into the cyclic neural network model, and iterating the model a preset number of times until convergence to obtain the target cyclic neural network model.
S21, constructing the input layer of the cyclic neural network, which is responsible for passing the preprocessed vectorized text data to the hidden layer.
S22, constructing the cyclic hidden layer of the cyclic neural network, which comprises cyclic neurons capable of capturing long-term dependencies in the sequence, and adopts an LSTM structure whose gating mechanism overcomes the gradient vanishing and gradient explosion problems.
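As a rough illustration of the gating mechanism named in S22, one LSTM step can be written in NumPy as below. This is a sketch under assumed weight shapes and gate ordering, not the patent's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4H, D) input weights, U: (4H, H) recurrent
    weights, b: (4H,) bias; assumed gate order in the stacked matrices: i, f, o, g."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:H])           # input gate: how much new information enters
    f = sigmoid(z[H:2 * H])      # forget gate: how much old cell state is kept
    o = sigmoid(z[2 * H:3 * H])  # output gate: how much of the cell is exposed
    g = np.tanh(z[3 * H:])       # candidate cell state
    c = f * c_prev + i * g       # additive cell update; this path is what keeps
                                 # gradients from vanishing or exploding
    h = o * np.tanh(c)           # hidden state passed to the next time step
    return h, c
```

The additive update of the cell state `c` is the mechanism by which the gated structure mitigates the gradient vanishing and explosion problems of a plain recurrent layer.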
S23, constructing the output layer of the cyclic neural network, wherein the output layer uses neurons with a Softmax activation function to output the defect grading classification:

$$p_i = \frac{e^{z_i}}{\sum_{c=1}^{C} e^{z_c}}$$

where $z_i$ is the output value of the i-th node and C is the number of output nodes, i.e., the number of classification categories.
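The Softmax output of S23 can be sketched in plain Python; subtracting the maximum logit is a standard numerical-stability step and is an addition here, not something the patent specifies.

```python
from math import exp

def softmax(z):
    # Convert raw node outputs z_1..z_C into a probability distribution
    # over the C defect grades: each value in [0, 1], summing to 1.
    m = max(z)                       # shift for numerical stability
    exps = [exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]
```

The largest logit always receives the largest probability, so the predicted defect grade is simply the arg-max of the output vector.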
The cross entropy loss function is calculated as:

$$L = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{M} y_{ic}\,\log(p_{ic})$$

where M represents the number of categories, $y_{ic}$ is an indicator that takes 1 if the true class of sample i equals c and 0 otherwise, and $p_{ic}$ is the predicted probability that sample i belongs to category c.
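The cross entropy loss above reduces, per sample, to the negative log-probability assigned to the true class; a minimal Python sketch (class labels assumed as integer indices):

```python
from math import log

def cross_entropy(y_true, p_pred):
    """Mean cross-entropy over N samples.
    y_true: list of true class indices; p_pred: list of probability vectors."""
    n = len(y_true)
    # The indicator y_ic zeroes out every term except the true class c,
    # so the double sum collapses to -log(p[true_class]) per sample.
    return -sum(log(p[c]) for c, p in zip(y_true, p_pred)) / n
```
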
S3, based on the trained model, inputting the test set from S1 into the cyclic neural network model to obtain the defect grading recognition result, and evaluating the model performance.
S31, adopting accuracy, recall and F1 score to characterize the prediction precision of the target cyclic neural network model;
S32, inputting the test set into the target cyclic neural network model to obtain the defect grading recognition result.
Specifically, the model evaluation indices are obtained with the following calculation formulas:

$$P = \frac{TP}{TP+FP}, \qquad R = \frac{TP}{TP+FN}, \qquad F1 = \frac{2PR}{P+R}$$

where TP represents the number of defect texts graded correctly, FP represents the number of defect texts graded incorrectly, and FN represents the number of texts not detected.
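The three evaluation indices follow directly from the TP/FP/FN counts; a small Python helper (guards against empty denominators are an addition for robustness):

```python
def precision_recall_f1(tp, fp, fn):
    # Precision: of the texts graded as a class, how many were correct.
    p = tp / (tp + fp) if tp + fp else 0.0
    # Recall: of the texts truly in a class, how many were found.
    r = tp / (tp + fn) if tp + fn else 0.0
    # F1: harmonic mean of precision and recall.
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```
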
In order to comprehensively evaluate the accuracy of the trained model, batch_size = 128 is set during training; after each round of training, the best model so far is taken as the base model for the next round. After 10,000 training iterations, the following results are obtained:
P=0.6875
R=0.2353
F=0.3380
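The best-model carry-forward scheme described above can be sketched generically as follows. The `train_round` and `evaluate` callables are hypothetical stand-ins; in the patent's setting the model would be the LSTM network and the evaluation score a validation metric such as F1.

```python
def train_with_checkpointing(model, rounds, train_round, evaluate):
    """After each round, the best-scoring model so far becomes the
    starting point (base model) for the next round of training."""
    best_model, best_score = model, evaluate(model)
    for _ in range(rounds):
        candidate = train_round(best_model)  # one training round on the base model
        score = evaluate(candidate)
        if score > best_score:               # keep only improvements
            best_model, best_score = candidate, score
    return best_model, best_score
```
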
The method uses the softmax function to output the probability distribution over the classes to which the vectorized text belongs, improving the classification accuracy of the vectorized text prediction results. The feature portions of the text containing a set amount of information entropy are extracted, so that the loss function converges to a preset value with fast calculation and without falling into a local optimum. By selecting the model evaluation parameters accuracy, recall and F1 score, the prediction precision of the target cyclic neural network model is ensured, and defects can be processed and reported in a timely manner.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not limit the implementation of the embodiments of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (7)
1. A relay protection device defect grading method based on a cyclic neural network, characterized by comprising the following steps:
S1, data preprocessing: preprocessing the relay protection operation defect records of a regional power grid to obtain a defect grading data set, wherein the defect grading data set comprises a training set, a test set and a verification set;
S2, establishing a cyclic neural network model, wherein the cyclic neural network model comprises an input layer, a cyclic hidden layer and an output layer. Inputting the vectorized text data from S1 into the cyclic neural network model, and iterating the model a preset number of times until convergence to obtain the target cyclic neural network model.
S3, based on the trained model, inputting the test set from S1 into the cyclic neural network model to obtain the defect grading recognition result, and evaluating the model performance.
2. The relay protection device defect grading method according to claim 1, wherein S1 comprises:
S11, based on the relay protection operation defect records of a regional power grid, performing word segmentation on the text data with jieba, decomposing the text into word- or character-level elements, and then, combined with a relay protection defect dictionary, removing stop words and irrelevant words and establishing a vocabulary. Here jieba is a Chinese word-segmentation package for the Python programming language.
S12, vectorizing the text data from the previous step, converting it into a numerical representation, and performing sequence padding so that the constructed data sets have consistent lengths.
3. The relay protection device defect grading method according to claim 1, wherein the S2 includes:
S21, constructing the input layer of the cyclic neural network, which is responsible for passing the preprocessed vectorized text data to the hidden layer.
S22, constructing the cyclic hidden layer of the cyclic neural network, which comprises cyclic neurons capable of capturing long-term dependencies in the sequence, and adopts an LSTM structure whose gating mechanism overcomes the gradient vanishing and gradient explosion problems.
S23, constructing the output layer of the cyclic neural network, wherein the output layer uses neurons with a Softmax activation function to output the defect grading classification:

$$p_i = \frac{e^{z_i}}{\sum_{c=1}^{C} e^{z_c}}$$

where $z_i$ is the output value of the i-th node and C is the number of output nodes, i.e., the number of classification categories.
4. The method for grading defects of a relay protection device according to claim 3, wherein each element of the probability distribution over the classes to which the vectorized text may belong lies in [0,1], and the elements sum to 1.
5. The method for grading defects of a relay protection device according to claim 3, wherein the cross entropy loss function is calculated as:

$$L = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{M} y_{ic}\,\log(p_{ic})$$

where M represents the number of categories, $y_{ic}$ is an indicator that takes 1 if the true class of sample i equals c and 0 otherwise, and $p_{ic}$ is the predicted probability that sample i belongs to category c.
6. The relay protection device defect grading method according to claim 1, wherein the S3 includes:
S31, adopting accuracy, recall and F1 score to characterize the prediction precision of the target cyclic neural network model;
S32, inputting the test set into the target cyclic neural network model to obtain the defect grading recognition result.
7. The method for grading defects of a relay protection device according to claim 6, wherein the model evaluation indices are obtained with the following calculation formulas:

$$P = \frac{TP}{TP+FP}, \qquad R = \frac{TP}{TP+FN}, \qquad F1 = \frac{2PR}{P+R}$$

where TP represents the number of defect texts graded correctly, FP represents the number of defect texts graded incorrectly, and FN represents the number of texts not detected.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310691781.2A CN116933174A (en) | 2023-06-13 | 2023-06-13 | Relay protection device defect grading method based on cyclic neural network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116933174A true CN116933174A (en) | 2023-10-24 |
Family
ID=88393273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310691781.2A Pending CN116933174A (en) | 2023-06-13 | 2023-06-13 | Relay protection device defect grading method based on cyclic neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116933174A (en) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |