CN113780304B - Substation equipment image retrieval method and system based on neural network - Google Patents


Info

Publication number
CN113780304B
Authority
CN
China
Prior art keywords
vector
feature library
distance
feature
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110909340.6A
Other languages
Chinese (zh)
Other versions
CN113780304A (en)
Inventor
许尧
夏熠
许旵鹏
彭明智
陈知丰
张冬晛
柏跃润
燕亭
王晓东
马欢
潘军
樊振东
穆云龙
刘显祖
李梦琪
陈练
冯维刚
熊少华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WUHAN ZHONGYUAN HUADIAN SOFTWARE CO Ltd
Super High Voltage Branch Of State Grid Anhui Electric Power Co ltd
Original Assignee
WUHAN ZHONGYUAN HUADIAN SOFTWARE CO Ltd
Super High Voltage Branch Of State Grid Anhui Electric Power Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WUHAN ZHONGYUAN HUADIAN SOFTWARE CO Ltd, Super High Voltage Branch Of State Grid Anhui Electric Power Co ltd filed Critical WUHAN ZHONGYUAN HUADIAN SOFTWARE CO Ltd
Priority to CN202110909340.6A priority Critical patent/CN113780304B/en
Publication of CN113780304A publication Critical patent/CN113780304A/en
Application granted granted Critical
Publication of CN113780304B publication Critical patent/CN113780304B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04SSYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a substation equipment image retrieval method and system based on a neural network, belonging to the technical field of image retrieval. The retrieval method comprises the following steps: constructing a database; constructing and training an AlexNet-FC network model to obtain a first classification accuracy; forming a first feature library from the extracted image features; extracting SIFT features to construct a visual dictionary; generating a second feature library from the SIFT features and the visual dictionary; constructing a three-layer BP neural network model and training it to obtain a second classification accuracy; extracting image features from an image to be retrieved to form a first vector; extracting SIFT features of the image to be retrieved to obtain a second vector; calculating the distance between the first vector and the first feature library; calculating the distance between the second vector and the second feature library; calculating the final distance between the image to be retrieved and each classified image according to formula (1) and formula (2); and selecting the classified images with the smallest final distances, in ascending order of distance, as the retrieval result.

Description

Substation equipment image retrieval method and system based on neural network
Technical Field
The application relates to the technical field of image retrieval, in particular to a substation equipment image retrieval method and system based on a neural network.
Background
With the development of the electric power industry, the intelligence level of substations has risen, placing higher requirements on the safety and reliability of the power system. Real-time video monitoring is widely used in the power industry and generates huge amounts of image data. Manual analysis of these data is inefficient, prone to fatigue-induced misjudgment, and unable to meet real-time requirements, so operational problems cannot be found in time. An intelligent analysis system is therefore urgently needed to automatically realize image retrieval of substation equipment.
Existing substation equipment image retrieval technology includes image retrieval methods based on convolutional neural network learning. Such a method first trains a convolutional neural network model, then uses the trained network to extract features from the input image and from the database images, calculates the distance between the input-image features and the database features, and finally screens out the matching images.
Disclosure of Invention
The embodiments of the present application aim to provide a substation equipment image retrieval method and system based on a neural network, which can improve the accuracy of substation equipment image retrieval.
In order to achieve the above object, an embodiment of the present application provides a substation equipment image retrieval method based on a neural network, including:
constructing a database of substation equipment images with category information;
constructing an AlexNet-FC network model based on AlexNet, and training the AlexNet-FC network model by adopting the database to obtain a first classification accuracy;
image features extracted from the database by adopting the first 7 layers of the AlexNet-FC network model form a first feature library;
extracting SIFT features in the database to construct a visual dictionary;
generating a second feature library according to the SIFT features and the visual dictionary;
constructing a three-layer BP neural network model, and training the three-layer BP neural network by adopting the second feature library to obtain a second classification accuracy;
image features extracted from the image to be retrieved using the first 7 layers of the AlexNet-FC network model form a first vector;
extracting SIFT features of the image to be retrieved according to the visual dictionary to obtain a second vector;
calculating a distance between the first vector and the first feature library;
calculating a distance between the second vector and the second feature library;
calculating the final distance between the image to be retrieved and the classified image according to the formula (1) and the formula (2),
wherein D is the final distance, Ac1 is the first classification accuracy, Ac2 is the second classification accuracy, D_a is the distance between the first vector and the first feature library, D_b is the distance between the second vector and the second feature library, d_i is the Euclidean distance between the first vector and the i-th feature vector in the first feature library, f_i is the Euclidean distance between the second vector and the i-th feature vector in the second feature library, and n is the number of feature vectors in the second feature library;
the classified images with the smallest final distances are selected, in ascending order of distance, as the retrieval result.
Optionally, the constructing the AlexNet-FC network model based on AlexNet includes:
the AlexNet-FC network has 8 layers; the number of nodes in layer 6 is 2048, the number of nodes in layer 8 is the number N of device categories, and the number of nodes in layer 7 is given by a ceiling expression, where ⌈·⌉ denotes rounding up.
Optionally, the three-layer BP neural network model includes an input layer, a hidden layer and an output layer, wherein the number of neurons in the input layer equals the number M of words in the visual dictionary, the number of neurons in the output layer is the number N of device categories, and the number of neurons in the hidden layer is given by a ceiling expression, where ⌈·⌉ denotes rounding up.
Optionally, the calculating the distance between the first vector and the first feature library includes:
calculating the distance between the first vector and the first feature library according to formula (3), formula (4) and formula (5),
F_a = (d_1, d_2, …, d_j, …, d_n), 1 ≤ j ≤ n (4)
wherein D_a is the distance between the first vector and the first feature library, F_a denotes the set of Euclidean distances between the first vector and all feature vectors in the first feature library, ||F_a||_2 denotes the L2 norm of the Euclidean distance set F_a, x_i is the i-th element of the first vector, y_i is the i-th element of the j-th feature vector in the first feature library, m is the length of the first vector, and d_j is the Euclidean distance between the first vector and the j-th feature vector in the first feature library.
Optionally, the calculating the distance between the second vector and the second feature library includes:
calculating the distance between the second vector and the second feature library according to formula (6), formula (7) and formula (8),
F_b = (f_1, f_2, …, f_j, …, f_n), 1 ≤ j ≤ n (7)
wherein D_b is the distance between the second vector and the second feature library, F_b denotes the set of Euclidean distances between the second vector and all feature vectors in the second feature library, ||F_b||_2 denotes the L2 norm of the Euclidean distance set F_b, p_i is the i-th element of the second vector, q_i is the i-th element of the j-th feature vector in the second feature library, l is the length of the second vector, and f_j is the Euclidean distance between the second vector and the j-th feature vector in the second feature library.
In another aspect, the present application also provides a neural network-based substation equipment image retrieval system, the retrieval system comprising a processor configured to perform a retrieval method as described in any one of the above.
Through the above technical scheme, the substation equipment image retrieval method and system based on a neural network provided by the application have two advantages. On one hand, the image features extracted by the convolutional neural network are combined with the image features extracted by the BOW construction method, which improves the accuracy of image feature expression. On the other hand, a classification accuracy is computed for each of the two feature types (the CNN-extracted features and the SIFT+BOW features), and the distance computations under the two feature types are fused when calculating the final distance, which improves the accuracy of image retrieval.
Additional features and advantages of embodiments of the application will be set forth in the detailed description which follows.
Drawings
The accompanying drawings are included to provide a further understanding of embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain, without limitation, the embodiments of the application. In the drawings:
fig. 1 is a flowchart of a neural network-based substation equipment image retrieval method according to one embodiment of the present application.
Detailed Description
The following describes the detailed implementation of the embodiments of the present application with reference to the drawings. It should be understood that the detailed description and specific examples are intended to describe and illustrate the application, not to limit it.
In the embodiments of the present application, unless otherwise indicated, terms of orientation such as "upper, lower, top, bottom" are used generally with respect to the orientation shown in the drawings or with respect to the positional relationship of the various components with respect to one another in the vertical, vertical or gravitational directions.
In addition, descriptions such as "first" and "second" in the embodiments of the present application are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. The technical solutions of the embodiments may be combined with each other, but only on the basis that the combination can be realized by those skilled in the art; when technical solutions are contradictory or cannot be realized, their combination should be considered absent and outside the scope of protection claimed in the present application.
Fig. 1 is a flowchart of a substation equipment image retrieval method based on a neural network according to an embodiment of the present application. As shown in Fig. 1, the retrieval method may include:
in step S10, a database of substation equipment images with category information is constructed;
in step S11, constructing an AlexNet-FC network model based on AlexNet, and training the AlexNet-FC network model by adopting a database to obtain a first classification accuracy;
in step S12, image features extracted from the database by the first 7 layers of the AlexNet-FC network model are adopted to form a first feature library;
in step S13, SIFT features in the database are extracted to construct a visual dictionary;
in step S14, a second feature library is generated according to SIFT features and a visual dictionary;
in step S15, a three-layer BP neural network model is built, and a second feature library is adopted to train the three-layer BP neural network so as to obtain a second classification accuracy;
in step S16, image features extracted from the images to be retrieved by adopting the first 7 layers of the AlexNet-FC network model form a first vector;
in step S17, SIFT features of the image to be retrieved are extracted according to the visual dictionary to obtain a second vector;
in step S18, a distance between the first vector and the first feature library is calculated;
in step S19, a distance between the second vector and the second feature library is calculated;
in step S20, the final distance between the image to be retrieved and the classified image is calculated according to the formula (1) and the formula (2),
wherein D is the final distance, Ac1 is the first classification accuracy, Ac2 is the second classification accuracy, D_a is the distance between the first vector and the first feature library, D_b is the distance between the second vector and the second feature library, d_i is the Euclidean distance between the first vector and the i-th feature vector in the first feature library, f_i is the Euclidean distance between the second vector and the i-th feature vector in the second feature library, and n is the number of feature vectors in the second feature library;
in step S21, the classified images with the smallest final distances are selected, in ascending order of distance, as the retrieval result.
In this method as shown in fig. 1, step S10 may be used to build a database of substation equipment images, thereby providing a training and testing dataset for subsequent neural network training. Specifically, the step S10 may be to collect a plurality of substation device images, then manually mark a category of each substation device image, and finally associate each substation device image with a corresponding category to obtain the database.
Step S11 can be used to construct the AlexNet-FC network model based on AlexNet and to train it with the preset database to obtain the first classification accuracy. The AlexNet-FC network model may have a structure known to those skilled in the art. In a preferred example of the present application, however, the AlexNet-FC network may be composed of 5 convolutional layers and 3 fully-connected layers in series, and the activation function of at least one of the 3 fully-connected layers may be a sigmoid function; the number of layer-6 nodes may be 2048, the number of layer-8 nodes may be the number N of device categories, and the number of layer-7 nodes may be given by a ceiling expression, where ⌈·⌉ denotes rounding up.
Step S12 may be used to extract a first feature library of the database. Specifically, the image features of the database are extracted using the AlexNet-FC network model described above, and the image features extracted from the first 7 layers are combined to construct the first feature library.
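In code, step S12 amounts to stacking one fixed-length feature vector per database image into a matrix. The sketch below is a minimal illustration with a hypothetical `extract_fc7` stand-in; in the real system, this extractor would be the output of the first 7 layers of the trained AlexNet-FC network.

```python
import numpy as np

def build_feature_library(images, extract_features):
    """Stack one feature vector per database image into a (num_images, dim) library."""
    return np.stack([extract_features(img) for img in images])

# Hypothetical stand-in extractor; a real system would run the trained
# AlexNet-FC network up to layer 7 here.
def extract_fc7(img):
    return np.asarray(img, dtype=float).ravel()[:4]

images = [np.arange(6).reshape(2, 3), np.ones((2, 3))]
library = build_feature_library(images, extract_fc7)
print(library.shape)  # (2, 4)
```

The library is then held in memory (or indexed) so that each query reduces to row-wise distance computations against this matrix.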
Step S13 may be used to construct a visual dictionary. Through the visual dictionary, SIFT features in the database can be considered in the subsequent classification process, so that the classification accuracy and efficiency are improved. Specifically, the step S13 may be to extract SIFT features of the database to form a feature set, and then perform K-means clustering on the feature set to obtain the visual dictionary. In extracting SIFT features, a large number of experiments prove that the final classification effect is optimal when the Hessian threshold is set to be 500.
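The dictionary construction of step S13 can be sketched as K-means over pooled local descriptors; the cluster centres become the visual "words". The sketch below hand-rolls K-means in numpy over random stand-in descriptors (real SIFT descriptors, e.g. from OpenCV's `SIFT_create`, are 128-dimensional); the deterministic initialisation is an assumption for illustration, not the patent's method.

```python
import numpy as np

def build_visual_dictionary(descriptors, n_words, n_iter=20):
    """Cluster local descriptors into a visual dictionary via K-means.

    descriptors: (N, d) array of SIFT-like descriptors pooled from the database.
    Returns an (n_words, d) array of cluster centres (the visual words).
    """
    # deterministic init: evenly spaced descriptors as starting centres
    idx = np.linspace(0, len(descriptors) - 1, n_words).astype(int)
    centres = descriptors[idx].astype(float)
    for _ in range(n_iter):
        # assign each descriptor to its nearest centre
        dists = np.linalg.norm(descriptors[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centre to the mean of its assigned descriptors
        for k in range(n_words):
            members = descriptors[labels == k]
            if len(members):
                centres[k] = members.mean(axis=0)
    return centres

# Stand-in for pooled SIFT descriptors: two well-separated clusters.
rng = np.random.default_rng(1)
descs = np.vstack([rng.normal(0, 0.1, (50, 128)), rng.normal(5, 0.1, (50, 128))])
dictionary = build_visual_dictionary(descs, n_words=2)
print(dictionary.shape)  # (2, 128)
```

In practice the dictionary size M (hundreds to thousands of words) trades retrieval precision against dictionary-building cost.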
Based on the visual dictionary constructed in step S13, step S14 may further perform statistics on SIFT features of the database, to generate a second feature library of the database. Specifically, the step S14 may be to construct a set of BOW feature vectors of the database according to SIFT features of the database in combination with the visual dictionary, and aggregate all the BOW feature vectors to obtain the second feature library.
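The BOW vector of step S14 is a histogram: each of an image's local descriptors votes for its nearest visual word. A minimal sketch (the L1 normalisation is an assumption so that images with different keypoint counts compare fairly):

```python
import numpy as np

def bow_vector(image_descriptors, dictionary):
    """Map an image's local descriptors to a bag-of-words histogram over the dictionary."""
    # nearest visual word for each descriptor
    dists = np.linalg.norm(image_descriptors[:, None, :] - dictionary[None, :, :], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(dictionary)).astype(float)
    return hist / hist.sum()

# Toy 2-word dictionary with 2-D "descriptors" for illustration
dictionary = np.array([[0.0, 0.0], [10.0, 10.0]])
descs = np.array([[0.1, 0.2], [9.8, 10.1], [0.0, 0.1], [10.2, 9.9]])
print(bow_vector(descs, dictionary))  # [0.5 0.5]
```

Aggregating one such vector per database image yields the second feature library.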
Step S15 can be used to construct a three-layer BP neural network, so that a neural network separate from the already-constructed AlexNet-FC network classifies the second feature library, avoiding the locally inaccurate classification that a single neural network classifier can produce. Specifically, the three-layer BP neural network may include an input layer, a hidden layer and an output layer. The number of neurons in the input layer may equal the number M of words in the visual dictionary, the number of neurons in the output layer may be the number N of device categories, and the number of neurons in the hidden layer may be given by a ceiling expression, where ⌈·⌉ denotes rounding up.
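A three-layer BP (backpropagation) network of this input-hidden-output shape can be sketched in a few lines of numpy. This is a generic illustration, not the patent's exact training setup: sigmoid activations, squared-error loss, and full-batch gradient descent are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ThreeLayerBP:
    """Minimal input -> hidden -> output network trained with backpropagation."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))

    def forward(self, X):
        self.h = sigmoid(X @ self.W1)
        return sigmoid(self.h @ self.W2)

    def train_step(self, X, Y, lr=0.5):
        out = self.forward(X)
        # gradient of squared error through the sigmoid output layer
        d_out = (out - Y) * out * (1 - out)
        # backpropagate through the hidden layer
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= lr * self.h.T @ d_out
        self.W1 -= lr * X.T @ d_hid
        return ((out - Y) ** 2).mean()

# Toy training run on a linearly separable task (logical OR)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [1.]])
net = ThreeLayerBP(n_in=2, n_hidden=3, n_out=1)
loss0 = net.train_step(X, Y)
for _ in range(2000):
    loss_final = net.train_step(X, Y)
print(round(float(loss_final), 4))
```

In the patent's setting, `n_in` would be M (the dictionary size), `n_out` would be N (the device-category count), and the training data would be the BOW vectors of the second feature library.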
In step S16, image features extracted from the image to be retrieved using the first 7 layers of the AlexNet-FC network model constitute a first vector. Steps S18 and S19 then calculate the distance between the first vector and the first feature library and the distance between the second vector and the second feature library, respectively. These two distances may be computed in a variety of forms known to those skilled in the art. In a preferred example provided by the present application, however, the distance between the first vector and the first feature library may be calculated according to formula (3), formula (4) and formula (5),
F_a = (d_1, d_2, …, d_j, …, d_n), 1 ≤ j ≤ n (4)
wherein D_a is the distance between the first vector and the first feature library, F_a denotes the set of Euclidean distances between the first vector and all feature vectors in the first feature library, ||F_a||_2 denotes the L2 norm of the Euclidean distance set F_a, x_i is the i-th element of the first vector, y_i is the i-th element of the j-th feature vector in the first feature library, m is the length of the first vector, and d_j is the Euclidean distance between the first vector and the j-th feature vector in the first feature library.
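The surviving formula (4) and the variable definitions pin down the per-image Euclidean distances d_j; a minimal numpy sketch follows. Formulas (3) and (5) are not reproduced in this text, so the final L2 normalisation of the distance vector is an assumption consistent with the ||F_a||_2 definition, not a confirmed reading of the patent.

```python
import numpy as np

def distances_to_library(query, library):
    """Euclidean distance from a query vector to every feature vector in a library.

    Returns F, the raw distance vector (formula (4)), and F / ||F||_2, a plausible
    normalised form suggested by the L2-norm ||F_a||_2 named in the definitions.
    """
    # d_j = sqrt(sum_i (x_i - y_i)^2) for each library row j
    F = np.linalg.norm(library - query, axis=1)
    return F, F / np.linalg.norm(F)

library = np.array([[0., 0.], [3., 4.], [6., 8.]])
query = np.array([0., 0.])
F, F_norm = distances_to_library(query, library)
print(F)       # [ 0.  5. 10.]
print(F_norm)  # ≈ [0, 0.447, 0.894]
```

The same routine applies unchanged to the second vector against the second feature library (formulas (6)-(8)).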
The distance between the second vector and the second feature library may be calculated e.g. according to equation (6), equation (7) and equation (8),
F_b = (f_1, f_2, …, f_j, …, f_n), 1 ≤ j ≤ n (7)
wherein D_b is the distance between the second vector and the second feature library, F_b denotes the set of Euclidean distances between the second vector and all feature vectors in the second feature library, ||F_b||_2 denotes the L2 norm of the Euclidean distance set F_b, p_i is the i-th element of the second vector, q_i is the i-th element of the j-th feature vector in the second feature library, l is the length of the second vector, and f_j is the Euclidean distance between the second vector and the j-th feature vector in the second feature library.
In another aspect, the present application also provides a neural network-based substation equipment image retrieval system, the retrieval system comprising a processor configured to perform a retrieval method as described in any one of the above.
Through the above technical scheme, the substation equipment image retrieval method and system based on a neural network provided by the application have two advantages. On one hand, the image features extracted by the convolutional neural network are combined with the image features extracted by the SIFT+BOW construction method, which improves the accuracy of image feature expression. On the other hand, a classification accuracy is computed for each of the two feature types (the features extracted by the convolutional neural network and the features extracted by the SIFT+BOW method), and the distance computations under the two feature types are fused when calculating the final distance, which improves the accuracy of image retrieval.
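The fusion step can be sketched end to end. Formulas (1) and (2) are not reproduced in this text, so the exact fusion rule is unknown; the sketch below assumes one plausible reading consistent with the variable definitions: each distance vector is L2-normalised and the two are combined in an accuracy-weighted sum, with the smallest fused distances ranked first.

```python
import numpy as np

def fused_ranking(d, f, ac1, ac2, top_k=3):
    """Fuse CNN-feature and BOW-feature distances into one ranking.

    d, f : per-image distance vectors from the first and second feature
    libraries. ac1, ac2 : the two classification accuracies. The
    accuracy-weighted sum of normalised distances is an assumption, not
    the patent's confirmed formula (1)/(2).
    """
    d_n = d / np.linalg.norm(d)
    f_n = f / np.linalg.norm(f)
    final = (ac1 * d_n + ac2 * f_n) / (ac1 + ac2)
    order = np.argsort(final)  # smallest final distance first
    return order[:top_k], final

# Toy distances for four classified database images
d = np.array([0.2, 1.5, 0.9, 2.0])
f = np.array([0.3, 1.2, 1.8, 0.1])
top, final = fused_ranking(d, f, ac1=0.92, ac2=0.85, top_k=2)
print(top)
```

Under this reading, a feature type whose classifier was more accurate pulls the fused ranking more strongly toward its own nearest neighbours.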
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a(n) …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (6)

1. The substation equipment image retrieval method based on the neural network is characterized by comprising the following steps of:
constructing a database of substation equipment images with category information;
constructing an AlexNet-FC network model based on AlexNet, and training the AlexNet-FC network model by adopting the database to obtain a first classification accuracy;
image features extracted from the database by adopting the first 7 layers of the AlexNet-FC network model form a first feature library;
extracting SIFT features in the database to construct a visual dictionary;
generating a second feature library according to the SIFT features and the visual dictionary;
constructing a three-layer BP neural network model, and training the three-layer BP neural network by adopting the second feature library to obtain a second classification accuracy;
image features extracted from the image to be retrieved using the first 7 layers of the AlexNet-FC network model form a first vector;
extracting SIFT features of the image to be retrieved according to the visual dictionary to obtain a second vector;
calculating a distance between the first vector and the first feature library;
calculating a distance between the second vector and the second feature library;
calculating the final distance between the image to be retrieved and the classified image according to the formula (1) and the formula (2),
wherein D is the final distance, Ac1 is the first classification accuracy, Ac2 is the second classification accuracy, D_a is the distance between the first vector and the first feature library, D_b is the distance between the second vector and the second feature library, d_i is the Euclidean distance between the first vector and the i-th feature vector in the first feature library, f_i is the Euclidean distance between the second vector and the i-th feature vector in the second feature library, and n is the number of feature vectors in the second feature library;
the classified images with the smallest final distances are selected, in ascending order of distance, as the retrieval result.
2. The retrieval method according to claim 1, wherein the constructing an AlexNet-FC network model based on AlexNet comprises:
the AlexNet-FC network has 8 layers; the number of nodes in layer 6 is 2048, the number of nodes in layer 8 is the number N of device categories, and the number of nodes in layer 7 is given by a ceiling expression, where ⌈·⌉ denotes rounding up.
3. The retrieval method of claim 1, wherein the three-layer BP neural network model comprises an input layer, a hidden layer and an output layer, wherein the number of neurons in the input layer equals the number M of words in the visual dictionary, the number of neurons in the output layer is the number N of device categories, and the number of neurons in the hidden layer is given by a ceiling expression, where ⌈·⌉ denotes rounding up.
4. The retrieval method of claim 1, wherein the calculating the distance between the first vector and the first feature library comprises:
calculating the distance between the first vector and the first feature library according to formula (3), formula (4) and formula (5),
F_a = (d_1, d_2, …, d_j, …, d_n), 1 ≤ j ≤ n (4)
wherein D_a is the distance between the first vector and the first feature library, F_a denotes the set of Euclidean distances between the first vector and all feature vectors in the first feature library, ||F_a||_2 denotes the L2 norm of the Euclidean distance set F_a, x_i is the i-th element of the first vector, y_i is the i-th element of the j-th feature vector in the first feature library, m is the length of the first vector, and d_j is the Euclidean distance between the first vector and the j-th feature vector in the first feature library.
5. The retrieval method of claim 1, wherein the calculating the distance between the second vector and the second feature library comprises:
calculating the distance between the second vector and the second feature library according to formula (6), formula (7) and formula (8),
F_b = (f_1, f_2, …, f_j, …, f_n), 1 ≤ j ≤ n (7)
wherein D_b is the distance between the second vector and the second feature library, F_b denotes the set of Euclidean distances between the second vector and all feature vectors in the second feature library, ||F_b||_2 denotes the L2 norm of the Euclidean distance set F_b, p_i is the i-th element of the second vector, q_i is the i-th element of the j-th feature vector in the second feature library, l is the length of the second vector, and f_j is the Euclidean distance between the second vector and the j-th feature vector in the second feature library.
6. A neural network based substation equipment image retrieval system, characterized in that the retrieval system comprises a processor configured to perform the retrieval method of any of claims 1 to 5.
CN202110909340.6A 2021-08-09 2021-08-09 Substation equipment image retrieval method and system based on neural network Active CN113780304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110909340.6A CN113780304B (en) 2021-08-09 2021-08-09 Substation equipment image retrieval method and system based on neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110909340.6A CN113780304B (en) 2021-08-09 2021-08-09 Substation equipment image retrieval method and system based on neural network

Publications (2)

Publication Number Publication Date
CN113780304A CN113780304A (en) 2021-12-10
CN113780304B true CN113780304B (en) 2023-12-05

Family

ID=78837089

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110909340.6A Active CN113780304B (en) 2021-08-09 2021-08-09 Substation equipment image retrieval method and system based on neural network

Country Status (1)

Country Link
CN (1) CN113780304B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256246A (en) * 2017-06-06 2017-10-17 西安工程大学 PRINTED FABRIC image search method based on convolutional neural networks
CN109299306A (en) * 2018-12-14 2019-02-01 央视国际网络无锡有限公司 Image search method and device
CN110347851A (en) * 2019-05-30 2019-10-18 中国地质大学(武汉) Image search method and system based on convolutional neural networks
CN112163114A (en) * 2020-09-10 2021-01-01 华中科技大学 Image retrieval method based on feature fusion
CN112579816A (en) * 2020-12-29 2021-03-30 二十一世纪空间技术应用股份有限公司 Remote sensing image retrieval method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111831844A (en) * 2019-04-17 2020-10-27 京东方科技集团股份有限公司 Image retrieval method, image retrieval device, image retrieval apparatus, and medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a crop disease identification database system based on content-based image retrieval; Pu Yongxian; Computer and Modernization (No. 04); 54-60 *

Also Published As

Publication number Publication date
CN113780304A (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN105022835B (en) A kind of intelligent perception big data public safety recognition methods and system
CN111026842B (en) Natural language processing method, natural language processing device and intelligent question-answering system
CN114743020B (en) Food identification method combining label semantic embedding and attention fusion
CN110309331A (en) A kind of cross-module state depth Hash search method based on self-supervisory
CN108536754A (en) Electronic health record entity relation extraction method based on BLSTM and attention mechanism
CN111127385A (en) Medical information cross-modal Hash coding learning method based on generative countermeasure network
CN110555084A (en) remote supervision relation classification method based on PCNN and multi-layer attention
CN112417132B (en) New meaning identification method for screening negative samples by using guest information
CN109344227A (en) Worksheet method, system and electronic equipment
CN116842194A (en) Electric power semantic knowledge graph system and method
CN117151222B (en) Domain knowledge guided emergency case entity attribute and relation extraction method thereof, electronic equipment and storage medium
CN115269899A (en) Remote sensing image overall planning system based on remote sensing knowledge map
CN116129286A (en) Method for classifying graphic neural network remote sensing images based on knowledge graph
CN117390497A (en) Category prediction method, device and equipment based on large language model
CN115392254A (en) Interpretable cognitive prediction and discrimination method and system based on target task
CN116958512A (en) Target detection method, target detection device, computer readable medium and electronic equipment
CN115619117A (en) Power grid intelligent scheduling method based on duty system
CN113722494A (en) Equipment fault positioning method based on natural language understanding
CN118035440A (en) Enterprise associated archive management target knowledge feature recommendation method
CN113780304B (en) Substation equipment image retrieval method and system based on neural network
CN116050419B (en) Unsupervised identification method and system oriented to scientific literature knowledge entity
CN115797795B (en) Remote sensing image question-answer type retrieval system and method based on reinforcement learning
CN109697257A (en) It is a kind of based on the network information retrieval method presorted with feature learning anti-noise
CN114898776A (en) Voice emotion recognition method of multi-scale feature combined multi-task CNN decision tree
CN114610882A (en) Abnormal equipment code detection method and system based on electric power short text classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 397, Tongcheng South Road, Baohe District, Hefei City, Anhui Province 230061

Applicant after: Super high voltage branch of State Grid Anhui Electric Power Co.,Ltd.

Applicant after: WUHAN ZHONGYUAN HUADIAN SOFTWARE Co.,Ltd.

Address before: No.8, jincui Road, Shuangfeng Industrial Park, Fuyang North Road, Changfeng County, Hefei City, Anhui Province

Applicant before: STATE GRID ANHUI POWER SUPPLY COMPANY OVERHAUL BRANCH

Applicant before: WUHAN ZHONGYUAN HUADIAN SOFTWARE Co.,Ltd.
GR01 Patent grant