CN110929723A - Identification method of transformer substation pointer instrument based on convolutional neural network - Google Patents
- Publication number
- CN110929723A (application CN201911143610.6A)
- Authority
- CN
- China
- Prior art keywords
- neural network
- convolutional neural
- instrument
- pointer
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/02—Recognising information on displays, dials, clocks
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y04—INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
- Y04S—SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
- Y04S10/00—Systems supporting electrical power generation, transmission or distribution
- Y04S10/50—Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Biomedical Technology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a method for identifying substation pointer instruments based on convolutional neural networks, comprising the following steps: acquire images of substation pointer instruments and establish an instrument image library; train a convolutional neural network for locating the pointer-instrument region on the image library and determine its learning parameters; train a convolutional neural network for instrument-type identification on the image library and determine its learning parameters; locate the instrument region in each image of the library with the trained localization network; recognize the cropped instrument-region image with the trained type-identification network and output the instrument detection result; finally, read each type of meter with the method appropriate to the detection result.
Description
Technical Field
The invention relates to the field of image processing and pattern recognition, in particular to a recognition method of a transformer substation pointer instrument based on a convolutional neural network.
Background
Most power plants in the south lie in subtropical monsoon maritime climate zones, where the atmosphere is rich in strongly corrosive species such as nitrogen oxides, so the plant environment is harsh. The devices in a plant are interdependent: an operating fault in any one device can shut down the whole system and thus compromise the safe, efficient operation of the entire plant. Pointer instruments are therefore used to reflect the working state of plant equipment, which in turn requires manual inspection and recording of the various meters. Manual inspection has two main disadvantages: 1. the plant environment is complex and the meters are numerous and varied, so manual inspection is time-consuming, labor-intensive, prone to misjudgment, and carries safety hazards; 2. traditional pointer-instrument recognition methods are severely limited: they work only in stable environments and have low accuracy.
Disclosure of Invention
The invention provides a substation pointer-instrument identification method based on convolutional neural networks, which helps improve the inspection efficiency for power-plant pointer instruments.
In order to solve the technical problems, the technical scheme of the invention is as follows:
a transformer substation pointer instrument identification method based on a convolutional neural network comprises the following steps:
s1: acquiring an image comprising a transformer substation pointer instrument, and establishing an instrument image library;
s2: training a first convolutional neural network and a second convolutional neural network by using the instrument image library in the step S1, wherein the first convolutional neural network is used for positioning the pointer instrument area, and the second convolutional neural network is used for identifying the type of the pointer instrument;
s3: positioning an instrument area where an instrument is located in an instrument image by using the trained first convolution neural network;
s4: identifying the type of the instrument by using the trained second convolutional neural network, and outputting an instrument identification result;
s5: according to the identification result of S4, different types of meters are read by corresponding methods.
Preferably, after step S3 and before step S4, the instrument area obtained in step S3 is also cropped.
Preferably, the images including the substation pointer instrument acquired in step S1 include a pointer-type circular ammeter with a range of 0-70 A, a pointer-type circular ammeter with a range of 0-1 A, and a pointer-type ammeter with a range of 0-20 A.
Preferably, the training of the first convolutional neural network in step S2 includes parameter learning of the first convolutional neural network, the parameters being learned by repeatedly decreasing the value of the loss function, where the loss function L({p_i}, {t_i}) of the first convolutional neural network is:

L({p_i}, {t_i}) = (1/N_cls) Σ_i L_cls(p_i, p_i*) + λ (1/N_reg) Σ_i p_i* L_reg(t_i, t_i*)

where i denotes the i-th anchor in the first convolutional neural network; p_i is the predicted probability that the i-th anchor is an instrument region; the ground-truth label p_i* is 1 if the anchor is positive and 0 if it is negative, so the regression loss L_reg is activated only when p_i* = 1; t_i is a vector containing the coordinate parameters of the four vertices of the instrument-region locating frame, and t_i* is the ground-truth value of the locating frame; L_cls is the log loss over the two classes, target and non-target; L_reg is the regression loss of the locating frame, taken as L_reg(t_i, t_i*) = R(t_i − t_i*), where R is the 1-norm loss function; N_cls = 256 is the normalization of the cls term; N_reg = 2400 (40 × 60) is the number of locating-frame positions; and λ = 10 is the balance factor.
Preferably, the training of the second convolutional neural network in step S2 includes parameter learning of the second convolutional neural network, the parameters being learned by repeatedly decreasing the value of the loss function, where the loss function l of the second convolutional neural network is:

l = −(1/(W·H)) Σ_{i=1..W} Σ_{j=1..H} Σ_{n=1..N} g_n(i, j) log ĝ_n(i, j)

where g and ĝ denote the true and predicted probabilities for the instruments, both distributed in [0, 1]; the network output for each image has size N × W × H, where N denotes the number of output categories, W and H denote the width and height of the image, and g_n(i, j) denotes the true label of pixel (i, j) for category n.
Preferably, the instrument area obtained in S3 is cropped, specifically: according to the coordinates of the four vertices of the locating frame containing the instrument area obtained in step S3, the image inside the frame is cut along the four outer edges formed by the four vertices to obtain the instrument-region image.
Preferably, in step S4, the cropped instrument-region image is fed into the trained convolutional neural network for recognition, and the instrument recognition result is output.
Preferably, in step S5, when the instrument identification result is a pointer-type circular ammeter, the final reading is obtained from:

Value = (Angle / θ) × A

where Angle is the deflection angle from the zero scale mark to the pointer, θ is the angle from the zero scale mark to the full-scale mark, Value is the final pointer reading, and A is the maximum range of the meter.
Preferably, in step S5, when the instrument identification result is a pointer-type square ammeter, the reading is obtained by neural-network regression, the network being trained against the real current value and outputting a predicted current value. Specifically, the parameters of the established third convolutional neural network are learned by stochastic gradient descent, repeatedly decreasing the value of the cost function, which is expressed as:

C(ω) = (1/n) Σ_{i=1..n} L(f(x_i; ω), y_i)

where ω is the weight parameter of the third convolutional neural network, n is the number of training samples of the third convolutional neural network, x_i is the feature vector of the i-th training sample, and y_i is the label of the i-th training sample; f(·) is the excitation function of the third convolutional neural network, and L(·) is its loss function.
Preferably, the third convolutional neural network uses only a portion of the training samples (x_i, y_i) in each iteration to learn and update the weight parameters, and the weights at each generation can be expressed as:

ω_{t+1} = ω_t − α ∂C(ω)/∂ω

where t denotes the number of iterations, with value range [3000, +∞); α denotes the learning rate, with value range [0.0003, 0.01]; and ∂C(ω)/∂ω denotes the partial derivative of the cost function.
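A minimal sketch of this mini-batch update rule in plain Python (real training would use a deep-learning framework; names are illustrative):

```python
def sgd_step(weights, grads, lr):
    """One iteration of w <- w - alpha * dC/dw over all weights."""
    return [w - lr * g for w, g in zip(weights, grads)]

w = [0.5, -0.2]
# Learning rate chosen inside the stated range [0.0003, 0.01].
w = sgd_step(w, grads=[1.0, -2.0], lr=0.01)  # -> [0.49, -0.18]
```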
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
the method comprises the steps of training a convolutional neural network for positioning a pointer instrument area and a convolutional neural network for instrument type identification, and positioning an instrument image area in an instrument image library by using the trained convolutional neural network for positioning the instrument area; recognizing the cut instrument area image by using the trained convolutional neural network for instrument type recognition, and outputting an instrument detection result; then, according to the detection result, different types of meters are read by adopting corresponding methods. The invention does not need manpower to patrol, saves time and cost, reduces potential safety hazards, and greatly improves the patrol efficiency and accuracy of the pointer instrument of the power plant.
Drawings
Fig. 1 is a flowchart of a method for identifying a pointer instrument of a transformer substation based on a convolutional neural network.
Fig. 2 is a schematic overall structure diagram of the first convolutional neural network.
Fig. 3 is a schematic diagram of the overall structure of the second convolutional neural network.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
Example 1
The embodiment provides a method for identifying a pointer instrument of a transformer substation based on a convolutional neural network, as shown in fig. 1, the method comprises the following steps:
S1: acquiring images including the transformer substation pointer instrument and establishing an instrument image library, wherein the images include a pointer-type circular ammeter with a measuring range of 0-70 A, a pointer-type circular ammeter with a measuring range of 0-1 A, and a pointer-type ammeter with a measuring range of 0-20 A.
S2: training a first convolutional neural network and a second convolutional neural network with the instrument image library from step S1, wherein the first convolutional neural network locates the pointer-instrument region and the second identifies the pointer-instrument type. Training the two networks comprises establishing their structures and learning their parameters. The first network locates the instrument region in the image; the image inside the locating frame is then cropped to give the instrument-region image, which is input to the second network to output the instrument detection result. Establishing the structure of each network includes determining the number of convolutional layers, the number of feature maps per convolutional layer, the number of fully connected layers and the number of feature maps per fully connected layer, the number of pooling layers, the sizes of the convolution kernels and of the pooling (sampling) kernels, and the training step size.
The structure of the first convolutional neural network used in this embodiment is shown in Fig. 2. The object-detection method used is Faster R-CNN, whose main component here is an instrument-region proposal network for locating the instrument region: an instrument image from the image library is input to the network, which outputs the image with an instrument-region locating frame. Fig. 3 shows the flow of identifying the instrument type with the second convolutional neural network, which uses the GoogLeNet Inception_V3 structure: the instrument-region image cropped from the locating frame is input to the network, which outputs the detected instrument type.
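The locate-then-classify-then-read flow just described can be sketched as a small dispatch pipeline. The two trained networks are shown as placeholder callables, and all names below are hypothetical, not from the patent:

```python
def read_meter(image, locate_net, classify_net, readers):
    """S3-S5 of the pipeline: locate the meter region, classify its
    type, then dispatch to a type-specific reading method."""
    x1, y1, x2, y2 = locate_net(image)             # S3: locating frame
    region = [row[x1:x2] for row in image[y1:y2]]  # crop inside the frame
    meter_type = classify_net(region)              # S4: instrument type
    return readers[meter_type](region)             # S5: type-specific read

# Toy usage with stubs standing in for the trained CNNs.
image = [[0] * 8 for _ in range(8)]
value = read_meter(
    image,
    locate_net=lambda img: (2, 2, 6, 6),
    classify_net=lambda reg: "circular_ammeter",
    readers={"circular_ammeter": lambda reg: 35.0},
)
```

In a real implementation the two stubs would be the trained Faster R-CNN locator and Inception_V3 classifier served through a deep-learning framework.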
The training of the first convolutional neural network comprises parameter learning of the first convolutional neural network, the parameters being learned by repeatedly decreasing the value of the loss function, where the loss function L({p_i}, {t_i}) of the first convolutional neural network is:

L({p_i}, {t_i}) = (1/N_cls) Σ_i L_cls(p_i, p_i*) + λ (1/N_reg) Σ_i p_i* L_reg(t_i, t_i*)

where i denotes the i-th anchor in the first convolutional neural network; p_i is the predicted probability that the i-th anchor is an instrument region; the ground-truth label p_i* is 1 if the anchor is positive and 0 if it is negative, so the regression loss L_reg is activated only when p_i* = 1; t_i is a vector containing the coordinate parameters of the four vertices of the instrument-region locating frame, and t_i* is the ground-truth value of the locating frame; L_cls is the log loss over the two classes, target and non-target; L_reg is the regression loss of the locating frame, taken as L_reg(t_i, t_i*) = R(t_i − t_i*), where R is the 1-norm loss function; N_cls = 256 is the normalization of the cls term; N_reg = 2400 (40 × 60) is the number of locating-frame positions; and λ = 10 is the balance factor. The learning rate of the first convolutional neural network is set to 0.001 and the number of iterations to 30000.
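A plain-Python sketch of this loss for a handful of anchors (the text's 1-norm R is used directly; Faster R-CNN itself uses a smooth-L1 variant, and the function name is illustrative):

```python
import math

def first_net_loss(p, p_star, t, t_star, n_cls=256, n_reg=2400, lam=10.0):
    """L = (1/N_cls) sum_i L_cls(p_i, p_i*)
         + lam * (1/N_reg) sum_i p_i* * R(t_i - t_i*)."""
    # Binary log loss over anchor classifications (target / non-target).
    l_cls = sum(-(ps * math.log(pi) + (1 - ps) * math.log(1 - pi))
                for pi, ps in zip(p, p_star))
    # 1-norm box regression, active only for positive anchors (p_i* = 1).
    l_reg = sum(ps * sum(abs(a - b) for a, b in zip(ti, tsi))
                for ps, ti, tsi in zip(p_star, t, t_star))
    return l_cls / n_cls + lam * l_reg / n_reg

# Two anchors: one positive (p* = 1), one negative (p* = 0).
loss = first_net_loss(p=[0.9, 0.2], p_star=[1, 0],
                      t=[[1.0, 2.0], [0.0, 0.0]],
                      t_star=[[1.5, 2.5], [9.0, 9.0]])
```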
Training the second convolutional neural network comprises parameter learning of the second convolutional neural network, the parameters being learned by repeatedly decreasing the value of the loss function, where the loss function l of the second convolutional neural network is:

l = −(1/(W·H)) Σ_{i=1..W} Σ_{j=1..H} Σ_{n=1..N} g_n(i, j) log ĝ_n(i, j)

where g and ĝ denote the true and predicted probabilities for the instruments, both distributed in [0, 1]; the network output for each image has size N × W × H, where N denotes the number of output categories, W and H denote the width and height of the image, and g_n(i, j) denotes the true label of pixel (i, j) for category n. The learning rate of the second convolutional neural network is set to 0.001 and the number of iterations to 10000.
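The classification loss described above can be sketched as a pixel-wise cross-entropy under a one-hot-label assumption (the exact functional form is not spelled out in the source; this is one standard reading of the term definitions):

```python
import math

def pixel_cross_entropy(g, g_hat):
    """l = -(1/(W*H)) * sum over pixels (i, j) and categories n of
    g_n(i, j) * log(g_hat_n(i, j)); g holds one-hot true labels and
    g_hat predicted probabilities, both shaped [W][H][N]."""
    w, h = len(g), len(g[0])
    total = sum(gn * math.log(pn)
                for row_g, row_p in zip(g, g_hat)
                for pix_g, pix_p in zip(row_g, row_p)
                for gn, pn in zip(pix_g, pix_p))
    return -total / (w * h)

# One 1x2 "image" with N = 2 categories:
g = [[[1, 0], [0, 1]]]               # one-hot true labels per pixel
g_hat = [[[0.8, 0.2], [0.4, 0.6]]]   # predicted probabilities per pixel
loss = pixel_cross_entropy(g, g_hat)
```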
S3: positioning the instrument area where the instrument is located in the instrument image by using the trained first convolutional neural network, then cropping the located area: according to the coordinates of the four vertices of the locating frame containing the instrument area, the image inside the frame is cut along the four outer edges formed by the four vertices to obtain the instrument-region image;
S4: identifying the type of the instrument by using the trained second convolutional neural network and outputting an instrument identification result: the cropped instrument-region image is fed into the trained second convolutional neural network for recognition, and the instrument identification result is output. Tests on the instrument database gave an accuracy of 0.9762 and a recall (also called sensitivity) of 1, where accuracy describes the overall correctness of the method and recall describes how completely the method detects the instruments.
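For reference, the two reported figures follow the standard precision/recall definitions; the counts below are illustrative only (e.g. 41 of 42 detections correct gives approximately 0.9762), as the patent reports ratios, not counts:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP) (the text's 'accuracy');
    recall = TP/(TP+FN) (sensitivity)."""
    return tp / (tp + fp), tp / (tp + fn)

# Illustrative counts: 41 true positives, 1 false positive, no misses.
p, r = precision_recall(tp=41, fp=1, fn=0)  # p ~= 0.9762, r == 1.0
```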
S5: according to the identification result of S4, different types of meters are read by corresponding methods.
When the instrument identification result is a pointer-type circular ammeter, the final reading is obtained from:

Value = (Angle / θ) × A

where Angle is the deflection angle from the zero scale mark to the pointer, θ is the angle from the zero scale mark to the full-scale mark, Value is the final pointer reading, and A is the maximum range of the meter.
When the instrument identification result is a pointer-type square ammeter, the reading is obtained by neural-network regression, the network being trained against the real current value and outputting a predicted current value. Specifically, the parameters of the established third convolutional neural network are learned by stochastic gradient descent, repeatedly decreasing the value of the cost function, which is expressed as:

C(ω) = (1/n) Σ_{i=1..n} L(f(x_i; ω), y_i)

where ω is the weight parameter of the third convolutional neural network, n is the number of training samples of the third convolutional neural network, x_i is the feature vector of the i-th training sample, and y_i is the label of the i-th training sample; f(·) is the excitation function of the third convolutional neural network, and L(·) is its loss function.
The third convolutional neural network uses only a portion of the training samples (x_i, y_i) in each iteration to learn and update the weight parameters, and the weights at each generation can be expressed as:

ω_{t+1} = ω_t − α ∂C(ω)/∂ω

where t denotes the number of iterations, with value range [3000, +∞); α denotes the learning rate, with value range [0.0003, 0.01]; and ∂C(ω)/∂ω denotes the partial derivative of the cost function.
The same or similar reference numerals correspond to the same or similar parts;
the terms describing positional relationships in the drawings are for illustrative purposes only and are not to be construed as limiting the patent;
it should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit the embodiments of the present invention. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. And are neither required nor exhaustive of all embodiments. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the claims of the present invention.
Claims (10)
1. A transformer substation pointer instrument identification method based on a convolutional neural network is characterized by comprising the following steps:
s1: acquiring an image comprising a transformer substation pointer instrument, and establishing an instrument image library;
s2: training a first convolutional neural network and a second convolutional neural network by using the instrument image library in the step S1, wherein the first convolutional neural network is used for positioning the pointer instrument area, and the second convolutional neural network is used for identifying the type of the pointer instrument;
s3: positioning an instrument area where an instrument is located in an instrument image by using the trained first convolution neural network;
s4: identifying the type of the instrument by using the trained second convolutional neural network, and outputting an instrument identification result;
s5: according to the identification result of S4, different types of meters are read by corresponding methods.
2. The method for identifying the pointer instrument of the substation based on the convolutional neural network as claimed in claim 1, wherein after step S3 and before step S4, the instrument area obtained in step S3 is further cropped.
3. The convolutional neural network-based substation pointer instrument identification method as claimed in claim 2, wherein the images including the substation pointer instrument acquired in step S1 include a pointer-type circular ammeter with a range of 0-70 A, a pointer-type circular ammeter with a range of 0-1 A, and a pointer-type ammeter with a range of 0-20 A.
4. The method for identifying the pointer instrument of the substation based on the convolutional neural network as claimed in claim 3, wherein the training of the first convolutional neural network in step S2 includes parameter learning of the first convolutional neural network, the parameters being learned by repeatedly decreasing the value of the loss function, where the loss function L({p_i}, {t_i}) of the first convolutional neural network is:

L({p_i}, {t_i}) = (1/N_cls) Σ_i L_cls(p_i, p_i*) + λ (1/N_reg) Σ_i p_i* L_reg(t_i, t_i*)

where i denotes the i-th anchor in the first convolutional neural network; p_i is the predicted probability that the i-th anchor is an instrument region; the ground-truth label p_i* is 1 if the anchor is positive and 0 if it is negative, so the regression loss L_reg is activated only when p_i* = 1; t_i is a vector containing the coordinate parameters of the four vertices of the instrument-region locating frame, and t_i* is the ground-truth value of the locating frame; L_cls is the log loss over the two classes, target and non-target; L_reg is the regression loss of the locating frame, taken as L_reg(t_i, t_i*) = R(t_i − t_i*), where R is the 1-norm loss function; N_cls = 256 is the normalization of the cls term; N_reg = 2400 (40 × 60) is the number of locating-frame positions; and λ = 10 is the balance factor.
5. The method for identifying the pointer instrument of the substation based on the convolutional neural network as claimed in claim 3, wherein the training of the second convolutional neural network in step S2 includes parameter learning of the second convolutional neural network, the parameters being learned by repeatedly decreasing the value of the loss function, where the loss function l of the second convolutional neural network is:

l = −(1/(W·H)) Σ_{i=1..W} Σ_{j=1..H} Σ_{n=1..N} g_n(i, j) log ĝ_n(i, j)

where g and ĝ denote the true and predicted probabilities for the instruments, both distributed in [0, 1]; the network output for each image has size N × W × H, where N denotes the number of output categories, W and H denote the width and height of the image, and g_n(i, j) denotes the true label of pixel (i, j) for category n.
6. The method for identifying the pointer instrument of the transformer substation based on the convolutional neural network as claimed in claim 4 or 5, wherein the instrument area obtained in S3 is cropped, specifically: according to the coordinates of the four vertices of the locating frame containing the instrument area obtained in step S3, the image inside the frame is cut along the four outer edges formed by the four vertices to obtain the instrument-region image.
7. The method for identifying the pointer instrument in the substation based on the convolutional neural network as claimed in claim 6, wherein in step S4, the cropped instrument-region image is fed into the trained second convolutional neural network for identification, and the instrument identification result is output.
8. The method for identifying the pointer instrument in the substation based on the convolutional neural network as claimed in claim 7, wherein in step S5, when the instrument identification result is a pointer-type circular ammeter, the final reading is obtained from:

Value = (Angle / θ) × A

where Angle is the deflection angle from the zero scale mark to the pointer, θ is the angle from the zero scale mark to the full-scale mark, Value is the final pointer reading, and A is the maximum range of the meter.
9. The method for identifying pointer-type instruments in transformer substations based on the convolutional neural network as claimed in claim 7, wherein in step S5, when the instrument identification result is a pointer-type square ammeter, the reading is obtained by neural-network regression, the network being trained against the real current value and outputting a predicted current value, specifically: the parameters of the established third convolutional neural network are learned by stochastic gradient descent, repeatedly decreasing the value of the cost function, which is expressed as:

C(ω) = (1/n) Σ_{i=1..n} L(f(x_i; ω), y_i)

where ω is the weight parameter of the third convolutional neural network, n is the number of training samples of the third convolutional neural network, x_i is the feature vector of the i-th training sample, and y_i is the label of the i-th training sample; f(·) is the excitation function of the third convolutional neural network, and L(·) is its loss function.
10. The method for identifying the transformer substation pointer instrument based on the convolutional neural network according to claim 9, wherein the third convolutional neural network uses only a portion of the training samples (x_i, y_i) in each iteration to learn and update the weight parameters, the weights at each generation being expressed as:

ω_{t+1} = ω_t − α ∂C(ω)/∂ω

where t denotes the number of iterations, α denotes the learning rate, and ∂C(ω)/∂ω denotes the partial derivative of the cost function.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911143610.6A CN110929723B (en) | 2019-11-20 | 2019-11-20 | Identification method of transformer substation pointer instrument based on convolutional neural network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110929723A true CN110929723A (en) | 2020-03-27 |
CN110929723B CN110929723B (en) | 2022-12-02 |
Family
ID=69851377
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911143610.6A Active CN110929723B (en) | 2019-11-20 | 2019-11-20 | Identification method of transformer substation pointer instrument based on convolutional neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110929723B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111582071A (en) * | 2020-04-23 | 2020-08-25 | 浙江大学 | SF6 instrument image reading method based on HRNet network model |
CN112347929A (en) * | 2020-11-06 | 2021-02-09 | 电子科技大学中山学院 | Pointer instrument system and monitoring method |
CN113283419A (en) * | 2021-04-29 | 2021-08-20 | 国网浙江省电力有限公司湖州供电公司 | Convolutional neural network pointer instrument image reading identification method based on attention |
CN114202910A (en) * | 2020-08-28 | 2022-03-18 | 京东方科技集团股份有限公司 | Instrument recognition device, instrument monitoring system and monitoring method thereof |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108416772A (en) * | 2018-03-07 | 2018-08-17 | 汕头大学 | A kind of strabismus detection method based on concatenated convolutional neural network |
CN109344763A (en) * | 2018-09-26 | 2019-02-15 | 汕头大学 | A kind of strabismus detection method based on convolutional neural networks |
CN109815950A (en) * | 2018-12-28 | 2019-05-28 | 汕头大学 | A kind of reinforcing bar end face recognition methods based on depth convolutional neural networks |
CN110263790A (en) * | 2019-04-18 | 2019-09-20 | 汕头大学 | A kind of power plant's ammeter character locating and recognition methods based on convolutional neural networks |
Also Published As
Publication number | Publication date |
---|---|
CN110929723B (en) | 2022-12-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110929723B (en) | Identification method of transformer substation pointer instrument based on convolutional neural network | |
CN110533631B (en) | SAR image change detection method based on pyramid pooling twin network | |
CN109493320B (en) | Remote sensing image road extraction method and system based on deep learning, storage medium and electronic equipment | |
CN113705478B (en) | Mangrove individual-tree target detection method based on improved YOLOv5 | |
CN113592822B (en) | Insulator defect positioning method for electric power inspection images | |
CN109146889A (en) | Field boundary extraction method based on high-resolution remote sensing images | |
CN116468730B (en) | Aerial Insulator Image Defect Detection Method Based on YOLOv5 Algorithm | |
CN111598942A (en) | Method and system for automatically positioning electric power facility instrument | |
CN106056619A (en) | Unmanned aerial vehicle vision wire patrol method based on gradient constraint Radon transform | |
CN114155244B (en) | Defect detection method, device, equipment and storage medium | |
CN111814597A (en) | Urban function partitioning method coupling multi-label classification network and YOLO | |
CN115372877B (en) | Lightning arrester leakage ammeter inspection method of transformer substation based on unmanned aerial vehicle | |
CN110263790A (en) | Power plant ammeter character locating and recognition method based on convolutional neural networks | |
CN111192267A (en) | Multisource perception fusion remote sensing image segmentation method based on UNET network and application | |
CN114266881A (en) | Pointer type instrument automatic reading method based on improved semantic segmentation network | |
CN109684910A (en) | Method and system for network-based detection of surface-environment changes along power transmission lines | |
CN116503399A (en) | Insulator pollution flashover detection method based on YOLO-AFPS | |
CN113012107B (en) | Power grid defect detection method and system | |
CN111091534A (en) | PCB defect detection and positioning method based on target detection | |
CN113516177A (en) | Wheat lodging region identification method based on spectral texture features and support vector machine | |
CN116452942A (en) | Packaging quality prediction method and system based on neural network | |
Shaotong et al. | Location and identification of insulator and bushing based on YOLOv3-spp algorithm | |
CN114255458A (en) | Method and system for identifying reading of pointer instrument in inspection scene | |
CN114037993A (en) | Substation pointer instrument reading method and device, storage medium and electronic equipment | |
CN113989209A (en) | Power line foreign matter detection method based on fast R-CNN |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||