CN115219810A - Line trip prediction method based on lightning positioning system - Google Patents

Line trip prediction method based on lightning positioning system

Info

Publication number
CN115219810A
Authority
CN
China
Prior art keywords
layer
lightning
neural network
data
convolution
Prior art date
Legal status
Granted
Application number
CN202210544447.XA
Other languages
Chinese (zh)
Other versions
CN115219810B (en)
Inventor
汪颖
雷蕾
胡文曦
肖先勇
郑子萱
Current Assignee
Sichuan University
Original Assignee
Sichuan University
Priority date
Filing date
Publication date
Application filed by Sichuan University
Priority to CN202210544447.XA
Publication of CN115219810A
Application granted
Publication of CN115219810B
Legal status: Active (granted)


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01R: MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
    • G01R31/00: Arrangements for testing electric properties; Arrangements for locating electric faults; Arrangements for electrical testing characterised by what is being tested not provided for elsewhere
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04: INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04S: SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00: Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50: Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Abstract

The invention discloses a line trip prediction method based on a lightning positioning system, which comprises the following steps: performing spatial grid division on a target area, determining the time step, and acquiring training data; constructing a three-dimensional convolutional neural network; training the three-dimensional convolutional neural network with the training data to obtain a trained network; acquiring prediction basic data; and inputting the prediction basic data into the trained three-dimensional convolutional neural network to obtain a line trip prediction result. The invention constructs an input matrix that fuses the spatio-temporal characteristics of lightning activity: it fits not only the coverage and parameter values of the strike points of actual lightning activity, but also the time-series characteristics of that activity, such as the movement of thunderstorm cells and the variation over time of activity parameters such as lightning current amplitude, thereby realizing line trip prediction based on a lightning positioning system.

Description

Line trip prediction method based on lightning positioning system
Technical Field
The invention relates to the field of power grids, in particular to a line trip prediction method based on a lightning positioning system.
Background
Lightning strikes are the most prominent cause of transmission line failure. According to line-tripping statistics for 12 countries over three consecutive years published by the International Council on Large Electric Systems (CIGRE), lightning accidents account for 60% of all accidents on transmission lines at voltage levels from 275 kV to 500 kV. For trip events caused by lightning strikes, issuing early warnings for transmission lines in advance on the basis of real-time lightning activity is the key to active lightning protection. Traditional active lightning early-warning methods rely mainly on atmospheric electric field meters and weather radar; their warning scale is coarse, they can only predict the area and time of thunderstorm occurrence, and they cannot obtain the geographic distribution density and evolution of lightning activity, which limits the accuracy of transmission line trip prediction.
A lightning positioning system has high spatio-temporal resolution and can provide, in real time, parameters such as the position, time, lightning current amplitude and number of strokes for each lightning strike. With the nationwide networking of lightning location systems, a foundation has been laid for their application in lightning early warning for power systems. Most thunderstorm activity prediction methods based on lightning positioning system data use the monitoring data to compute historical lightning activity parameters that reflect the lightning trip-out rate, such as lightning strike density and the probability distribution of lightning current amplitude, then calculate the lightning trip-out probability with a physical or mathematical model of the lightning fault mechanism, such as the leader model method or the electrogeometric model method, and thus predict trips indirectly from the lightning trip-out rate. In recent years, power system operation has accumulated a large amount of lightning activity data and trip records, and data mining is well suited to extracting the relationship between lightning activity and lightning-caused trips. However, positioning analysis of a single trip event measures only the spatial dimension, while counting trip events considers only frequency characteristics from the time-series perspective; it is difficult for a prediction model to fuse the spatial information and time-series characteristics of real-time lightning activity, so a real-time transmission line trip prediction model capable of fusing the spatio-temporal information of lightning activity is urgently needed.
Disclosure of Invention
In view of the above shortcomings of the prior art, the line trip prediction method based on a lightning positioning system provided by the invention solves the problem that the spatial information and time-series characteristics of real-time lightning activity are difficult to fuse in the prior art.
In order to achieve the purpose of the invention, the invention adopts the following technical scheme:
the line trip prediction method based on the lightning location system comprises the following steps:
s1, performing spatial grid division on a target area, determining a time step, and acquiring line trip data in the time step t and lightning strike data of each spatial grid in at least n adjacent time steps before the time step t to obtain a group of training data; n is greater than or equal to 3;
s2, constructing a three-dimensional convolution neural network;
s3, training the three-dimensional convolutional neural network by adopting training data to obtain a trained three-dimensional convolutional neural network;
s4, acquiring lightning strike data of each spatial grid in at least n adjacent time steps before a target time step to obtain prediction basic data;
and S5, taking the prediction basic data as the input of the trained three-dimensional convolution neural network to obtain a line trip prediction result.
Further, the specific method of step S1 includes the following substeps:
S1-1, initially dividing the target area into square grids with a side length of 1 km (1 km × 1 km) to obtain spatial grids;
S1-2, drawing a circle centred on the centre point of each spatial grid with the monitoring range R as the radius to obtain the monitoring area of each spatial grid;
S1-3, obtaining the line trip data within time step t in each monitoring area and the lightning strike data in at least n adjacent time steps before time step t in each monitoring area, to obtain the lightning strike data of each spatial grid;
S1-4, standardizing each type of lightning strike data according to the formula:
d′_i = (d_i − μ_i) / δ_i
wherein d_i is the i-th type of lightning strike data before standardization; d′_i is the standardized i-th type of lightning strike data; μ_i is the mean of the i-th type of lightning strike data; and δ_i is the standard deviation of the i-th type of lightning strike data;
S1-5, combining the standardized lightning strike data of the different types into a matrix to obtain a training sample, and taking the line trip data as the real label to obtain the training data.
Further, the lightning strike data in the step S1 includes the number of lightning strikes, a maximum lightning current amplitude, an average lightning current amplitude, a minimum lightning strike distance, and an average lightning strike distance;
the line trip data includes trip position and number of trips.
Further, the value of the monitoring range R in step S1-2 is 3km or 5km.
Further, the three-dimensional convolutional neural network in step S2 includes:
the system comprises a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer, a first SE layer, a third convolution layer, a third pooling layer, a second SE layer, a fourth convolution layer, a fourth pooling layer, a third SE layer, a first full-connection layer, a second full-connection layer, a first classifier and a second classifier;
wherein the input of the first convolutional layer is the input of the three-dimensional convolutional neural network, and the input of the first pooling layer is the output of the first convolutional layer; the input of the first convolution layer and the output of the first pooling layer are used together as the input of the second convolution layer; the second convolution layer, the second pooling layer and the first SE layer are connected in sequence; the output of the first SE layer and the input of the second convolutional layer are used together as the input of the third convolutional layer; the third convolution layer, the third pooling layer and the second SE layer are connected in sequence; the output of the second SE layer and the input of the third convolutional layer are used together as the input of the fourth convolutional layer; the fourth convolution layer, the fourth pooling layer and the third SE layer are connected in sequence; the output of the third SE layer and the input of the fourth convolution layer are jointly used as the input of the first full-connection layer; the first full connection layer, the second full connection layer, the first classifier and the second classifier are connected in sequence; the output of the second classifier is used as the output of the three-dimensional convolution neural network; each convolutional layer is activated by the ReLU (rectified linear unit) function.
Further, the numbers of convolution kernels of the first to fourth convolution layers are 64, 128, 256 and 512, respectively; the convolution kernel dimension is 3 × 3; the padding and stride parameters are both 1 × 1; the stride parameter of the first pooling layer is 1 × 1; and the stride parameters of the second and third pooling layers are 2 × 2.
Further, the specific method of step S3 includes the following sub-steps:
S3-1, inputting the training samples in the training data into the three-dimensional convolutional neural network to obtain the output of the three-dimensional convolutional neural network;
S3-2, computing the focal loss FL_j of the output ŷ_j corresponding to the j-th input of the three-dimensional convolutional neural network according to the formula:
FL_j = −α · (1 − ŷ_j)^ω · y_j · log(ŷ_j) − (1 − α) · ŷ_j^ω · (1 − y_j) · log(1 − ŷ_j)
wherein y_j is the real label, y_j = 1 indicates that there is a trip at the current position, and y_j = 0 indicates that the current position does not trip; α is a balance factor, α ∈ (0, 1); ω is a modulation factor, ω ∈ [0, +∞);
S3-3, updating the convolution kernel parameters φ_i of the three-dimensional convolutional neural network according to the formula:
φ′_i = φ_i − γ · ∂FL_j / ∂φ_i
to obtain the updated parameters φ′_i, wherein γ is the learning rate of the training process and ∂ denotes the partial derivative;
S3-4, obtaining the fault detection rate FDR, the false alarm rate FAR and the total prediction accuracy PA of the current three-dimensional convolutional neural network according to the formulas:
FDR = FN / (FY + FN)
FAR = TN / (TY + TN)
PA = (FN + TY) / (FY + FN + TY + TN)
wherein FY and FN are the numbers of trip samples predicted to be normal and tripped, respectively; TY and TN are the numbers of normal samples predicted to be normal and tripped, respectively;
S3-5, judging whether the fault detection rate FDR, the false alarm rate FAR and the total prediction accuracy PA of the current three-dimensional convolutional neural network all reach their corresponding thresholds; if so, outputting the current three-dimensional convolutional neural network and finishing the training of the three-dimensional convolutional neural network; otherwise, returning to step S3-1.
The beneficial effects of the invention are as follows:
1. The method constructs an input matrix that fuses the spatio-temporal characteristics of lightning activity; it fits not only the coverage and parameter values of the strike points of actual lightning activity, but also the time-series characteristics of that activity, such as the movement of thunderstorm cells and the variation over time of activity parameters such as lightning current amplitude, thereby realizing line trip prediction based on a lightning positioning system.
2. Because richer lightning monitoring information is used, the invention achieves higher line trip prediction accuracy, with a spatial prediction resolution of 1 km. The influence of real-time lightning activity parameters such as lightning strike density, lightning current amplitude and thunderstorm cloud cluster trajectory on the accuracy of the prediction result is considered comprehensively, and input parameters that reflect transmission line tripping are constructed; the target area is processed with a 1 km × 1 km grid, so the spatial resolution of the prediction reaches 1 km.
3. The output of the three-dimensional convolutional neural network built by the method depends both on the high-level features of the later convolution layers and on the low-level features of the early convolution layers. A "squeeze-and-excitation" module from the self-attention family is used, which strengthens the suppression of useless information by useful information and improves the feature extraction capability of the network. In addition, the focal loss is introduced to guide neural network training and eliminate the influence of sample imbalance on the recognition result, improving the accuracy of the prediction method. The problem that a traditional three-dimensional convolutional network has too many parameters and is too inefficient for real-time prediction is solved, the model complexity and time complexity are reduced, and the method is suitable for an actual power grid.
Drawings
FIG. 1 is a schematic flow diagram of the present invention;
FIG. 2 is a schematic structural diagram of the three-dimensional convolutional neural network.
Detailed Description
The following description of the embodiments of the invention is provided to help those skilled in the art understand the invention, but it should be understood that the invention is not limited to the scope of these embodiments. To those of ordinary skill in the art, various changes are apparent within the spirit and scope of the invention as defined by the appended claims, and all inventions and creations that make use of the inventive concept are within the scope of protection.
As shown in fig. 1, the line trip prediction method based on the lightning location system comprises the following steps:
s1, performing spatial grid division on a target area, determining a time step, and acquiring line trip data in the time step t and lightning strike data of each spatial grid in at least n adjacent time steps before the time step t to obtain a group of training data; n is greater than or equal to 3;
s2, constructing a three-dimensional convolution neural network;
s3, training the three-dimensional convolutional neural network by adopting training data to obtain a trained three-dimensional convolutional neural network;
s4, acquiring lightning strike data of each spatial grid in at least n adjacent time steps before a target time step to obtain prediction basic data;
and S5, taking the prediction basic data as the input of the trained three-dimensional convolutional neural network to obtain a line trip prediction result.
The specific method of step S1 comprises the following substeps:
S1-1, initially dividing the target area into square grids with a side length of 1 km (1 km × 1 km) to obtain spatial grids;
S1-2, drawing a circle centred on the centre point of each spatial grid with the monitoring range R as the radius to obtain the monitoring area of each spatial grid;
S1-3, obtaining the line trip data within time step t in each monitoring area and the lightning strike data in at least n adjacent time steps before time step t in each monitoring area, to obtain the lightning strike data of each spatial grid;
S1-4, standardizing each type of lightning strike data according to the formula:
d′_i = (d_i − μ_i) / δ_i
wherein d_i is the i-th type of lightning strike data before standardization; d′_i is the standardized i-th type of lightning strike data; μ_i is the mean of the i-th type of lightning strike data; and δ_i is the standard deviation of the i-th type of lightning strike data;
S1-5, combining the standardized lightning strike data of the different types into a matrix to obtain a training sample, and taking the line trip data as the real label to obtain the training data.
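A minimal numerical sketch of the preprocessing in steps S1-4 and S1-5 may look as follows. It assumes the different types of lightning strike data have already been gathered per grid and per time step; the array names, shapes and the random example data are illustrative, not values prescribed by the patent.

```python
import numpy as np

def standardize(features):
    """Z-score each type of lightning strike data: d'_i = (d_i - mu_i) / delta_i.

    features: array of shape (n_types, n_steps, H, M) with the raw per-grid data.
    """
    out = np.zeros_like(features, dtype=float)
    for i, d_i in enumerate(features):
        mu_i = d_i.mean()
        delta_i = d_i.std()
        if delta_i > 0:
            out[i] = (d_i - mu_i) / delta_i
    return out

# Example: 5 data types (strike count, max/mean current amplitude,
# nearest/mean strike distance) over n = 3 time steps on a 20 x 20 grid.
raw = np.random.rand(5, 3, 20, 20)
sample = standardize(raw)   # one training sample, shape (5, 3, 20, 20)
label = 1                   # real label: a trip occurred within time step t
```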
As shown in fig. 2, the three-dimensional convolutional neural network in step S2 includes:
the network comprises a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer, a first SE (Squeeze-and-Excitation) layer, a third convolution layer, a third pooling layer, a second SE layer, a fourth convolution layer, a fourth pooling layer, a third SE layer, a first full-connection layer, a second full-connection layer, a first classifier and a second classifier;
wherein the input of the first convolution layer is the input of the three-dimensional convolution neural network, and the input of the first pooling layer is the output of the first convolution layer; the input of the first convolution layer and the output of the first pooling layer are jointly used as the input of the second convolution layer; the second convolution layer, the second pooling layer and the first SE layer are connected in sequence; the output of the first SE layer and the input of the second convolutional layer are used together as the input of the third convolutional layer; the third convolution layer, the third pooling layer and the second SE layer are connected in sequence; the output of the second SE layer and the input of the third convolutional layer are jointly used as the input of a fourth convolutional layer; the fourth convolution layer, the fourth pooling layer and the third SE layer are connected in sequence; the output of the third SE layer and the input of the fourth convolution layer are jointly used as the input of the first full-connection layer; the first full connection layer, the second full connection layer, the first classifier and the second classifier are connected in sequence; the output of the second classifier is used as the output of the three-dimensional convolution neural network; each convolutional layer is activated by the ReLU (rectified linear unit) function.
The numbers of convolution kernels of the first to fourth convolution layers are 64, 128, 256 and 512, respectively; the convolution kernel dimension is 3 × 3; the padding and stride parameters are both 1 × 1; the stride parameter of the first pooling layer is 1 × 1; and the stride parameters of the second and third pooling layers are 2 × 2.
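The connectivity described above can be sketched in PyTorch roughly as follows. The patent does not state how a layer's input and a later output are "used together"; channel concatenation, with the skip path max-pooled to the same spatial size, is assumed here. Pooling is assumed to act on the spatial dimensions only, the fourth pooling layer's stride (not given in the text) is taken as 2, and the pooling kernel sizes, SE reduction ratio, full-connection widths and classifier sizes are illustrative choices rather than values fixed by the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SELayer3D(nn.Module):
    """Squeeze-and-Excitation ("compression-excitation") block for 5-D feature maps."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        hidden = max(channels // reduction, 4)
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden), nn.ReLU(inplace=True),
            nn.Linear(hidden, channels), nn.Sigmoid())

    def forward(self, x):                      # x: (N, C, D, H, W)
        w = x.mean(dim=(2, 3, 4))              # squeeze: global average pooling
        w = self.fc(w).view(x.size(0), -1, 1, 1, 1)
        return x * w                           # excitation: channel re-weighting

def _halve(t):
    """Spatial-only max pooling (stride 2), also used to align the skip paths."""
    return F.max_pool3d(t, kernel_size=(1, 2, 2), stride=(1, 2, 2), ceil_mode=True)

class Lightning3DCNN(nn.Module):
    def __init__(self, in_channels=5, n_positions=400):
        super().__init__()
        conv = lambda cin, cout: nn.Sequential(
            nn.Conv3d(cin, cout, kernel_size=3, stride=1, padding=1),
            nn.ReLU(inplace=True))
        self.conv1 = conv(in_channels, 64)
        self.pool1 = nn.MaxPool3d(kernel_size=(1, 3, 3), stride=1, padding=(0, 1, 1))
        self.conv2 = conv(in_channels + 64, 128)
        self.conv3 = conv(in_channels + 64 + 128, 256)
        self.conv4 = conv(in_channels + 64 + 128 + 256, 512)
        self.se1, self.se2, self.se3 = SELayer3D(128), SELayer3D(256), SELayer3D(512)
        self.gap = nn.AdaptiveAvgPool3d(1)     # keeps the head independent of H, M
        fc_in = (in_channels + 64 + 128 + 256) + 512
        self.fc1, self.fc2 = nn.Linear(fc_in, 256), nn.Linear(256, 128)
        self.head_trip = nn.Linear(128, 2)           # classifier 1: trip / no trip
        self.head_pos = nn.Linear(128, n_positions)  # classifier 2: trip position

    def forward(self, x):                      # x: (N, 5, 3, H, M)
        in2 = torch.cat([x, self.pool1(self.conv1(x))], dim=1)
        out2 = self.se1(_halve(self.conv2(in2)))
        in3 = torch.cat([_halve(in2), out2], dim=1)
        out3 = self.se2(_halve(self.conv3(in3)))
        in4 = torch.cat([_halve(in3), out3], dim=1)
        out4 = self.se3(_halve(self.conv4(in4)))
        # output of the third SE layer and input of the fourth convolution layer
        # are jointly fed to the full-connection layers (after global pooling)
        feats = torch.cat([self.gap(in4).flatten(1), self.gap(out4).flatten(1)], dim=1)
        h = F.relu(self.fc2(F.relu(self.fc1(feats))))
        return self.head_trip(h), self.head_pos(h)
```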
The specific method of step S3 includes the following sub-steps:
S3-1, inputting the training samples in the training data into the three-dimensional convolutional neural network to obtain the output of the three-dimensional convolutional neural network;
S3-2, computing the focal loss FL_j of the output ŷ_j corresponding to the j-th input of the three-dimensional convolutional neural network according to the formula:
FL_j = −α · (1 − ŷ_j)^ω · y_j · log(ŷ_j) − (1 − α) · ŷ_j^ω · (1 − y_j) · log(1 − ŷ_j)
wherein y_j is the real label, y_j = 1 indicates that there is a trip at the current position, and y_j = 0 indicates that the current position does not trip; α is a balance factor, α ∈ (0, 1); ω is a modulation factor, ω ∈ [0, +∞);
S3-3, updating the convolution kernel parameters φ_i of the three-dimensional convolutional neural network according to the formula:
φ′_i = φ_i − γ · ∂FL_j / ∂φ_i
to obtain the updated parameters φ′_i, wherein γ is the learning rate of the training process and ∂ denotes the partial derivative;
S3-4, obtaining the fault detection rate FDR, the false alarm rate FAR and the total prediction accuracy PA of the current three-dimensional convolutional neural network according to the formulas:
FDR = FN / (FY + FN)
FAR = TN / (TY + TN)
PA = (FN + TY) / (FY + FN + TY + TN)
wherein FY and FN are the numbers of trip samples predicted to be normal and tripped, respectively; TY and TN are the numbers of normal samples predicted to be normal and tripped, respectively;
S3-5, judging whether the fault detection rate FDR, the false alarm rate FAR and the total prediction accuracy PA of the current three-dimensional convolutional neural network all reach their corresponding thresholds; if so, outputting the current three-dimensional convolutional neural network and finishing the training of the three-dimensional convolutional neural network; otherwise, returning to step S3-1.
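A compact sketch of the training quantities defined in the sub-steps above (focal loss, gradient update, and the FDR / FAR / PA stopping criteria) is given below, written against the binary trip / no-trip head of the architecture sketch above. The values of α, ω and the learning rate, the data loader, and the use of plain SGD are placeholders chosen only to mirror the formulas, not settings prescribed by the patent.

```python
import torch

def focal_loss(y_hat, y, alpha=0.25, omega=2.0, eps=1e-7):
    """Binary focal loss; y_hat is the predicted trip probability, y in {0, 1}."""
    y_hat = y_hat.clamp(eps, 1.0 - eps)
    loss = (-alpha * (1.0 - y_hat) ** omega * y * torch.log(y_hat)
            - (1.0 - alpha) * y_hat ** omega * (1.0 - y) * torch.log(1.0 - y_hat))
    return loss.mean()

def prediction_scores(pred, label):
    """FDR, FAR and PA from predicted and true trip labels (0 = normal, 1 = trip)."""
    fy = ((label == 1) & (pred == 0)).sum().item()   # trip samples predicted normal
    fn = ((label == 1) & (pred == 1)).sum().item()   # trip samples predicted trip
    ty = ((label == 0) & (pred == 0)).sum().item()   # normal samples predicted normal
    tn = ((label == 0) & (pred == 1)).sum().item()   # normal samples predicted trip
    fdr = fn / max(fy + fn, 1)
    far = tn / max(ty + tn, 1)
    pa = (fn + ty) / max(fy + fn + ty + tn, 1)
    return fdr, far, pa

def train_epoch(model, loader, gamma=1e-3):
    """One pass over the training data; `model` returns (trip_logits, pos_logits)."""
    opt = torch.optim.SGD(model.parameters(), lr=gamma)
    for x, y in loader:                              # x: (N, 5, 3, H, M), y: (N,)
        trip_logits, _ = model(x)
        y_hat = torch.softmax(trip_logits, dim=1)[:, 1]   # probability of a trip
        loss = focal_loss(y_hat, y.float())
        opt.zero_grad()
        loss.backward()                              # phi' = phi - gamma * dFL/dphi
        opt.step()

# After each epoch, stop once FDR, FAR and PA all reach chosen thresholds
# (FDR and PA high enough, FAR low enough); otherwise run another epoch.
```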
In one embodiment of the invention, the time step is taken as 5 minutes, the value of the monitoring range R in step S1-2 is 3 km or 5 km, and n is taken as 3. The lightning strike data in step S1 comprise the number of lightning strikes, the maximum lightning current amplitude, the average lightning current amplitude, the nearest lightning strike distance and the average lightning strike distance; the line trip data comprise the trip position and the number of trips. The input size of the three-dimensional convolutional neural network is therefore H × M × 3 × 5, where H × M is the number of grids into which the target area is divided by the 1 km × 1 km meshing, 3 is the selected number of time steps participating in prediction (i.e. the parameter n), and 5 is the total number of types of input lightning strike data.
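For the dimensions quoted in this embodiment, one sample can be laid out, for example, as a 5-channel tensor with 3 time steps over the H × M grid; the (batch, channel, depth, height, width) ordering below is an assumption chosen to match the Lightning3DCNN sketch above, and the 20 × 20 area is only an example.

```python
import torch

H, M = 20, 20                      # e.g. a 20 km x 20 km target area, 1 km grids
x = torch.randn(1, 5, 3, H, M)     # 5 lightning data types, 3 preceding time steps
# model = Lightning3DCNN(in_channels=5, n_positions=H * M)
# trip_logits, position_logits = model(x)   # per-sample trip and position scores
```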
The SE module effectively improves the generalization capability of the prediction model and accelerates model training while adding only a small number of parameters. Before entering the next convolution layer, the network takes not only the output of the previous layer but also the input of the previous convolution layer as input, which reduces the number of feature-map channels and greatly reduces the number of network parameters compared with an ordinary three-dimensional convolutional neural network. Finally, the extracted spatio-temporal features are classified: a hierarchical softmax function is adopted as the activation function (classifier) of the output layer, the first classifier computes whether the sample trips, the second classifier then computes the probability of the sample tripping at different positions, and the comprehensive judgment of the two levels is output as the prediction result.
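One way to read the two-level classification described above is to let the first classifier decide whether the sample trips at all and the second classifier distribute that probability over candidate positions. The product rule used below to combine the two softmax outputs is an illustrative assumption, since the text only states that the two levels are judged comprehensively.

```python
import torch

def combine_heads(trip_logits, position_logits):
    """Combine the two classifier outputs into per-position trip probabilities."""
    p_trip = torch.softmax(trip_logits, dim=1)[:, 1:2]        # P(trip)
    p_pos_given_trip = torch.softmax(position_logits, dim=1)  # P(position | trip)
    return p_trip * p_pos_given_trip                          # P(trip at position)
```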

Claims (7)

1. A line trip prediction method based on a lightning location system is characterized by comprising the following steps:
s1, performing space grid division on a target area, determining a time step, and acquiring line trip data in the time step t and lightning strike data of each space grid in at least n adjacent time steps before the time step t to obtain a group of training data; n is greater than or equal to 3;
s2, constructing a three-dimensional convolution neural network;
s3, training the three-dimensional convolution neural network by adopting training data to obtain a trained three-dimensional convolution neural network;
s4, lightning strike data of each space grid in at least n adjacent time steps before a target time step is obtained, and prediction basic data are obtained;
and S5, taking the prediction basic data as the input of the trained three-dimensional convolution neural network to obtain a line trip prediction result.
2. The lightning location system based line trip prediction method of claim 1, wherein the specific method of step S1 comprises the following sub-steps:
S1-1, initially dividing the target area into square grids with a side length of 1 km (1 km × 1 km) to obtain spatial grids;
S1-2, drawing a circle centred on the centre point of each spatial grid with the monitoring range R as the radius to obtain the monitoring area of each spatial grid;
S1-3, obtaining the line trip data within time step t in each monitoring area and the lightning strike data in at least n adjacent time steps before time step t in each monitoring area, to obtain the lightning strike data of each spatial grid;
S1-4, standardizing each type of lightning strike data according to the formula:
d′_i = (d_i − μ_i) / δ_i
wherein d_i is the i-th type of lightning strike data before standardization; d′_i is the standardized i-th type of lightning strike data; μ_i is the mean of the i-th type of lightning strike data; and δ_i is the standard deviation of the i-th type of lightning strike data;
S1-5, combining the standardized lightning strike data of the different types into a matrix to obtain a training sample, and taking the line trip data as the real label to obtain the training data.
3. The line trip prediction method based on the lightning location system according to claim 1 or 2, wherein the lightning strike data in step S1 comprise the number of lightning strikes, the maximum lightning current amplitude, the average lightning current amplitude, the nearest lightning strike distance and the average lightning strike distance;
the line trip data includes trip position and number of trips.
4. The lightning location system based line trip prediction method of claim 2, wherein the value of the monitoring range R in step S1-2 is 3 km or 5 km.
5. The lightning location system based line trip prediction method of claim 1, wherein the three-dimensional convolutional neural network in step S2 comprises:
the device comprises a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer, a first SE layer, a third convolution layer, a third pooling layer, a second SE layer, a fourth convolution layer, a fourth pooling layer, a third SE layer, a first full-connection layer, a second full-connection layer, a first classifier and a second classifier;
wherein the input of the first convolution layer is the input of the three-dimensional convolution neural network, and the input of the first pooling layer is the output of the first convolution layer; the input of the first convolution layer and the output of the first pooling layer are used together as the input of the second convolution layer; the second convolution layer, the second pooling layer and the first SE layer are connected in sequence; the output of the first SE layer and the input of the second convolutional layer are used together as the input of the third convolutional layer; the third convolution layer, the third pooling layer and the second SE layer are connected in sequence; the output of the second SE layer and the input of the third convolutional layer are jointly used as the input of a fourth convolutional layer; the fourth convolution layer, the fourth pooling layer and the third SE layer are connected in sequence; the output of the third SE layer and the input of the fourth convolution layer are jointly used as the input of the first full-connection layer; the first full connection layer, the second full connection layer, the first classifier and the second classifier are connected in sequence; the output of the second classifier is used as the output of the three-dimensional convolution neural network; each convolutional layer is activated by the ReLU (rectified linear unit) function.
6. The lightning location system-based line trip prediction method of claim 5, wherein the numbers of convolution kernels of the first to fourth convolution layers are 64, 128, 256 and 512, respectively; the convolution kernel dimension is 3 × 3; the padding and stride parameters are both 1 × 1; the stride parameter of the first pooling layer is 1 × 1; and the stride parameters of the second and third pooling layers are each 2 × 2.
7. The lightning location system based line trip prediction method of claim 1, wherein the specific method of step S3 comprises the following sub-steps:
S3-1, inputting the training samples in the training data into the three-dimensional convolutional neural network to obtain the output of the three-dimensional convolutional neural network;
S3-2, computing the focal loss FL_j of the output ŷ_j corresponding to the j-th input of the three-dimensional convolutional neural network according to the formula:
FL_j = −α · (1 − ŷ_j)^ω · y_j · log(ŷ_j) − (1 − α) · ŷ_j^ω · (1 − y_j) · log(1 − ŷ_j)
wherein y_j is the real label, y_j = 1 indicates that there is a trip at the current position, and y_j = 0 indicates that the current position does not trip; α is a balance factor, α ∈ (0, 1); ω is a modulation factor, ω ∈ [0, +∞);
S3-3, updating the convolution kernel parameters φ_i of the three-dimensional convolutional neural network according to the formula:
φ′_i = φ_i − γ · ∂FL_j / ∂φ_i
to obtain the updated parameters φ′_i, wherein γ is the learning rate of the training process and ∂ denotes the partial derivative;
S3-4, obtaining the fault detection rate FDR, the false alarm rate FAR and the total prediction accuracy PA of the current three-dimensional convolutional neural network according to the formulas:
FDR = FN / (FY + FN)
FAR = TN / (TY + TN)
PA = (FN + TY) / (FY + FN + TY + TN)
wherein FY and FN are the numbers of trip samples predicted to be normal and tripped, respectively; TY and TN are the numbers of normal samples predicted to be normal and tripped, respectively;
S3-5, judging whether the fault detection rate FDR, the false alarm rate FAR and the total prediction accuracy PA of the current three-dimensional convolutional neural network all reach their corresponding thresholds; if so, outputting the current three-dimensional convolutional neural network and finishing the training of the three-dimensional convolutional neural network; otherwise, returning to step S3-1.
CN202210544447.XA 2022-05-18 2022-05-18 Line tripping prediction method based on lightning positioning system Active CN115219810B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210544447.XA CN115219810B (en) 2022-05-18 2022-05-18 Line tripping prediction method based on lightning positioning system


Publications (2)

Publication Number Publication Date
CN115219810A true CN115219810A (en) 2022-10-21
CN115219810B CN115219810B (en) 2023-06-20

Family

ID=83607879


Country Status (1)

Country Link
CN (1) CN115219810B (en)



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019001209A1 (en) * 2017-06-28 2019-01-03 苏州比格威医疗科技有限公司 Classification algorithm for retinal oct image based on three-dimensional convolutional neural network
CN108108877A (en) * 2017-11-29 2018-06-01 海南电网有限责任公司电力科学研究院 A kind of transmission line of electricity damage to crops caused by thunder methods of risk assessment based on BP neural network
US20190220699A1 (en) * 2018-01-15 2019-07-18 Gyrfalcon Technology Inc. System and method for encoding data in an image/video recognition integrated circuit solution
CN108717568A (en) * 2018-05-16 2018-10-30 陕西师范大学 A kind of image characteristics extraction and training method based on Three dimensional convolution neural network
CN109102502A (en) * 2018-08-03 2018-12-28 西北工业大学 Pulmonary nodule detection method based on Three dimensional convolution neural network
US11181634B1 (en) * 2018-09-28 2021-11-23 Rockwell Collins, Inc. Systems and methods of intelligent weather sensing using deep learning convolutional neural networks
WO2021057328A1 (en) * 2019-09-24 2021-04-01 上海数创医疗科技有限公司 St segment classification convolutional neural network based on feature selection and method for using same
CN114118232A (en) * 2021-11-08 2022-03-01 北京智芯微电子科技有限公司 Intelligent ammeter fault prediction method based on time-space convolution neural network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
文元美 et al.: "Dual-channel convolutional neural network based on high- and low-dimensional feature fusion", page 101 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115967179A (en) * 2022-12-17 2023-04-14 国家电网有限公司 Lightning arrester action data monitoring and analysis platform
CN117217546A (en) * 2023-11-08 2023-12-12 合肥工业大学 Power transmission line lightning trip prediction model, method, system and storage medium
CN117217546B (en) * 2023-11-08 2024-01-12 合肥工业大学 Power transmission line lightning trip prediction model, method, system and storage medium

Also Published As

Publication number Publication date
CN115219810B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
CN115219810B (en) Line tripping prediction method based on lightning positioning system
CN103207340B (en) On-line transmission line lightning shielding failure trip early-warning method
CN103778476B (en) Method for monitoring and predicting galloping of a transmission line in real time based on video analysis
CN110908014B (en) Galloping refined correction forecasting method and system
CN111897030A (en) Thunderstorm early warning system and method
CN114862278B (en) Power transmission line lightning stroke risk assessment method and system based on distribution network lightning stroke data
CN106507315A (en) A kind of urban traffic accident Forecasting Methodology and system based on network social intercourse media data
US11105958B2 (en) Systems and methods for distributed-solar power forecasting using parameter regularization
Dokic et al. Risk assessment of a transmission line insulation breakdown due to lightning and severe weather
CN107067683B (en) A kind of transmission line forest fire clusters quantitative forecast method and system
Gu et al. Study on lightning risk assessment and early warning for UHV DC transmission channel
CN105279612A (en) Poisson distribution-based power transmission line tripping risk assessment method
CN104599023A (en) Typhoon weather transmission line time-variant reliability calculation method and risk evaluation system
CN112886923B (en) Photovoltaic power station operation and maintenance method and device in thunder and lightning weather
CN112149887A (en) PM2.5 concentration prediction method based on data space-time characteristics
Bao et al. Resilience-oriented transmission line fragility modeling and real-time risk assessment of thunderstorms
CN108596514A (en) Power equipment mixing Weibull Reliability Modeling based on fuzzy genetic algorithm
Gao et al. Heuristic failure prediction model of transmission line under natural disasters
Bao et al. Lightning performance evaluation of transmission line based on data-driven lightning identification, tracking, and analysis
CN103093044A (en) Electric transmission line icing galloping distribution diagram surveying and mapping method
CN114357670A (en) Power distribution network power consumption data abnormity early warning method based on BLS and self-encoder
Vahidi et al. Determining arresters best positions in power system for lightning shielding failure protection using simulation optimization approach
JP2003090887A (en) Predication system and prediction method of instantaneous voltage drop by thunderbolt
CN112257329A (en) Method for judging influence of typhoon on line
Su et al. Lightning Trip Warning Based on GA-BP Neural Network Technology

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant