CN117523605B - Substation animal intrusion detection method based on multi-sensor information fusion - Google Patents
Substation animal intrusion detection method based on multi-sensor information fusion
- Publication number: CN117523605B (application CN202311457127.1A; also published as CN117523605A)
- Authority: CN (China)
- Prior art keywords: similarity matrix, probability distribution, distribution function, data, matrix
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V40/10 — Recognition of biometric, human-related or animal-related patterns in image or video data; human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
- G06F17/11 — Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
- G06V10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
- G06V10/761 — Image or video pattern matching; proximity, similarity or dissimilarity measures in feature spaces
- G06V10/86 — Pattern recognition or machine learning using syntactic or structural representations of the image or video pattern, e.g. symbolic string recognition; graph matching
- G06V20/52 — Scenes; scene-specific elements; surveillance or monitoring of activities, e.g. for recognising suspicious objects
Abstract
The invention relates to the technical field of small-animal intrusion detection in substations, and in particular to a substation small-animal intrusion detection method based on multi-sensor information fusion. Image, video and ultrasonic data are collected, and the two classes of data sources (image/video and ultrasonic) are each fused. Morphological features are then extracted from the fused image data and time-frequency features from the fused ultrasonic data, and a biological identification matrix is constructed for each. The similarity matrices of the two data classes are computed with a Gaussian kernel function, a fuzzy attribute judgment function yields the attribute judgment results, the probability of each result and a basic probability distribution function for each data class, and finally the probability distribution functions are fused with the D-S fusion rule to obtain the final detection result.
Description
Technical Field
The invention relates to the technical field of substation animal intrusion detection, in particular to a substation animal intrusion detection method based on multi-sensor information fusion.
Background
With the development of the power system, the substation, as one of its important components, plays a key role in the conversion, transmission and distribution of electric energy. However, substations often face the problem of small-animal intrusion, for example by birds and rodents. These small animals may enter the substation and cause serious consequences such as equipment failure, short circuits and fire, and may even cause grid accidents. Therefore, timely and accurate detection and prevention of small-animal intrusion is important for ensuring the safe and stable operation of the power system.
At present, for the problem of small-animal intrusion detection in substations, the main methods based on multi-sensor information fusion are fusion of an infrared sensor with a sound sensor, fusion of video monitoring with an infrared sensor, and fusion of a sound sensor with a vibration sensor. In the first two fusion schemes, the infrared sensor may be affected by the ambient temperature and trigger false alarms; in the third, interference from environmental noise may cause false alarms or missed alarms. In other words, the fusion in the prior art combines two broad classes of sensors, and one of the sensors is easily affected by the external environment, which degrades the detection data and thereby reduces the accuracy of intrusion detection.
Therefore, research into a high-accuracy method for detecting small-animal intrusion in substations is of great significance.
Disclosure of Invention
The invention aims to provide a substation animal intrusion detection method based on multi-sensor information fusion, so as to solve the problem that in existing fusion detection methods the infrared sensor and the sound sensor are easily affected by the environment, which results in low animal intrusion detection accuracy.
In order to solve the above technical problem, the invention provides a substation animal intrusion detection method based on multi-sensor information fusion, which comprises the following steps: collecting video data, image data and ultrasonic data obtained by the sensors; performing image registration on the video data and the image data, and fusing the registered video data and image data to obtain fused image data; extracting morphological features from the fused image data, constructing a first biological identification matrix from the extracted morphological features, and calculating a first similarity matrix based on the first biological identification matrix; fusing the ultrasonic data at the waveform level to obtain fused waveform data; extracting time-frequency features from the fused waveform data, constructing a second biological recognition matrix from the extracted time-frequency features, and calculating a second similarity matrix based on the second biological recognition matrix; mapping the first similarity matrix and the second similarity matrix to fuzzy attribute judgment functions for probability judgment processing, to obtain a first basic probability distribution function and a second basic probability distribution function; and fusing the first basic probability distribution function and the second basic probability distribution function based on the D-S fusion rule to obtain the final detection result.
In one embodiment, in performing image registration processing on video data and image data, performing image data fusion on the registered video data and image data to obtain fused image data, the method specifically includes the following steps: converting the acquired video data into picture data, and performing image registration on the converted picture data and the acquired image data to obtain registered image data; and carrying out data fusion on registered image data of a plurality of sensors at the same time by using a generating countermeasure network.
In one embodiment, the step of extracting morphological features from the fused image data, constructing a first biological identification matrix from the extracted morphological features and calculating a first similarity matrix based on the first biological identification matrix specifically comprises the following steps: extracting morphological features from the fused image data of the different time periods, to obtain the morphological features of each time period; constructing a first biological identification matrix from the morphological features of the same time period; calculating the similarity between the first biological identification matrices of different time periods using a Gaussian kernel function to obtain the first similarity matrix; wherein the morphological features extracted for each time period include at least the body length, body contour, tail shape, color, tail length, and the ratio of tail length to body length.
In one embodiment, in the step of denoising the ultrasonic data, performing waveform data fusion on the denoised ultrasonic data to obtain fused waveform data, the method specifically includes the following steps: denoising the ultrasonic data to obtain denoised waveform data; respectively converting the de-noised waveform data of different sensors into a plurality of waveform matrixes; converting the plurality of waveform matrixes into a single waveform matrix, and drawing a time domain waveform diagram based on the single waveform matrix.
In one embodiment, in the step of extracting the time-frequency characteristics of the fused waveform data, constructing a second biological recognition matrix by using the extracted time-frequency characteristics, and calculating to obtain a second similarity matrix based on the second biological recognition matrix, the method specifically comprises the following steps: respectively carrying out a plurality of transformation processes on the fused waveform data with different time periods to obtain a plurality of frequency spectrums, energy spectrums and power spectrums with different time periods; respectively extracting time-frequency characteristics of the frequency spectrum, the energy spectrum and the power spectrum with the same time period to obtain time-frequency characteristics; constructing a second biological recognition matrix for the time-frequency characteristics with the same time period to obtain the second biological recognition matrix; calculating the similarity between the second biological recognition matrixes with different time periods by using a Gaussian kernel function to obtain a second similarity matrix; wherein the extracted time-frequency characteristics of each time period at least comprise time delay, amplitude, frequency, doppler frequency shift, signal reflection length, energy and power.
In one embodiment, the probability judgment processing performed by mapping the first similarity matrix and the second similarity matrix to the fuzzy attribute judgment function respectively, so as to obtain a first basic probability distribution function and a second basic probability distribution function, specifically comprises the following steps: mapping each element in the first similarity matrix and the second similarity matrix to a fuzzy attribute judgment function respectively for probability judgment processing, so as to obtain an attribute judgment value for each element of the first and second similarity matrices; mapping the attribute judgment values of the first similarity matrix and the second similarity matrix into probability values by using an ambiguity function, to obtain the probabilities of the judgment results; and obtaining a first basic probability distribution function and a second basic probability distribution function based on the probabilities of the judgment results.
In one embodiment, before fusing the first basic probability distribution function and the second basic probability distribution function based on the D-S fusion rule, the method further comprises the following steps: judging whether the first basic probability distribution function and the second basic probability distribution function meet the improvement condition or not; and after the improvement condition is met, performing reliability improvement processing on the first basic probability distribution function and the second basic probability distribution function to obtain an improved first basic probability distribution function and an improved second basic probability distribution function.
In one embodiment, the improvement condition is:

m: 2^θ → [0, 1], with m(∅) = 0 and Σ_{A⊆θ} m(A) = 1

where m is a basic probability distribution function on θ and θ = {H_1, H_2, H_3, H_4}; H_1 (correct): the presence of the preset animal is correctly detected; H_2 (false alarm): a preset animal that is absent is identified as present; H_3 (missed report): an actually present preset animal is not correctly identified; H_4 (misjudgment): another object is identified as the preset animal; A is a proposition made up of one or more elements of the identification framework θ; m(A) represents the degree to which the evidence supports proposition A; and ∅ represents the empty set.
In one embodiment, the reliability improvement processing performed on the first basic probability distribution function and the second basic probability distribution function to obtain the improved first basic probability distribution function and the improved second basic probability distribution function specifically comprises the following steps: calculating the correlation between the first similarity matrix and the second similarity matrix using the Pearson correlation coefficient, to obtain a credibility similarity matrix between the first similarity matrix and the second similarity matrix; calculating the credibility of the image data and the credibility of the waveform data based on the credibility similarity matrix; multiplying the credibility of the image data by the first basic probability distribution function to obtain the improved first basic probability distribution function; and multiplying the credibility of the waveform data by the second basic probability distribution function to obtain the improved second basic probability distribution function.
In one embodiment, in the step of calculating the credibility of the image data and the credibility of the waveform data based on the credibility similarity matrix,

the elements of the credibility similarity matrix are given by the Pearson correlation coefficient:

s_ij(D, L) = cov(D, L) / (σ_D · σ_L) = E[(D − μ_D)(L − μ_L)] / (σ_D · σ_L)

where D represents the first similarity matrix, L represents the second similarity matrix, cov(D, L) represents the covariance of the first and second similarity matrices, σ_D represents the standard deviation of the first similarity matrix, σ_L represents the standard deviation of the second similarity matrix, E represents the mathematical expectation, μ_D is the mathematical expectation of the first similarity matrix, and μ_L is the mathematical expectation of the second similarity matrix;

the credibility α_1 of the image data and the credibility α_2 of the waveform data are then calculated from the elements s_ij(D, L) of the credibility similarity matrix, i.e. from the correlation coefficients between the first similarity matrix and the second similarity matrix.
The beneficial effects of the invention are as follows:
By combining multiple sensors in the substation, image, video and ultrasonic data are collected, and the image/video data and the ultrasonic data are each fused. Morphological features are extracted from the image data and time-frequency features from the ultrasonic data, and a biological identification matrix is constructed for each. The similarity matrices of the two data classes are calculated with a Gaussian kernel function, a fuzzy attribute judgment function yields the attribute judgment results, the probability of each result and the probability distribution function of each data class, and finally the probability distribution functions are fused with the D-S fusion rule to obtain the final detection result.
Drawings
In order to more clearly illustrate the technical solutions of the present invention, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present invention; other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of an overall method provided by a preferred embodiment of the present invention;
FIG. 2 is a flow chart of a method for image fusion using a least squares generation countermeasure network (LSGAN) provided in accordance with a preferred embodiment of the present invention;
FIG. 3 is a flow chart of an overall method provided by a second preferred embodiment of the present invention;
FIG. 4 is an application scenario diagram of a preferred embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
With the development of a power system, a transformer substation plays a key role in electric energy conversion, transmission and distribution as an important component of the power system. However, substations often face problems with small animal invasion, such as birds, rodents, etc. These animals may enter the transformer station, causing serious consequences such as equipment failure, short circuit, fire disaster, etc., and even causing power grid accidents, so timely and accurate detection and prevention of the invasion of the animals are critical for ensuring safe and stable operation of the power system.
At present, for the problem of small-animal intrusion detection in substations, a number of methods have been proposed. Multi-sensor information fusion can exploit the advantages of different sensors to improve detection accuracy, and the prior-art fusion-based methods mainly comprise fusion of an infrared sensor with a sound sensor, fusion of video monitoring with an infrared sensor, and fusion of a sound sensor with a vibration sensor. The infrared sensor can detect the temperature change of a small animal and the sound sensor can detect its sound, and fusing the information of the two improves the accuracy and reliability of detection; however, the infrared sensor may be affected by the ambient temperature and trigger false alarms. In the fusion of the sound sensor with the vibration sensor, the sound sensor detects the animal's sound and the vibration sensor detects the vibration the animal produces on the equipment, and fusing the two likewise improves the accuracy of intrusion detection; however, interference from environmental noise may cause false alarms or missed alarms. These fusion methods therefore share the same problem: they are prone to external interference, which makes the intrusion detection inaccurate.
In order to solve the problem that the infrared sensor, the sound sensor and similar sensors are easily affected by the environment, which makes the small-animal intrusion judgment inaccurate, this scheme provides a substation animal intrusion detection method based on multi-sensor information fusion. The flow of the method of the first embodiment is shown in FIG. 1 and a specific application scenario in FIG. 4; the method specifically comprises the following steps.
In the embodiment of the application, video data, image data and ultrasonic data obtained by several sensors are first collected; specifically, the collected data are mouse images, video data and ultrasonic data in the substation environment. Using video, image and ultrasonic data effectively avoids the problem that sound data or infrared data are easily affected by the external environment.
In the embodiment of the application, the video data and the image data are subjected to image registration, and the registered video data and image data are fused to obtain the fused image data. Once the video data and the image data are fused, the two kinds of image data can cross-verify each other, which improves the accuracy of the data and avoids the situation in which detection cannot proceed when either the image sensor or the video sensor is disturbed.
Preferably, the process of fusing image data includes the steps of:
Converting the acquired video data into picture data through video editing software, and performing image registration on the converted picture data and the acquired image data together to obtain registered image data;
The registered image data from the several sensors at the same moment (i.e. at the same time and in the same space) are fused using a least squares generative adversarial network (LSGAN), whose loss functions are:

min_D V(D) = (1/2) · E_{x~p(x)}[(D(x) − 1)²] + (1/2) · E_{z~p(z)}[(D(G(z)))²]
min_G V(G) = (1/2) · E_{z~p(z)}[(D(G(z)) − 1)²]

where D is the discriminator, G is the generator, z is noise following a standard (e.g. uniform or Gaussian) distribution, p(x) is the probability distribution obeyed by the real data x, and E_{x~p(x)} and E_{z~p(z)} denote expected values.
It should be noted that the process of data fusion with the least squares generative adversarial network (LSGAN) is shown in FIG. 2: first the maximum number of iterations and the parameters of D and G are initialized; then the noise z is fed into G to generate a fake image; next, D is updated using the real image and the fake image; then G is updated using D's loss on the fake image; this is repeated until the maximum number of iterations is reached.
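For readers who want to experiment with this fusion step, the following is a minimal, self-contained sketch of an LSGAN training loop in Python (PyTorch). The toy generator and discriminator, the 64×64 image size, the random batches and the learning rate are illustrative assumptions and are not the networks prescribed by the patent; only the least-squares losses and the update order follow the description above.

```python
# Minimal LSGAN training loop sketch (PyTorch). Architectures and data are placeholders.
import torch
import torch.nn as nn

class G(nn.Module):                      # toy generator: noise -> 64x64 image
    def __init__(self, z_dim=100):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 64 * 64), nn.Tanh())
    def forward(self, z):
        return self.net(z).view(-1, 1, 64, 64)

class D(nn.Module):                      # toy discriminator: image -> real-valued score
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 1))
    def forward(self, x):
        return self.net(x)

def train_lsgan(real_batches, z_dim=100, max_iter=1000, lr=2e-4):
    g, d = G(z_dim), D()
    opt_g = torch.optim.Adam(g.parameters(), lr=lr)
    opt_d = torch.optim.Adam(d.parameters(), lr=lr)
    for _, real in zip(range(max_iter), real_batches):
        z = torch.randn(real.size(0), z_dim)
        fake = g(z)
        # update D: least-squares loss, real images toward 1, fake images toward 0
        loss_d = 0.5 * ((d(real) - 1) ** 2).mean() + 0.5 * (d(fake.detach()) ** 2).mean()
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # update G: push D's score on generated images toward 1
        loss_g = 0.5 * ((d(g(z)) - 1) ** 2).mean()
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return g

# placeholder batches; in practice these would be registered substation images
real_batches = [torch.randn(8, 1, 64, 64) for _ in range(20)]
generator = train_lsgan(real_batches, max_iter=20)
```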
In the embodiment of the application, the morphological feature extraction is carried out on the fused image data, the extracted morphological feature is utilized to construct a first biological recognition matrix, the first similarity matrix is obtained by calculation based on the first biological recognition matrix, and after the morphological feature extraction, the basic physiological feature of the detected object can be determined, so that the detection reliability is ensured.
Preferably, the process of calculating the first similarity matrix comprises the steps of:
Morphological features are extracted from the fused image data of each time period, giving the morphological features of the different time periods; the morphological features extracted for each time period include at least the body length l, the body contour s, the tail shape n, the color c, the tail length x and the ratio z of tail length to body length.
A first biological identification matrix D_n is constructed from the morphological features of the same time period (where n denotes one period within the whole time region and k is the number of samples in that period). That is, the time region is divided into several time periods, and a first biological identification matrix D_n is constructed for each period from the morphological features of that period:

D_n =
[ l_1  s_1  n_1  c_1  x_1  z_1 ]
[ l_2  s_2  n_2  c_2  x_2  z_2 ]
[ …    …    …    …    …    …   ]
[ l_k  s_k  n_k  c_k  x_k  z_k ]
The similarity between the first biological identification matrices D_n of different time periods is calculated with a Gaussian kernel function to obtain the first similarity matrix S_D, with the specific formula:

d_ij = exp( −‖D_i − D_j‖² / (2σ²) ),  i, j = 1, 2, …, n

where D_i and D_j (i, j = 1, 2, …, n) are sample matrices, i.e. the first biological identification matrices constructed above, ‖D_i − D_j‖ denotes the Euclidean distance, σ is the parameter of the Gaussian kernel function, and the element d_ij of the matrix S_D is the similarity between the sample matrices D_i and D_j.
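A short Python sketch of this step is given below: each period's morphological features are stacked into a matrix and the periods are compared with the Gaussian kernel above. The feature values are random placeholders and σ is an arbitrary choice; only the kernel computation itself follows the formula.

```python
# Gaussian-kernel similarity between per-period morphological feature matrices (sketch).
import numpy as np

def gaussian_similarity(mats, sigma=1.0):
    n = len(mats)
    s = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            dist2 = np.sum((mats[i] - mats[j]) ** 2)      # squared Euclidean distance
            s[i, j] = np.exp(-dist2 / (2 * sigma ** 2))   # Gaussian kernel
    return s

# three time periods, k = 4 samples each, 6 morphological features per sample (placeholders)
rng = np.random.default_rng(0)
D_periods = [rng.random((4, 6)) for _ in range(3)]
S_D = gaussian_similarity(D_periods, sigma=2.0)
print(S_D)   # 3x3 first similarity matrix, diagonal equal to 1
```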
In the embodiment of the application, waveform data fusion is performed on the ultrasonic data to obtain the fused waveform data. Waveform data extracted with ultrasound have the advantages of high sensitivity, high speed and low cost, and can be used to locate and quantify the object to be detected; combined with the morphological features, they provide a double guarantee for the morphological judgment and improve the accuracy of determining the type of the detected object.
Preferably, the process of waveform data fusion includes the steps of:
The ultrasonic data are denoised to obtain denoised waveform data; removing the noise from the ultrasonic data effectively improves its accuracy.
The waveform data collected and denoised by the different sensors over the same time span are converted into several waveform matrices of the form:

M_j =
[ t_0   t_1   …  t_n  ]
[ y_j0  y_j1  …  y_jn ]

where M_1, M_2, …, M_j (j = 1, 2, …, n) are the matrices obtained by converting the waveform data of each sensor, t_0, t_1, …, t_n is the time sequence, and y_jn is the amplitude of the j-th waveform at time t_n.
The several waveform matrices are then converted into a single waveform matrix M, and a time-domain waveform diagram is drawn based on M:

M =
[ y_10  y_11  …  y_1n ]
[ y_20  y_21  …  y_2n ]
[ …     …     …  …    ]
[ y_n0  y_n1  …  y_nn ]

where each row of the matrix M represents one waveform curve.
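The following Python sketch illustrates the waveform fusion step under stated assumptions: the sampling rate, the simulated sensor signals and the moving-average denoising are placeholders; the essential point is stacking the denoised channels into a single matrix M (one row per waveform) and drawing the time-domain diagram.

```python
# Waveform data fusion sketch: denoise per-sensor signals, stack them, plot time domain.
import numpy as np
import matplotlib.pyplot as plt

fs = 200_000                                    # assumed 200 kHz sampling rate
t = np.arange(0, 0.005, 1 / fs)                 # common time axis t_0 ... t_n
raw = [np.sin(2 * np.pi * 40_000 * t) + 0.1 * np.random.randn(t.size)
       for _ in range(3)]                       # three simulated ultrasonic channels

def moving_average(x, w=5):                     # toy stand-in for the denoising step
    return np.convolve(x, np.ones(w) / w, mode="same")

M = np.vstack([moving_average(x) for x in raw])  # single matrix, one row per waveform
for row in M:
    plt.plot(t, row, alpha=0.7)
plt.xlabel("time (s)"); plt.ylabel("amplitude"); plt.title("fused time-domain waveforms")
plt.show()
```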
In the embodiment of the application, the time-frequency characteristic extraction is carried out on the fused waveform data, a second biological recognition matrix is constructed by utilizing the extracted time-frequency characteristic, and a second similarity matrix is obtained by calculation based on the second biological recognition matrix.
Preferably, the calculation process of the second similarity matrix includes the following steps:
Respectively carrying out a plurality of transformation processes on the fused waveform data (namely the time domain waveform diagram) with different time periods to obtain a plurality of frequency spectrums, energy spectrums and power spectrums with different time periods;
The spectrum is obtained by the fast Fourier transform (FFT):

X[k] = Σ_{n=0}^{N−1} x[n] · e^{−j·2πkn/N},  k = 0, 1, …, N−1

where X[k] is the spectrum and x[n] is the discrete time-domain signal.
The energy spectrum is obtained by squaring the magnitude of the Fourier transform:

E[k] = |X[k]|²
The power spectrum is obtained as the Fourier transform of the autocorrelation function:

S_x(f) = ∫ R_x(τ) · e^{−j·2πfτ} dτ

where R_x(τ) represents the autocorrelation function.
Respectively extracting time-frequency characteristics of the frequency spectrum, the energy spectrum and the power spectrum with the same time period to obtain time-frequency characteristics, wherein the time-frequency characteristics extracted in each time period at least comprise time delay t, amplitude w, frequency f, doppler frequency shift b, signal reflection length lambda, energy e and power q;
A second biological recognition matrix L_n is constructed from the time-frequency features of the same time period (where n denotes one period within the whole time region and k is the number of samples in that period). That is, the time region is divided into several time periods, and a second biological recognition matrix L_n is constructed for each period from the time-frequency features of that period:

L_n =
[ t_1  w_1  f_1  b_1  λ_1  e_1  q_1 ]
[ t_2  w_2  f_2  b_2  λ_2  e_2  q_2 ]
[ …    …    …    …    …    …    …   ]
[ t_k  w_k  f_k  b_k  λ_k  e_k  q_k ]
The similarity between the second biological recognition matrices of different time periods is calculated with a Gaussian kernel function to obtain the second similarity matrix S_L, with the calculation formula:

l_ij = exp( −‖L_i − L_j‖² / (2σ²) )

where L_i and L_j are sample matrices, ‖L_i − L_j‖ is the Euclidean distance, σ is the parameter of the Gaussian kernel function, and the element l_ij of the matrix S_L is the similarity between the sample matrices L_i and L_j.
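The sketch below walks through these steps in Python on simulated signals: the spectrum is obtained with the FFT, the energy spectrum as |X[k]|², the power spectrum via the autocorrelation, a reduced set of time-frequency features is collected into an L_n matrix per period, and the periods are compared with the same Gaussian kernel as before. The signals, the feature subset and σ are illustrative assumptions; in practice the features would be normalised and σ tuned.

```python
# Time-frequency feature extraction and second similarity matrix (sketch).
import numpy as np

def spectra(x, fs):
    X = np.fft.rfft(x)                                     # spectrum X[k]
    f = np.fft.rfftfreq(x.size, 1 / fs)
    energy = np.abs(X) ** 2                                # energy spectrum |X[k]|^2
    r = np.correlate(x, x, mode="full")[x.size - 1:]       # autocorrelation R_x(tau)
    power = np.abs(np.fft.rfft(r)) / x.size                # power spectrum via FFT of R_x
    return f, X, energy, power

def tf_features(x, fs):
    f, X, energy, power = spectra(x, fs)
    k = int(np.argmax(np.abs(X)))
    return [float(np.max(np.abs(x))),                      # amplitude
            float(f[k]),                                   # dominant frequency
            float(np.sum(energy)),                         # energy
            float(np.mean(power))]                         # power

fs = 200_000
rng = np.random.default_rng(1)
# k = 4 waveform samples per period and 3 periods; each row of L_n holds one sample's features
L_periods = [np.array([tf_features(rng.standard_normal(1024), fs) for _ in range(4)])
             for _ in range(3)]

sigma = 5.0                                                # arbitrary; features not normalised here
S_L = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        S_L[i, j] = np.exp(-np.sum((L_periods[i] - L_periods[j]) ** 2) / (2 * sigma ** 2))
print(S_L)                                                 # 3x3 second similarity matrix
```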
In the embodiment of the application, a first similarity matrix and a second similarity matrix are respectively mapped to a fuzzy attribute decision function to carry out probability decision processing, so as to obtain a first basic probability distribution function and a second basic probability distribution function;
preferably, in the process of obtaining the first basic probability distribution function and the second basic probability distribution function, the method includes the following steps:
Mapping each element in the first similarity matrix and the second similarity matrix to a fuzzy attribute judgment function respectively to carry out probability judgment processing, so as to obtain an attribute judgment value of each element in the first similarity matrix and the second similarity matrix;
The attribute judgment values of the first similarity matrix and the second similarity matrix are mapped into probability values using an ambiguity function, giving the probabilities of the judgment results (P_11, P_12, P_13, P_14) and (P_21, P_22, P_23, P_24), from which the judgment table below is constructed.
TABLE 1 Attribute judgment results and probabilities

| Attribute judgment | H_1 (correct) | H_2 (false alarm) | H_3 (missed report) | H_4 (misjudgment) |
|---|---|---|---|---|
| Image and video data | P_11 | P_12 | P_13 | P_14 |
| Ultrasonic data | P_21 | P_22 | P_23 | P_24 |

In the table, P_11 is the probability that the judgment on the image and video data is correct, P_12 the probability of a false alarm, P_13 the probability of a missed report and P_14 the probability of a misjudgment for the image and video data; P_21 is the probability that the judgment on the ultrasonic data is correct, P_22 the probability of a false alarm, P_23 the probability of a missed report and P_24 the probability of a misjudgment for the ultrasonic data.
Based on the probabilities of the judgment results, a first basic probability distribution function m_1 and a second basic probability distribution function m_2 are obtained.
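Because the patent does not reproduce the fuzzy attribute judgment function itself, the sketch below only illustrates the shape of this step: a similarity matrix is condensed to one score, heuristic memberships over {H_1, H_2, H_3, H_4} are derived from it, and the result is normalised into a basic probability assignment. The membership rules and the example similarity values are assumptions, not the patent's formulas.

```python
# Hypothetical sketch: from a similarity matrix to a basic probability assignment (BPA).
import numpy as np

def fuzzy_bpa(similarity_matrix):
    s = float(np.mean(similarity_matrix))          # one summary similarity score in [0, 1]
    memberships = {"H1": s,                        # high similarity -> likely correct detection
                   "H2": (1 - s) * 0.4,            # heuristic split of the remaining mass
                   "H3": (1 - s) * 0.4,
                   "H4": (1 - s) * 0.2}
    total = sum(memberships.values())
    return {h: v / total for h, v in memberships.items()}   # masses sum to 1

# illustrative similarity matrices standing in for S_D and S_L
S_D = np.array([[1.0, 0.8, 0.6], [0.8, 1.0, 0.7], [0.6, 0.7, 1.0]])
S_L = np.array([[1.0, 0.7, 0.5], [0.7, 1.0, 0.6], [0.5, 0.6, 1.0]])
m1, m2 = fuzzy_bpa(S_D), fuzzy_bpa(S_L)
print(m1)
print(m2)
```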
In the embodiment of the application, the first basic probability distribution function m_1 and the second basic probability distribution function m_2 are fused based on the D-S fusion rule to obtain the final detection result.
Preferably, the process of obtaining the detection result specifically includes the following steps:
An identification framework is established according to the attribute judgment results:

θ = {H_1, H_2, H_3, H_4}

where H_1 (correct): the presence of a mouse is correctly detected; H_2 (false alarm): a mouse that is absent is identified as present; H_3 (missed report): an actually present mouse is not correctly identified; H_4 (misjudgment): another object is identified as a mouse.
The fusion is carried out according to the following fusion rule to obtain the final detection result. For propositions A, B, C ⊆ θ:

m(A) = (1 / (1 − K)) · Σ_{B∩C=A} m_1(B) · m_2(C),  for A ≠ ∅, and m(∅) = 0

where K is the normalization (conflict) constant:

K = Σ_{B∩C=∅} m_1(B) · m_2(C)
m(H_1) is calculated through the D-S fusion rule and a suitable threshold β is set; when m(H_1) ≥ β, the presence of the mouse is considered to be correctly detected.
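A small Python sketch of the combination and threshold test follows. It implements the standard Dempster rule for the four mutually exclusive singleton hypotheses; the example mass values and the threshold β = 0.6 are illustrative assumptions.

```python
# Dempster-Shafer combination over four singleton hypotheses, then the threshold test.
def ds_combine(m1, m2):
    hyps = list(m1)
    K = sum(m1[a] * m2[b] for a in hyps for b in hyps if a != b)   # conflicting mass
    if K >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    return {h: m1[h] * m2[h] / (1 - K) for h in hyps}

m1 = {"H1": 0.70, "H2": 0.10, "H3": 0.10, "H4": 0.10}   # illustrative BPAs
m2 = {"H1": 0.60, "H2": 0.15, "H3": 0.15, "H4": 0.10}
m = ds_combine(m1, m2)
beta = 0.6                                               # assumed decision threshold
print(m)
if m["H1"] >= beta:
    print("mouse intrusion correctly detected, m(H1) = %.3f" % m["H1"])
```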
With this method, several sensors in the substation, namely video, image and ultrasonic sensors, collect image data, video data and ultrasonic data. The two classes of data sources are each fused, morphological features are extracted from the image data and time-frequency features from the ultrasonic data, and a biological identification matrix is constructed for each. The similarity matrices of the two data classes are calculated with a Gaussian kernel function, a fuzzy attribute judgment function gives the attribute judgment results, the probability of each result and the probability distribution function of each data class, and finally the probability distribution functions are fused with the D-S fusion rule to obtain the final detection result.
Example two
The second embodiment of the present solution is substantially identical to the first, except that, before the first basic probability distribution function and the second basic probability distribution function are fused based on the D-S fusion rule, the method further improves the two functions using their credibility, through the following steps (see FIG. 3):
In an embodiment of the present application, it is determined whether the first basic probability distribution function and the second basic probability distribution function satisfy the improvement condition;
Preferably, the improvement condition is:

m: 2^θ → [0, 1], with m(∅) = 0 and Σ_{A⊆θ} m(A) = 1

where m is a basic probability distribution function on θ and θ = {H_1, H_2, H_3, H_4}; H_1 (correct): the presence of a mouse is correctly detected; H_2 (false alarm): a preset animal that is absent is identified as present; H_3 (missed report): an actually present mouse is not correctly identified; H_4 (misjudgment): another object is identified as a mouse; A is a proposition made up of one or more elements of the identification framework θ; m(A) represents the degree to which the evidence supports proposition A; and ∅ represents the empty set.
In the embodiment of the application, once the improvement condition is met, reliability improvement processing is performed on the first basic probability distribution function and the second basic probability distribution function to obtain the improved first basic probability distribution function and the improved second basic probability distribution function.
Preferably, the process of improving the first and second basic probability distribution functions specifically comprises the following steps:
The correlation between the first similarity matrix and the second similarity matrix is calculated using the Pearson correlation coefficient, giving the credibility similarity matrix s between the first similarity matrix and the second similarity matrix; the elements of the matrix s are calculated as:

s_ij(D, L) = cov(D, L) / (σ_D · σ_L) = E[(D − μ_D)(L − μ_L)] / (σ_D · σ_L)

with μ_D = E(D) and μ_L = E(L),

where D represents the first similarity matrix, L represents the second similarity matrix, cov(D, L) represents the covariance of the first and second similarity matrices, σ_D represents the standard deviation of the first similarity matrix, σ_L represents the standard deviation of the second similarity matrix, E represents the mathematical expectation, μ_D is the mathematical expectation of the first similarity matrix, and μ_L is the mathematical expectation of the second similarity matrix.
Based on the credibility similarity matrix, the credibility α_1 of the image data and the credibility α_2 of the waveform data are calculated, with α_1, α_2 ∈ [0, 1] and α_1 + α_2 = 1; both are computed from the elements s_ij(D, L) of the credibility similarity matrix, i.e. from the correlation coefficients between the first similarity matrix and the second similarity matrix.
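The sketch below shows the Pearson part of this step on two illustrative similarity matrices. Since the exact formula that turns the correlation coefficients into α_1 and α_2 is not reproduced in this text, the equal split used at the end is only a placeholder assumption.

```python
# Pearson correlation between the two similarity matrices (sketch).
import numpy as np

def pearson(D, L):
    d, l = np.asarray(D).ravel(), np.asarray(L).ravel()
    return float(np.corrcoef(d, l)[0, 1])

S_D = np.array([[1.0, 0.8, 0.6], [0.8, 1.0, 0.7], [0.6, 0.7, 1.0]])   # illustrative values
S_L = np.array([[1.0, 0.7, 0.5], [0.7, 1.0, 0.6], [0.5, 0.6, 1.0]])
s_DL = pearson(S_D, S_L)                 # agreement between the two evidence sources
alpha1, alpha2 = 0.5, 0.5                # placeholder split; the patent derives these from s_ij(D, L)
print("correlation:", s_DL, "alpha1:", alpha1, "alpha2:", alpha2)
```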
The credibility of the image data is multiplied by the first basic probability distribution function to obtain the improved first basic probability distribution function:

m̃_1(A) = α_1 · m_1(A)

The credibility of the waveform data is multiplied by the second basic probability distribution function to obtain the improved second basic probability distribution function:

m̃_2(A) = α_2 · m_2(A)

Subsequently, the improved first basic probability distribution function and the improved second basic probability distribution function are fused according to the fusion rule of the first embodiment; that is, the first basic probability distribution function m_1 of the first embodiment is replaced by the improved function m̃_1 and the second basic probability distribution function m_2 by the improved function m̃_2, giving the final result.
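A brief sketch of the reliability improvement and the subsequent re-fusion follows. The patent only states that each function is multiplied by its credibility; assigning the remaining mass to the whole frame θ (the classical Dempster-Shafer discounting convention) is an assumption made here so that the discounted assignments still sum to 1, and the mass values and credibilities are illustrative.

```python
# Reliability discounting of the BPAs and re-fusion with Dempster's rule (sketch).
def discount(m, alpha):
    # scale every singleton mass by the source credibility; the rest goes to the frame theta
    md = {h: alpha * v for h, v in m.items()}
    md["theta"] = 1.0 - alpha * sum(m.values())
    return md

def ds_combine_with_theta(m1, m2):
    hyps = [h for h in m1 if h != "theta"]
    K = sum(m1[a] * m2[b] for a in hyps for b in hyps if a != b)    # conflicting mass
    out = {h: m1[h] * m2[h] + m1[h] * m2["theta"] + m1["theta"] * m2[h] for h in hyps}
    out["theta"] = m1["theta"] * m2["theta"]
    return {h: v / (1 - K) for h, v in out.items()}

m1 = {"H1": 0.70, "H2": 0.10, "H3": 0.10, "H4": 0.10}   # illustrative BPAs
m2 = {"H1": 0.60, "H2": 0.15, "H3": 0.15, "H4": 0.10}
alpha1, alpha2 = 0.55, 0.45                              # illustrative credibilities
m = ds_combine_with_theta(discount(m1, alpha1), discount(m2, alpha2))
print(m)                                                 # improved fusion result over theta
```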
With this arrangement, both the first and the second basic probability distribution function are improved using the credibility, so that they carry higher credibility in the D-S fusion, which effectively improves the accuracy of the judgment.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of the invention, such changes and modifications are also intended to be within the scope of the invention.
Claims (7)
1. The substation animal intrusion detection method based on multi-sensor information fusion is characterized by comprising the following steps of:
collecting video data, image data and ultrasonic data obtained by a sensor;
performing image registration processing on the video data and the image data, and performing image data fusion on the registered video data and the registered image data to obtain fused image data;
extracting morphological features of the fused image data, constructing a first biological identification matrix by using the extracted morphological features, and calculating to obtain a first similarity matrix based on the first biological identification matrix;
the obtaining of the first similarity matrix comprises the following steps:
Respectively extracting morphological features of the fused image data with different time periods to obtain the morphological features with different time periods;
Constructing a first biological identification matrix by using the morphological features with the same time period to obtain the first biological identification matrix;
calculating the similarity between the first biological identification matrixes with different time periods by using a Gaussian kernel function to obtain a first similarity matrix;
wherein the morphological features extracted for each time period include at least the body contour, the color, and the ratio of tail length to body length;
carrying out waveform data fusion on the ultrasonic data to obtain fused waveform data;
The step of obtaining the fused waveform data comprises the following steps:
Denoising the ultrasonic data to obtain denoised waveform data;
respectively converting the de-noised waveform data of different sensors into a plurality of waveform matrixes;
Converting the plurality of waveform matrixes into a single waveform matrix, and drawing a time domain waveform diagram based on the single waveform matrix;
extracting time-frequency characteristics of the fused waveform data, constructing a second biological recognition matrix by using the extracted time-frequency characteristics, and calculating to obtain a second similarity matrix based on the second biological recognition matrix;
obtaining the second similarity matrix comprises the following steps:
respectively carrying out a plurality of transformation processes on the fused waveform data with different time periods to obtain a plurality of frequency spectrums, energy spectrums and power spectrums with different time periods;
Respectively extracting time-frequency characteristics of the frequency spectrum, the energy spectrum and the power spectrum with the same time period to obtain time-frequency characteristics;
Constructing a second biological recognition matrix for the time-frequency characteristics with the same time period to obtain the second biological recognition matrix;
calculating the similarity between the second biological recognition matrixes with different time periods by using a Gaussian kernel function to obtain a second similarity matrix;
the time-frequency characteristics extracted in each time period at least comprise amplitude, frequency and energy;
Mapping the first similarity matrix and the second similarity matrix to fuzzy attribute decision functions respectively to perform probability decision processing to obtain a first basic probability distribution function and a second basic probability distribution function;
and fusing the first basic probability distribution function and the second basic probability distribution function based on the D-S fusion rule to obtain a final detection result.
2. The method for detecting the invasion of the transformer substation animal based on the multi-sensor information fusion according to claim 1, wherein the step of performing image registration processing on video data and image data and performing image data fusion on the registered video data and image data to obtain fused image data specifically comprises the following steps:
Converting the acquired video data into picture data, and performing image registration on the converted picture data and the acquired image data to obtain registered image data;
And carrying out data fusion on registered image data of a plurality of sensors at the same time by using a generating countermeasure network.
3. The method for detecting the intrusion of the transformer substation animal based on the multi-sensor information fusion according to claim 1, wherein the probability judgment processing is performed by mapping the first similarity matrix and the second similarity matrix to the fuzzy attribute judgment function respectively, so as to obtain a first basic probability distribution function and a second basic probability distribution function, and the method specifically comprises the following steps:
Mapping each element in the first similarity matrix and the second similarity matrix to a fuzzy attribute judgment function respectively to carry out probability judgment processing, so as to obtain an attribute judgment value of each element in the first similarity matrix and the second similarity matrix;
Mapping a plurality of attribute judgment values of the first similarity matrix and the second similarity matrix into probability values by using an ambiguity function to obtain the probability of a judgment result;
And obtaining a first basic probability distribution function and a second basic probability distribution function based on the probability of the judgment result.
4. The method for detecting the intrusion of the animals in the transformer substation based on the information fusion of the multiple sensors according to claim 1, wherein,
Before fusing the first basic probability distribution function and the second basic probability distribution function based on the D-S fusion rule, the method further comprises the following steps:
Judging whether the first basic probability distribution function and the second basic probability distribution function meet the improvement condition or not;
and after the improvement condition is met, performing reliability improvement processing on the first basic probability distribution function and the second basic probability distribution function to obtain an improved first basic probability distribution function and an improved second basic probability distribution function.
5. The method for detecting the invasion of the animals in the transformer substation based on the multi-sensor information fusion according to claim 4, wherein the improvement condition is as follows:
m: 2^θ → [0, 1]

where m is the basic probability distribution function on θ, "2^θ → [0, 1]" means that the basic probability distribution function is a mapping from 2^θ to [0, 1], and θ = {H_1, H_2, H_3, H_4}; H_1 (correct): the presence of the preset animal is correctly detected; H_2 (false alarm): a preset animal that is absent is identified as present; H_3 (missed report): an actually present preset animal is not correctly identified; H_4 (misjudgment): another object is identified as the preset animal; A is a proposition made up of one or more elements of the identification framework θ; m(A) represents the degree to which the evidence supports proposition A; and ∅ represents the empty set.
6. The method for detecting the intrusion of the animals in the transformer substation based on the information fusion of the multiple sensors according to claim 4, wherein the reliability improvement processing is performed on the first basic probability distribution function and the second basic probability distribution function to obtain the improved first basic probability distribution function and the improved second basic probability distribution function, and the method specifically comprises the following steps:
Calculating the correlation between the first similarity matrix and the second similarity matrix by using the Pearson correlation coefficient to obtain a credible similarity matrix between the first similarity matrix and the second similarity matrix;
Calculating the credibility of the image data and the credibility of the waveform data based on the credibility similarity matrix;
multiplying the credibility of the image data by a first basic probability distribution function to obtain an improved first basic probability distribution function;
and multiplying the credibility of the waveform data by the second basic probability distribution function to obtain an improved second basic probability distribution function.
7. The method for detecting an intrusion into a substation animal based on multi-sensor information fusion according to claim 6, wherein, in calculating the credibility of the image data and the credibility of the waveform data based on the credibility similarity matrix, in this step,
the elements of the credibility similarity matrix s are given by:

s_ij(D, L) = cov(D, L) / (σ_D · σ_L) = E[(D − μ_D)(L − μ_L)] / (σ_D · σ_L)

where D represents the first similarity matrix, L represents the second similarity matrix, cov(D, L) represents the covariance of the first and second similarity matrices, σ_D represents the standard deviation of the first similarity matrix, σ_L represents the standard deviation of the second similarity matrix, E represents the mathematical expectation, μ_D is the mathematical expectation of the first similarity matrix, and μ_L is the mathematical expectation of the second similarity matrix;

and the credibility α_1 of the image data and the credibility α_2 of the waveform data are calculated from the elements s_ij(D, L) of the credibility similarity matrix, i.e. from the correlation coefficients between the first similarity matrix and the second similarity matrix.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311457127.1A | 2023-11-03 | 2023-11-03 | Substation animal intrusion detection method based on multi-sensor information fusion |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN117523605A | 2024-02-06 |
| CN117523605B | 2024-06-11 |
Family
- ID: 89754239

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311457127.1A | Substation animal intrusion detection method based on multi-sensor information fusion (active) | 2023-11-03 | 2023-11-03 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN117523605B (en) |
Citations (9)

| Publication number | Priority date | Publication date | Title |
|---|---|---|---|
| CN104574329A | 2013-10-09 | 2015-04-29 | Ultrasonic fusion imaging method and ultrasonic fusion imaging navigation system |
| CN105738779A | 2016-01-26 | 2016-07-06 | Partial discharge detection method based on multi-source image fusion |
| CN105930791A | 2016-04-19 | 2016-09-07 | Road traffic sign identification method with multiple-camera integration based on DS evidence theory |
| CN106101641A | 2016-07-18 | 2016-11-09 | Video frequency monitoring system and monitoring method thereof |
| CN106778883A | 2016-12-23 | 2017-05-31 | A kind of evidence theory intelligent patrol detection information fusion method based on fuzzy set |
| CN109086470A | 2018-04-08 | 2018-12-25 | A kind of method for diagnosing faults based on fuzzy preference relation and D-S evidence theory |
| CN113071518A | 2021-04-14 | 2021-07-06 | Automatic unmanned driving method, minibus, electronic equipment and storage medium |
| CN114372693A | 2021-12-31 | 2022-04-19 | Transformer fault diagnosis method based on cloud model and improved DS evidence theory |
| CN115291055A | 2022-07-27 | 2022-11-04 | Weight-adaptive acousto-optic collaborative diagnosis method for external insulation of power equipment |
Non-Patent Citations (4)

| Title |
|---|
| Gao CY, Zhang ML, Yang BJ. Mobile Manipulators' Object Recognition Method Based on Multi-sensor Information Fusion. Lecture Notes in Artificial Intelligence, 2008, vol. 5314, pp. 1099-1108. |
| Li Zhengjie. Coal mine safety monitoring based on immune danger theory and multi-sensor information fusion [in Chinese]. China Master's Theses Full-text Database, Engineering Science and Technology I, 2013-07-15, No. 07, B021-43. |
| Liu Guichao. Application of fuzzy association rules in ultrasonic flaw detection of weld seams [in Chinese]. Chemical Engineering & Equipment, 2016-01-15, No. 01. |
| He Yong, Jiang Hao, Fang Hui, Wang Yu, Liu Yufei. Research progress on intelligent vehicle obstacle detection methods and their agricultural applications [in Chinese]. Transactions of the Chinese Society of Agricultural Engineering, 2018-05-08, No. 09. |
Also Published As

| Publication Number | Publication Date |
|---|---|
| CN117523605A | 2024-02-06 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |