CN106714220B - WSN anomaly detection method based on MEA-BP neural network - Google Patents
WSN anomaly detection method based on MEA-BP neural network
- Publication number
- CN106714220B (application CN201710008709.XA)
- Authority
- CN
- China
- Prior art keywords
- sensor node
- node
- data
- neural network
- sub
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W24/00—Supervisory, monitoring or testing arrangements
- H04W24/04—Arrangements for maintaining operational condition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W84/00—Network topologies
- H04W84/18—Self-organising networks, e.g. ad-hoc networks or sensor networks
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Testing And Monitoring For Control Systems (AREA)
- Testing Or Calibration Of Command Recording Devices (AREA)
- Alarm Systems (AREA)
Abstract
The invention discloses a WSN anomaly detection method based on an MEA-BP neural network. Each distributed sensor node is initialized and begins acquiring data. The sensor nodes are spatially clustered with the K-means algorithm, yielding several clusters. The parameters of a BP neural network are then optimized with the mind evolutionary algorithm (MEA): the similartaxis and dissimilation operations optimize the network's weights and thresholds to obtain the best weights and thresholds, which are input to build the MEA-BP neural network model. Finally, using a distributed algorithm, each sensor node in a cluster performs anomaly detection independently; after detection, each node transmits its result to the cluster-head node of its cluster for further verification. The method improves the performance of the BP neural network algorithm, accelerates its learning rate, markedly raises the detection accuracy for abnormal data, and lowers the false-alarm rate.
Description
Technical field
The invention belongs to the field of wireless sensor network (WSN) data-reliability detection, and specifically relates to a WSN anomaly detection method based on an MEA-BP neural network.
Background art
A wireless sensor network (WSN) is a wireless ad-hoc network characterized by low energy consumption, flexible node deployment, little or no need for manual maintenance, and the ability to operate for long periods in harsh environments. Scattering sensor nodes over a target monitoring area to collect environmental data and monitor particular events is currently one of the most common applications. Because the resources of a wireless sensor node are limited, and because nodes are easily disturbed or damaged by external factors or affected by sudden environmental events, the data a node collects may deviate markedly from the environmental characteristics observed under normal conditions; such data are called abnormal data. Designing an effective anomaly detection method has therefore been a focus of WSN anomaly detection research in recent years.
In a traditional BP neural network, many parameters are selected during training without theoretical grounding, which limits the network in practice: learning is slow, fault tolerance is poor, and training can converge to a local minimum. Taking a WSN node used for environmental monitoring as an example, the temperature data it collects differ clearly from other data in the same sampling period in statistical features such as fluctuation amplitude and frequency, mean, median, and variance. Ignoring these differences between data types inevitably degrades the performance of a detection algorithm. To judge data anomalies more accurately, spatial correlation must be considered in addition to the temporal correlation of the data itself. Moreover, for anomaly detection on WSN environmental data, the BP neural network algorithm is prone to local optima, long training times, and low efficiency.
Summary of the invention
Object of the invention: to overcome the deficiencies of the prior art, the present invention provides a WSN anomaly detection method based on an MEA-BP neural network. To address the BP neural network algorithm's tendency to fall into local optima, its long training time, and its low efficiency, the BP neural network is improved with the mind evolutionary algorithm, which raises the algorithm's performance, accelerates the network's learning rate, markedly improves the detection accuracy for abnormal data, and lowers the false-alarm rate.
Technical solution: to achieve the above object, the MEA-BP neural network WSN anomaly detection method of the invention comprises the following steps:

S1: initialize each distributed sensor node; each sensor node begins acquiring data.

Let the number of sensor nodes be n, with each node denoted Xt_j (j = 1, 2, ..., n). Node Xt_j has a sliding window W_j of size m, so the measurement sequence of Xt_j over W_j is {S_{j,1}, S_{j,2}, ..., S_{j,m}}. The datum acquired by Xt_j at time t_p is S_{j,p}; each datum contains h attribute measurements, so S_{j,p} = (s_{j,p}^1, s_{j,p}^2, ..., s_{j,p}^h).
S2: spatially partition the sensor nodes into several clusters using the K-means algorithm.

If q+1 sensor nodes form one cluster, the cluster contains 1 cluster-head node Xt_c and q distributed nodes (Xt_1, Xt_2, ..., Xt_q).
S3: optimize the parameters of the BP neural network with the mind evolutionary algorithm: the similartaxis and dissimilation operations optimize the network's weights and thresholds to obtain the best weights and thresholds, which are input to build the MEA-BP neural network model.
S4: using a distributed algorithm, the sensor nodes (Xt_1, Xt_2, ..., Xt_q) in each cluster perform anomaly detection independently; after detection, each node transmits its result to the cluster-head node Xt_c of its cluster for further verification.
Further, step S2 comprises the following steps:

S21: first, arbitrarily select K sensor-node objects from the distributed sensor nodes in the target monitoring area as K cluster centres;

S22: then, for each sensor node other than a cluster centre, compute the similarity between that node and each of the K cluster centres, and find the cluster centre most similar to it;

S23: assign each sensor node to the cluster of its most similar cluster centre; once all sensor nodes have been assigned, K clusters are obtained;

S24: recompute the centre of each of the K clusters to obtain new cluster centres;

S25: recompute the similarity of every sensor node to the new cluster centres and return to step S22;

S26: when the recomputed cluster centres converge, terminate.
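Steps S21-S26 can be sketched as the following pure-Python loop; the node coordinates, the value of K, and the equality-based convergence test are illustrative assumptions, not values fixed by the method:

```python
import random

def kmeans(nodes, k, max_iter=100):
    """Cluster sensor-node coordinates (step S2); returns centres and clusters."""
    centres = random.sample(nodes, k)                      # S21: pick K nodes as initial centres
    clusters = [[] for _ in range(k)]
    for _ in range(max_iter):
        # S22/S23: assign every node to its nearest centre (squared Euclidean distance)
        clusters = [[] for _ in range(k)]
        for x in nodes:
            j = min(range(k), key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centres[c])))
            clusters[j].append(x)
        # S24: recompute each centre as the mean of its cluster
        new_centres = [
            tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centres[j]
            for j, cl in enumerate(clusters)
        ]
        # S26: stop once the centres converge
        if new_centres == centres:
            break
        centres = new_centres                              # S25: repeat with the new centres
    return centres, clusters
```

On well-separated node positions the loop recovers the spatial groups regardless of which nodes step S21 happens to draw as initial centres.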
Further, step S3 comprises the following steps: generate training data; determine the BP neural network topology; set the parameters of the mind evolutionary algorithm; randomly generate the initial population, the winning sub-populations, and the temporary sub-populations; perform the similartaxis operation on the sub-populations; perform the dissimilation operation on the sub-populations; check the termination condition, and if it is satisfied, output the best individual to obtain the best weights and thresholds, otherwise repeat the similartaxis and dissimilation operations.
Further, step S4 comprises the following steps:

S41: each sensor node (Xt_1, Xt_2, ..., Xt_q) performs anomaly detection independently using temporal correlation. At each current time, node Xt_j trains the neural network on the data {S_{j,1}, ..., S_{j,m}} in its sliding window W_j and forecasts the value at the next time step. The last v samples of the training data are chosen to compute the model residual S of the MEA-BP neural network by formula (1):

S = sqrt( (1/(v−1)) · Σ_{r=1}^{v} (E_r − F)² )    formula (1)

where E_r (r = 1, 2, ..., v) are the chosen sample data values and F is their mean.

S42: compute the confidence interval at the node's current time, [S_pre − t_{α/2,v−1}·S, S_pre + t_{α/2,v−1}·S], where S_pre is the MEA-BP network's prediction of the next datum and t_{α/2,v−1} comes from the t distribution with v−1 degrees of freedom; a suitable α is chosen and the t value read from the t distribution table.

S43: when the next datum S_new enters the node's sliding window, check whether S_new falls inside the confidence interval of the current time; if so, S_new is judged normal; otherwise S_new is judged abnormal.

S44: after anomaly detection, the sensor nodes (Xt_1, Xt_2, ..., Xt_q) transmit their detection results to the cluster-head node Xt_c of their cluster.

S45: the cluster-head node Xt_c verifies the cause of a node's abnormal data through spatial-correlation detection and a voting mechanism; the causes comprise event anomalies, node faults, and false alarms.
Further, in step S42, α = 0.05 is chosen.
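For concreteness, steps S41-S43 with α = 0.05 can be sketched as follows; the residual formula (1) is taken here as the sample standard deviation of the last v window values, and the critical value t_{0.025,9} ≈ 2.262 for v = 10 is read from a t table rather than computed, both of which are this sketch's assumptions:

```python
import math

T_CRIT = 2.262  # t_{alpha/2, v-1} for alpha = 0.05, v = 10, read from the t table (Fig. 1)

def model_residual(samples):
    """Formula (1): sample standard deviation of the last v values E_r
    around their mean F."""
    v = len(samples)
    mean = sum(samples) / v                      # F
    return math.sqrt(sum((e - mean) ** 2 for e in samples) / (v - 1))

def is_normal(s_new, s_pre, samples, t_crit=T_CRIT):
    """Step S43: S_new is normal iff it lies in [S_pre - t*S, S_pre + t*S]."""
    s = model_residual(samples)
    return s_pre - t_crit * s <= s_new <= s_pre + t_crit * s
```

Here `s_pre` is the MEA-BP network's forecast for the next instant and `samples` are the last v values of the node's training window.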
Further, the spatial-correlation detection and voting mechanism in step S45 comprise the following steps:

S451: compare the abnormal datum of the sensor node with the data of the other nodes in the same cluster; if q+1 sensor nodes form one cluster, the cluster contains 1 cluster-head node Xt_c and q distributed nodes (Xt_1, Xt_2, ..., Xt_q);

S452: preset an error threshold θ. Let the abnormal datum of the sensor node be S_T and the data of the other nodes in the same cluster be S_i (i = 1, 2, ..., q−1). For each i with |S_T − S_i| < θ, increment a counter NN initialized to 0, and record the final value of NN;

S453: if NN > (q−1)/2, judge the node's abnormal datum to be an event anomaly; if NN < (q−1)/2, judge it to be a node fault or a false alarm; if NN = (q−1)/2, select the reference node of the cluster, with datum S_CC: if |S_T − S_CC| ≤ θ, judge the abnormal datum to be an event anomaly, and if |S_T − S_CC| > θ, judge it to be a node fault or a false alarm. The reference node is the node in the cluster with the smallest Euclidean distance to the cluster centre;

S454: when step S453 judges the sensor node's abnormal datum to be a node fault or a false alarm, distinguish the two by temporal-correlation detection on the node: if the node produces abnormal data over consecutive time instants, its abnormal data are judged to be a node fault; if the node is abnormal at only a single instant and produces normal data at the other instants, its abnormal datum is judged to be a false alarm.
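A minimal sketch of the cluster-head vote of steps S451-S453; the majority thresholds (NN above or below half the number of compared peers) are this sketch's reading of the inequalities, and the readings passed in are illustrative:

```python
def verify_anomaly(s_t, peers, s_ref, theta):
    """Cluster-head verification (S451-S453). peers are the other nodes'
    readings in the same cluster, s_ref the reference node's reading."""
    nn = sum(1 for s_i in peers if abs(s_t - s_i) < theta)   # S452: count agreeing peers
    half = len(peers) / 2
    if nn > half:
        return "event"                                       # most peers see the same value
    if nn < half:
        return "fault_or_false_alarm"                        # the node disagrees with its cluster
    # tie: fall back on the node nearest the cluster centre (S453)
    return "event" if abs(s_t - s_ref) <= theta else "fault_or_false_alarm"
```

The "fault_or_false_alarm" outcome is then split by the temporal check of S454: consecutive abnormal instants indicate a fault, a single abnormal instant a false alarm.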
Beneficial effects: compared with the prior art, the present invention has the following advantages:

The traditional BP neural network algorithm is prone to local optima, long training times, and low efficiency, and struggles to meet detection requirements. The present invention improves the BP neural network with the mind evolutionary algorithm, raising the algorithm's performance, and, when handling multidimensional data, exploits both the temporal correlation within each sensor data stream and the spatial correlation between different nodes, which markedly raises the detection accuracy for abnormal data.

Compared with a conventional BP neural network, the method described here accelerates the network's learning rate through the optimized weights and thresholds, raising the detection rate and lowering the false-alarm rate.
Brief description of the drawings
Fig. 1 is a schematic diagram of the t distribution table.
Fig. 2 is a flow chart of the K-means clustering algorithm.
Fig. 3 is a flow chart of MEA optimization of the BP neural network weights and thresholds.
Fig. 4 is a flow chart of MEA-BP neural network anomaly detection.
Specific embodiment
The present invention will be further explained with reference to the accompanying drawing.
The invention proposes a WSN anomaly detection method based on an MEA-BP neural network. Before introducing the method, some definitions are given first:
1. Sensor network model: in a distributed sensor network, let the number of sensor nodes be n, with each node denoted Xt_j (j = 1, 2, ..., n).

2. Time-series data: a sequence of data generated in time order by a sensor node, characterized by fast variation and large, continuous arrival. Before building the detection model, a sliding-window mechanism is therefore introduced: the sliding window observes how the data change over the most recent period, and outlier detection is carried out inside the window.

3. Sliding-window model: a method for observing the time-series data of the most recent sampling period by keeping a fixed-length window over the sensor data; processing only the newly arriving and the departing data reduces the time complexity. In a distributed sensor network with n sensor nodes Xt_j (j = 1, 2, ..., n), node Xt_j has a sliding window W_j of size m, so the measurement sequence of Xt_j over W_j is {S_{j,1}, S_{j,2}, ..., S_{j,m}}; the datum acquired by Xt_j at time t_p is S_{j,p}, and each datum contains h attribute measurements, so S_{j,p} = (s_{j,p}^1, ..., s_{j,p}^h).

4. Detection rate: the ratio of the number of abnormal data samples detected by the algorithm to the actual total number of abnormal data samples.

5. False-alarm rate: the ratio of the number of normal data samples mistaken for abnormal by the algorithm to the total number of normal data samples.
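Definition 3 amounts to a fixed-length FIFO buffer; a minimal sketch using `collections.deque`, with window size m = 3 and the sample values chosen purely for illustration:

```python
from collections import deque

class SlidingWindow:
    """Fixed-length window W_j holding the latest m measurements of one node
    (definition 3); appending a new sample automatically drops the oldest."""
    def __init__(self, m):
        self.data = deque(maxlen=m)

    def push(self, sample):
        self.data.append(sample)

    def series(self):
        return list(self.data)

w = SlidingWindow(m=3)
for value in [20.1, 20.2, 20.0, 20.3]:
    w.push(value)
# after four pushes only the last three samples remain in the window
```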
Based on the temporal correlation of the sensor nodes, the present invention clusters the nodes in space with the K-means algorithm, grouping nodes with similar data into the same cluster, and then applies the MEA-BP neural network WSN anomaly detection method. The method divides into three main steps: parameter optimization, detection of abnormal data, and judgment of the cause of the anomaly. Its main features are: (1) the parameters of the BP neural network are optimized with the mind evolutionary algorithm, whose similartaxis and dissimilation operations optimize the network's weights and thresholds; (2) the detection phase mainly identifies possible outliers in the data stream collected by each sensor node; this step uses a distributed algorithm executed independently on each sensor node, after which each node transmits its result to the cluster-head node for further verification; (3) during detection, at each current time the neural network is trained on the historical data in the node's sliding window to forecast the next value, and the model residual of the network determines the confidence interval used to judge whether the next datum is abnormal; if the next datum falls inside the confidence interval it is judged normal, otherwise it is further verified by spatial correlation at the cluster-head node. The method comprises the following steps.

Initialize each distributed sensor node; each node begins acquiring data. Let the number of sensor nodes be n, with each node Xt_j (j = 1, 2, ..., n); node Xt_j has a sliding window W_j of size m, the measurement sequence of Xt_j over W_j is {S_{j,1}, ..., S_{j,m}}, and the datum acquired at time t_p is S_{j,p}, which contains h attribute measurements.

Spatially cluster the sensor nodes into several clusters with the K-means algorithm, grouping nodes with similar data into the same cluster to raise the spatial similarity within each cluster. This step is completed before node-fault detection; afterwards, temporal-correlation and spatial-correlation detection are run on the node data, and the spatial-correlation detection reuses the earlier spatial partition. A sensor node first runs temporal-correlation detection on the data acquired at the current time; when that detection flags a problem, the node reports the suspicious datum to the cluster-head node, which runs spatial-correlation detection on it. Referring to Fig. 2, the basic idea is: first arbitrarily select K objects from the sensor-node objects as cluster centres; for the remaining sensor nodes, assign each to the cluster of its most similar centre according to its similarity (Euclidean distance) to the selected centres; then recompute the centre of each new cluster (the mean of all objects in the cluster), and repeat this process until the cluster centres converge.
The K-means algorithm steps are as follows. Let the number of sensor nodes be p, with node coordinates {x^(1), x^(2), ..., x^(p)}, each x^(i) (i = 1, 2, ..., p) ∈ R.

Step 1: randomly select K sensor nodes as cluster centres, with centre coordinates u^(j) (j = 1, 2, ..., K) ∈ R; the K centres correspond to K clusters.

Step 2: repeat the following until the cluster centres converge:

For each sensor-node sample i, compute the cluster it should belong to:

c^(i) := argmin_j ||x^(i) − u^(j)||²    formula (2)

For each cluster j, recompute the cluster centre:

u^(j) := Σ_{i: c^(i)=j} x^(i) / |{i : c^(i) = j}|    formula (3)

In formula (2), u^(j) is one of the K cluster centres; by varying the parameter j (j = 1, 2, ..., K) the cost of each point c^(i) reaches its minimum, so c^(i) assigns sensor-node sample i to the cluster whose centre is nearest to it: every point is placed in the cluster of its nearest centre. In formula (3), the denominator is the number of samples in cluster j and the numerator is the sum of the coordinates of the sensor-node samples assigned to cluster j.

Several clusters are finally obtained; if q+1 sensor nodes form one cluster, the cluster contains 1 cluster-head node Xt_c and q distributed nodes (Xt_1, Xt_2, ..., Xt_q).
The parameters of the BP neural network are optimized with the mind evolutionary algorithm: the similartaxis and dissimilation operations optimize the network's weights and thresholds to obtain the best weights and thresholds, which are input to build the MEA-BP neural network model. The optimization process is shown in Fig. 3; the general steps are as follows.

Step 1: generate the initial mind-evolution population. N groups of numbers are generated at random in the solution space as the initial population; each group of numbers represents one individual (i.e. one neural network configuration) containing n elements, so each individual is a 1×n matrix and the population an N×n matrix; the population contains the individuals.

Step 2: according to the BP neural network topology, map the solution space to the coding space; each code corresponds to one solution of the problem, i.e. one individual, and the code length equals the number of elements in each individual. The code length n is

n = tL + wL + L + w    formula (4)

where t is the number of input nodes of the neural network, w the number of output nodes, and L the number of hidden-layer nodes; here t = 19, L = 20, w = 1 are chosen.

Step 3: define the number of generations iter, the number of winning sub-populations M, and the number of temporary sub-populations T. In each generation of the evolutionary process, all individuals form one population, which is divided into M winning sub-populations and T temporary sub-populations; each winning and temporary sub-population contains SG individuals, where

SG = N / (M + T)    formula (5)

Typical choices are iter = 10, M = T = 5, N = 200, SG = 20.
Step 4: determine the scoring function. The neural network consists of an input layer, a hidden layer, and an output layer. The input of the K-th training sample is A_K = (a_1^K, a_2^K, ..., a_t^K) (K = 1, 2, ..., P, where P is the number of training samples), giving an input matrix of size t×P; the desired network output is Y_K = (Y_1^K, Y_2^K, ..., Y_w^K), giving an output matrix of size w×P. Each sample input corresponds to one output, so the samples contain P input-output pairs. The inputs of the hidden-layer nodes are Z_K = (z_1, z_2, ..., z_L) and their outputs are B_K = (b_1, b_2, ..., b_L); the inputs of the output-layer nodes are Q_K = (q_1, q_2, ..., q_w) and their outputs are G_K = (g_1, g_2, ..., g_w). Define the input-to-hidden weights W_ij (i = 1, 2, ..., t, j = 1, 2, ..., L), the hidden-to-output weights V_ju (j = 1, 2, ..., L, u = 1, 2, ..., w), the hidden-layer thresholds {θ_j (j = 1, 2, ..., L)}, and the output-layer thresholds {β_u (u = 1, 2, ..., w)}. According to the coding rule, W_ij occupies elements 1 to L·t of a single individual (an L×t matrix); V_ju occupies elements L·t+1 to L·t+w·L (a w×L matrix); θ_j occupies elements L·t+w·L+1 to L·t+w·L+L; β_u occupies the elements from L·t+w·L+L+1 to the last element.

Compute the input Z_j of each hidden-layer node,

Z_j = Σ_{i=1}^{t} W_ij · a_i − θ_j, j = 1, 2, ..., L    formula (6)

and then compute the output {b_j} of each hidden-layer node from {Z_j} through the S-type activation function:

b_j = f(Z_j), j = 1, 2, ..., L    formula (7)

In formula (6), Σ W a is called the activation value of the model, the weighted sum of the model's inputs; in formula (7), f(·) is the model's activation function.

Then, from the hidden-layer outputs {b_j}, the weights V_ju, and the thresholds {β_u}, compute the input Q_u of each output-layer node, and from it the output {G_u} of each output-layer node through the S-type function:

Q_u = Σ_{j=1}^{L} V_ju · b_j − β_u, u = 1, 2, ..., w    formula (8)

G_u = f(Q_u), u = 1, 2, ..., w    formula (9)

The inverse of the mean squared error over the training samples is selected as the scoring function of each individual and sub-population:

val = 1 / ( (1/P) · Σ_{k=1}^{P} Σ_{u=1}^{w} (y_u^k − G_u^k)² )

where y^k is the desired output of the k-th training sample (a w×P matrix over all samples), G^k is the actual output value produced by the trained network (also a w×P matrix), and P is the number of training samples.
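Formulas (6)-(9) and the scoring function can be sketched as follows, assuming the "S type" function is the logistic sigmoid; the layer sizes and random weights in the usage below are illustrative, not the t = 19, L = 20, w = 1 configuration chosen above:

```python
import numpy as np

def sigmoid(x):
    """The 'S type' activation function f, assumed here to be the logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(a, W, theta, V, beta):
    """Formulas (6)-(9): hidden input Z = W a - theta, hidden output b = f(Z);
    output input Q = V b - beta, network output G = f(Q)."""
    b = sigmoid(W @ a - theta)      # formulas (6) and (7), W is L x t
    return sigmoid(V @ b - beta)    # formulas (8) and (9), V is w x L

def score(W, theta, V, beta, inputs, targets):
    """Score of one individual: the inverse of the mean squared error
    over the P training samples (step 4)."""
    mse = np.mean([np.mean((forward(a, W, theta, V, beta) - y) ** 2)
                   for a, y in zip(inputs, targets)])
    return 1.0 / mse
```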
Step 5: train the weights and thresholds. For the initial weights and thresholds, random numbers uniformly distributed on (−1, 1) are generated for the n elements of each individual, each individual being a 1×n matrix. Following the network computation rules, each individual's score is computed from the scoring function; the highest-scoring individuals are called winners, and the q highest-scoring individuals are selected as winners, with q = 10 chosen here.
Step 6: sub-population similartaxis. Within a sub-population, the process by which individuals compete to become the winner is called similartaxis; during the similartaxis of a sub-population, if no new winner is produced, the sub-population is said to be mature and its similartaxis process ends. New individuals are generated from a normal distribution centred on each winner, forming M winning sub-populations and T temporary sub-populations, each containing SG individuals. The distribution can be written N(u, Σ), where u is the centre vector of the normal distribution and Σ its covariance matrix; the centre of the distribution is the winner's coordinates, i.e. the winner's weights and thresholds.
Step 7: sub-population dissimilation. Dissimilation is the process by which the sub-populations compete to become the winner over the whole solution space. A global bulletin board records the score and maturity of each sub-population, and the sub-populations compete globally: if a temporary sub-population scores higher than some mature winning sub-population, the temporary sub-population replaces it, and the individuals of the former winning sub-population are discarded; if a mature temporary sub-population scores lower than every winning sub-population, that temporary sub-population is discarded and its individuals are released. Denote the number of discarded temporary sub-populations by Tr and the number of discarded winning sub-populations by Mr; guided by the global bulletin board, Mr + Tr new temporary sub-populations are regenerated in the solution space.
Step 8: decode the best individual. Steps 6 and 7 are repeated; when the iteration stopping condition is met, the mind evolutionary algorithm ends the optimization. The best individual found is then decoded according to the coding rule, yielding the best weights and thresholds of the corresponding BP neural network; the best weights and thresholds are input to build the MEA-BP neural network model.
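The similartaxis/dissimilation loop of steps 1-8 can be sketched generically as below. The population sizes, the σ of the normal distribution, and the toy score function used in the usage are illustrative stand-ins (the method itself uses iter = 10, M = T = 5, N = 200, SG = 20 and the inverse-MSE score of step 4); decoding the returned individual into W, V, θ, β then follows the coding rule of step 2:

```python
import numpy as np

def mea_optimize(score, n_dim, n_pop=60, m_win=3, t_tmp=3, sg=10, iters=10,
                 sigma=0.3, seed=0):
    """Mind-evolution sketch: winners seed sub-populations via a normal
    distribution (similartaxis); lower-scoring sub-populations are discarded
    and re-seeded in the solution space (dissimilation)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, size=(n_pop, n_dim))          # step 5: initial individuals
    scores = np.array([score(x) for x in pop])
    winners = pop[np.argsort(scores)[::-1][: m_win + t_tmp]]   # best M+T individuals
    for _ in range(iters):
        groups = []
        for w in winners:                                      # step 6: similartaxis,
            sub = w + sigma * rng.standard_normal((sg, n_dim)) # N(w, sigma) around each winner
            sub[0] = w                                         # keep the winner itself
            best = max(sub, key=score)
            groups.append((score(best), best))
        groups.sort(key=lambda g: g[0], reverse=True)          # step 7: dissimilation,
        winners = np.array([g[1] for g in groups])             # weaker sub-populations
        winners[m_win:] = rng.uniform(-1.0, 1.0, (t_tmp, n_dim))  # are re-seeded
    return winners[0]                                          # step 8: best individual
```

Because each winner is kept inside its own sub-population, the best score never decreases across generations.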
Referring to Fig. 4, using a distributed algorithm, the sensor nodes (Xt_1, Xt_2, ..., Xt_q) in each cluster perform anomaly detection independently; after detection, the nodes (Xt_1, Xt_2, ..., Xt_q) transmit their results to the cluster-head node Xt_c of the cluster for further verification, comprising the following steps:
S41: each sensor node (Xt_1, Xt_2, ..., Xt_q) performs anomaly detection independently using temporal correlation. At each current time, node Xt_j trains the neural network on the data {S_{j,1}, ..., S_{j,m}} in its sliding window W_j and forecasts the value at the next time step. The last v samples of the training data are chosen to compute the model residual S of the MEA-BP neural network by formula (1):

S = sqrt( (1/(v−1)) · Σ_{r=1}^{v} (E_r − F)² )    formula (1)

where E_r (r = 1, 2, ..., v) are the chosen sample data values and F is their mean.
S42: compute the confidence interval at the node's current time, [S_pre − t_{α/2,v−1}·S, S_pre + t_{α/2,v−1}·S], where S_pre is the MEA-BP network's prediction of the next datum and t_{α/2,v−1} comes from the t distribution with v−1 degrees of freedom; a suitable α is chosen and the t value read from the t distribution table shown in Fig. 1. The normal distribution, also called the Gaussian distribution, with mathematical expectation μ and variance σ², is denoted N(μ, σ²); the usual standard normal distribution is the normal distribution with μ = 0 and σ = 1. If a sample of size v is drawn from a normal population with mean μ and variance σ², the sample mean follows a normal distribution with mean μ and variance σ²/v. The population variance σ² is usually unknown and must be replaced by the sample variance s². When v is large, s² is a good estimate of σ² and the sampling distribution is still approximately standard normal; when v is small, s² may differ greatly from σ², so the sampling distribution is no longer a standard normal distribution but follows a t distribution. The t distribution is a family of curves varying with the degrees of freedom, symmetric about the vertical axis with mean 0, and it applies when the population standard deviation is unknown and the sample standard deviation is used in its place. The degrees of freedom of a t distribution are the number of values in a variable that are free to vary. A one-sample t test estimates only one parameter, the population mean: a sample of size v provides v pieces of information for estimating the population mean and its variability; one degree of freedom is consumed estimating the mean, and the remaining v−1 degrees of freedom estimate the variability, so the one-sample t test uses the t distribution with v−1 degrees of freedom. When no parameter needs to be estimated, the degrees of freedom are v. Usually α = 0.05 is chosen.
S43: when the next datum S_new enters the node's sliding window, check whether S_new falls inside the confidence interval of the current time; if so, S_new is judged normal; otherwise S_new is judged abnormal.

S44: after anomaly detection, the sensor nodes (Xt_1, Xt_2, ..., Xt_q) transmit their detection results to the cluster-head node Xt_c of their cluster.

S45: the cluster-head node Xt_c verifies the cause of a node's abnormal data through spatial-correlation detection and a voting mechanism; the causes comprise event anomalies, node faults, and false alarms. Specifically, this comprises the following steps:
S451: compare the abnormal datum of the sensor node with the data of the other nodes in the same cluster; if q+1 sensor nodes form one cluster, the cluster contains 1 cluster-head node Xt_c and q distributed nodes (Xt_1, Xt_2, ..., Xt_q);

S452: preset an error threshold θ. Let the abnormal datum of the sensor node be S_T and the data of the other nodes in the same cluster be S_i (i = 1, 2, ..., q−1). For each i with |S_T − S_i| < θ, increment a counter NN initialized to 0, and record the final value of NN;

S453: if NN > (q−1)/2, judge the node's abnormal datum to be an event anomaly; if NN < (q−1)/2, judge it to be a node fault or a false alarm; if NN = (q−1)/2, select the reference node of the cluster, with datum S_CC: if |S_T − S_CC| ≤ θ, judge the abnormal datum to be an event anomaly, and if |S_T − S_CC| > θ, judge it to be a node fault or a false alarm. The reference node is the node in the cluster with the smallest Euclidean distance to the cluster centre: the K-means spatial clustering groups nodes with similar data into the same cluster, and from the final clustering result the node nearest the cluster centroid in Euclidean distance is chosen as the reference node;

S454: when step S453 judges the sensor node's abnormal datum to be a node fault or a false alarm, distinguish the two by temporal-correlation detection on the node: if the node produces abnormal data over consecutive time instants, its abnormal data are judged to be a node fault; if the node is abnormal at only a single instant and produces normal data at the other instants, its abnormal datum is judged to be a false alarm.
For the abnormal data in detection node, each node can acquire data with some cycles, and formation belongs to the section
The data flow of point, in order to guarantee the correctness of regional nodes acquisition data sample, each node all needs before uploading data
The exceptional value in sliding window is replaced using the value of neural network prediction.
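The voting check of steps S451-S453 can be sketched in Python. This is a minimal illustration, not the patented implementation: the function name is hypothetical, and the majority/tie thresholds (comparing NN against half of the q-1 comparison nodes) are an assumption, since the exact inequalities are not legible in the source text.

```python
def spatial_vote(s_t, neighbors, s_cc, theta=0.5):
    """Classify a detected anomaly by comparing it with the other
    nodes of the same sub-cluster (steps S451-S453).

    s_t       -- anomalous reading of the detected node
    neighbors -- readings Si of the other distribution nodes in the cluster
    s_cc      -- reading of the reference node (closest to the cluster centre)
    theta     -- preset error threshold
    """
    # NN counts how many cluster members agree with the anomalous reading
    nn = sum(1 for s_i in neighbors if abs(s_t - s_i) < theta)
    half = len(neighbors) / 2
    if nn > half:      # majority agrees: a real event
        return "event"
    if nn < half:      # majority disagrees: node fault or misjudgement
        return "fault_or_misjudgement"
    # tie: fall back on the reference node
    return "event" if abs(s_t - s_cc) <= theta else "fault_or_misjudgement"

# a node reading 22.15 while its neighbours read ~20.5 is rejected by the vote
print(spatial_vote(22.15, [20.47, 20.51, 20.44, 20.49], 20.48))
```

A subsequent temporal check (step S454) would then separate a persistent fault from a one-off misjudgement.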
The data samples come from the Intel Berkeley Research Lab sensor network data set, in which readings are sampled once every 31 seconds. For sensor nodes 1-5, 100 groups of temperature and humidity readings are chosen as training data and 20 groups of temperature and humidity readings as prediction data.
S1 training=[20.6156,20.6254,20.6450,20.6352,20.6450,20.6156,20.6058,
20.5764,20.4882,20.4588,20.4392,20.4196,20.3804,20.3510,20.3020,20.2726,
20.2530,20.1942,20.1844,20.1354,20.0864,20.0668,20.0374,20.0178,19.9982,
19.9786,19.9688,19.8414,19.8022,19.8022,19.7826,19.8022,19.8218,19.8316,
19.8610,19.9002,19.9296,19.9786,19.9884,20.0080,20.0374,20.0472,20.1060,
20.1060,20.1256,20.1354,20.1550,20.2236,20.2432,20.2432,20.3020,20.3216,
20.3608,20.4588,20.4980,20.5176,20.5568,20.5666,20.5960,20.6254,21.4192,
21.4094,21.3996,21.3604,21.3212,21.3016,21.2624,21.2330,21.2232,21.2036,
21.1252,21.1252,21.1056,21.1056,21.0860,21.0860,21.0860,21.0762,21.0664,
21.0664,21.0468,21.0076,20.9880,20.9586,20.9684,20.9096,20.8998,20.8998,
20.8802,20.8704,20.8704,20.8704,20.8704,20.8802,20.8704,20.8802,20.8802,
20.8802,20.8704,20.8704]
[37.5737,37.6079,37.6422,37.6422,37.7107,37.7107,37.7792,37.8477,
38.0529,38.1213,38.1897,38.1897,38.3263,38.3946,38.4629,38.5311,38.5311,
38.7357,38.8039,38.8720,39.0082,39.0763,39.1443,39.1783,39.2123,39.2803,
39.3143,39.6200,39.7557,39.7896,39.7557,39.5521,39.5521,39.5521,39.4162,
39.4502,39.3143,39.2803,39.2123,39.1783,39.1443,39.1103,39.0082,39.0082,
38.9401,39.0082,38.9401,38.8379,38.8720,38.8039,38.7357,38.7698,38.7357,
38.6675,38.4970,38.3946,38.3946,38.3263,38.2580,38.2580,38.2239,
38.1555,37.9845,38.1897,38.1555,38.0529,37.9845,37.9161,37.8134,
37.8134,37.8819,37.8819,37.9161,37.9503,37.9503,37.9161,37.9503,37.9161,
37.8477,37.7792,37.7450,37.7450,37.7107,37.7107,37.7107,37.7107,37.6765,
37.6765,37.6422,37.8134,37.7792,37.7107,37.7107,37.6765,37.6765,37.7107,
37.7450,37.7792,37.9161,38.0529]
S1 prediction=[20.7724,20.7332,20.7528,20.7430,20.7332,20.7332,20.7234,
20.7136,20.7234,22.1530,20.7038,20.6940,20.6940,20.7038,20.6940,22.1530,
20.6744,20.6450,20.6352,20.6254][39.6200,39.9929,40.3652,40.9055,41.1414,
41.2761,41.3771,41.4780,41.7805,41.8812,41.9818,41.9483,42.2500,42.3840,
42.3170,52.3170,42.2500,42.2835,42.0824,41.9818]
S2 training=[20.6226,20.6224,20.6350,20.6352,20.6350,20.6256,20.6358,
20.5764,20.4782,20.4388,20.4492,20.4296,20.3604,20.3310,20.3320,20.2326,
20.2330,20.1042,20.1644,20.1354,20.0464,20.0468,20.0274,20.0178,19.9482,
19.9486,19.9588,19.8614,19.8322,19.8222,19.7926,19.8122,19.8518,19.8516,
19.8510,19.9502,19.9296,19.9386,19.9884,20.0080,20.0374,20.0472,20.1060,
20.1060,20.1236,20.1454,20.1350,20.2236,20.2332,20.2432,20.3020,20.3216,
20.3608,20.4588,20.4380,20.5376,20.5568,20.5666,20.5960,20.6354,20.6450,
20.7136,20.7038,20.6744,20.7528,20.8412,20.8506,20.8600,20.8796,20.8802,
20.8402,20.8704,20.8902,20.8704,20.8704,20.8704,20.8802,20.9096,20.9586,
20.9782,20.9380,21.0376,21.0076,21.0174,21.0076,21.0272,21.0468,21.0566,
20.9978,21.0370,21.0468,21.0366,21.0360,21.0368,21.0368,21.0634,21.0732,
21.0938,21.0360,21.1356]
[37.5337,37.6179,37.6222,37.6322,37.7207,37.7307,37.7492,37.8277,
38.0129,38.1113,38.1497,38.1597,38.3163,38.3546,38.4629,38.5311,38.5311,
38.7357,38.8039,38.8720,39.0082,39.0763,39.1443,39.1783,39.2123,39.2803,
39.3153,39.6200,39.7557,39.7896,39.7357,39.5521,39.5521,39.5521,39.4152,
39.4502,39.3143,39.2803,39.2123,39.1783,39.1443,39.1103,39.0032,39.0082,
38.9431,39.0042,38.8401,38.8479,38.8520,38.8039,38.7357,38.7698,38.7357,
38.6635,38.4970,38.3946,38.3946,38.3263,38.2380,38.2380,38.2339,38.1355,
37.9345,38.1897,38.1555,38.0529,37.9845,37.9161,37.8134,37.8134,37.8819,
37.8819,37.9161,37.9503,37.9503,37.9161,37.9503,37.9161,37.8477,37.7792,
37.7450,37.7450,37.7107,37.7107,37.7107,37.7107,37.6765,37.6765,37.6522,
37.8034,37.7592,37.6907,37.6807,37.6565,37.6865,37.7207,37.7550,37.7732,
37.9261,38.0629]
S2 prediction=(20.5670,20.5376,20.4788,20.4494,20.4984,20.4788,20.4592,
20.4494,20.4788,20.4690,20.4788,20.4592,20.4592,20.4494,20.4396,20.4396,
22.4396,20.4396,20.4396,20.3710)(39.4300,40.7863,40.4722,40.8955,41.3214,
41.4361,41.5671,41.6880,41.8205,41.9036,42.0142,42.0100,42.3200,42.4140,
42.3270,42.3370,52.3500,42.3825,42.1824,41.8918)
S3 training=[20.6156,20.6254,20.6450,20.6352,20.6450,20.6156,20.6058,
20.5764,20.4882,20.4588,20.4392,20.4196,20.3804,20.3510,20.3020,20.2726,
20.2530,20.1942,20.1844,20.1354,20.0864,20.0668,20.0374,20.0178,19.9982,
19.9786,19.9688,19.8414,19.8022,19.8022,19.7826,19.8022,19.8218,19.8316,
19.8610,19.9002,19.9296,19.9786,19.9884,20.0080,20.0374,20.0472,20.1060,
20.1060,20.1256,20.1354,20.1550,20.2236,20.2432,20.2432,20.3020,20.3216,
20.3608,20.4588,20.4980,20.5176,20.5568,20.5666,20.5960,20.6254,20.6450,
20.7136,20.7038,20.6744,20.7528,20.8312,20.8606,20.8900,20.9096,20.8802,
20.8802,20.8704,20.8802,20.8704,20.8704,20.8704,20.8802,20.9096,20.9586,
20.9782,20.9880,21.0076,21.0076,21.0174,21.0076,21.0272,21.0468,21.0566,
20.9978,21.0370,21.0468,21.0566,21.0860,21.0468,21.0468,21.0664,21.0762,
21.0958,21.0860,21.1056]
[37.5737,37.6079,37.6422,37.6422,37.7107,37.7107,37.7792,37.8477,
38.0529,38.1213,38.1897,38.1897,38.3263,38.3946,38.4629,38.5311,38.5311,
38.7357,38.8039,38.8720,39.0082,39.0763,39.1443,39.1783,39.2123,39.2803,
39.3143,39.6200,39.7557,39.7896,39.7557,39.5521,39.5521,39.5521,39.4162,
39.4502,39.3143,39.2803,39.2123,39.1783,39.1443,39.1103,39.0082,39.0082,
38.9401,39.0082,38.9401,38.8379,38.8720,38.8039,38.7357,38.7698,38.7357,
38.6675,38.4970,38.3946,38.3946,38.3263,38.2580,38.2580,38.2239,38.1555,
37.9845,38.1897,38.1555,38.0529,37.9845,37.9161,37.8134,37.8134,37.8819,
37.8819,37.9161,37.9503,37.9503,37.9161,37.9503,37.9161,37.8477,37.7792,
37.7450,37.7450,37.7107,37.7107,37.7107,37.7107,37.6765,37.6765,37.6422,
37.8134,37.7792,37.7107,37.7107,37.6765,37.6765,37.7107,37.7450,37.7792,
37.9161,38.0529]
S3 prediction=(20.4396,20.4102,20.4102,20.4004,20.3710,20.3612,20.3612,
20.3612,20.3612,20.3514,20.3710,20.3808,20.3416,20.3220,20.3220,20.3318,
20.3318,22.3220,20.3514,20.3416)(39.4200,40.7963,40.4422,40.8755,41.3614,
41.4161,41.5871,41.6780,41.8405,41.9236,42.0242,42.0200,42.3300,42.4240,
42.3170,42.3270,42.3400,52.3525,42.1624,41.8718)
S4 training=[20.4154,20.4254,20.4450,20.4352,20.4450,20.4154,20.4057,
20.5744,20.4772,20.4577,20.4392,20.4194,20.3704,20.3510,20.3020,20.2724,
20.2530,20.1942,20.1744,20.1354,20.0744,20.0447,20.0374,20.0277,19.9972,
19.9774,19.9477,19.7414,19.7022,19.7022,19.7724,19.7022,19.7217,19.7314,
19.7410,19.9002,19.9294,19.9774,19.9774,20.0070,20.0374,20.0472,20.1040,
20.1040,20.1254,20.1354,20.1350,20.2234,20.2432,20.2432,20.3020,20.3214,
20.3407,20.4577,20.4970,20.5174,20.5547,20.5444,20.5940,20.4254,20.4450,
20.7134,20.7037,20.4744,20.7527,20.7312,20.7404,20.7900,20.9094,20.7702,
20.7702,20.7704,20.7702,20.7704,20.7704,20.7704,20.7702,20.9094,20.9574,
20.9772,20.9770,21.0074,21.0074,21.0174,21.0074,21.0272,21.0447,21.0544,
20.9977,21.0370,21.0447,21.0544,21.0740,21.0447,21.0447,21.0444,21.0742,
21.0957,21.0840,21.1054]
[37.5737,37.4079,37.4422,37.4422,37.7107,37.7107,37.7792,37.8477,
38.0529,38.1213,38.1897,38.1897,38.3243,38.3944,38.4429,38.5311,38.5311,
38.7357,38.8039,38.8720,39.0082,39.0743,39.1443,39.1783,39.2123,39.2803,
39.3143,39.4200,39.7557,39.7894,39.7557,39.5521,39.5521,39.5521,39.4142,
39.4502,39.3143,39.2803,39.2123,39.1783,39.1443,39.1103,39.0082,39.0082,
38.9401,39.0082,38.9401,38.8379,38.8720,38.8039,38.7357,38.7498,38.7357,
38.4475,38.4970,38.3944,38.3944,38.3243,38.2580,38.2580,38.2239,
38.1555,37.9845,38.1897,38.1555,38.0529,37.9845,37.9141,37.8134,
37.8134,37.8819,37.8819,37.9141,37.9503,37.9503,37.9141,37.9503,37.9141,
37.8477,37.7792,37.7450,37.7450,37.7107,37.7107,37.7107,37.7107,37.4745,
37.4745,37.4422,37.8134,37.7792,37.7107,37.7107,37.4745,37.4745,37.7107,
37.7450,37.7792,37.9141,38.0529]
S4 prediction=(20.4394,20.4102,20.4788,20.4494,20.4984,20.4788,20.3412,
20.3412,20.3412,20.3514,20.3710,20.3808,20.3414,20.2240,20.2240,20.2338,
20.2240,20.3318,22.3514,20.3414)(39.4500,40.7343,40.4222,40.8555,41.4214,
41.5341,41.4471,41.4980,41.8805,41.9334,42.0142,42.0120,42.3100,42.4240,
42.3070,42.3470,42.3800,42.3925,52.2024,41.9118)
S5 training=[20.4154,20.4254,20.4450,20.4352,20.4450,20.4154,20.4058,
20.5744,20.4882,20.4588,20.4392,20.4194,20.3804,20.3510,20.3020,20.2724,
20.2530,20.1942,20.1844,20.1354,20.0844,20.0448,20.0374,20.0178,19.9982,
19.9784,19.9488,19.8414,19.8022,19.8022,19.7824,19.8022,19.8218,19.8314,
19.8410,19.9002,19.9294,19.9784,19.9884,20.0080,20.0374,20.0472,20.1040,
20.1040,20.1254,20.1354,20.1550,20.2234,20.2432,20.2432,20.3020,20.3214,
20.3408,20.4588,20.4980,20.5174,20.5548,20.5444,20.5940,20.4254,20.4450,
20.7134,20.7038,20.4744,20.7528,20.8312,20.8404,20.8900,20.9094,20.8802,
20.8802,20.8704,20.8802,20.8704,20.8704,20.8704,20.8802,20.9094,20.9584,
20.9782,20.9880,21.0074,21.0074,21.0174,21.0074,21.0272,21.0448,21.0544,
20.9978,21.0370,21.0448,21.0544,21.0840,21.0448,21.0448,21.0444,21.0742,
21.0958,21.0840,21.1054]
[37.5737,37.4079,37.4422,37.4422,37.7107,37.7107,37.7792,37.8477,
38.0529,38.1213,38.1897,38.1897,38.3243,38.3944,38.4429,38.5311,38.5311,
38.7357,38.8039,38.8720,39.0082,39.0743,39.1443,39.1783,39.2123,39.2803,
39.3143,39.4200,39.7557,39.7894,39.7557,39.5521,39.5521,39.5521,39.4142,
39.4502,39.3143,39.2803,39.2123,39.1783,39.1443,39.1103,39.0082,39.0082,
38.9401,39.0082,38.9401,38.8379,38.8720,38.8039,38.7357,38.7498,38.7357,
38.4475,38.4970,38.3944,38.3944,38.3243,38.2580,38.2580,38.2239,
38.1555,37.9845,38.1897,38.1555,38.0529,37.9845,37.9141,37.8134,
37.8134,37.8819,37.8819,37.9141,37.9503,37.9503,37.9141,37.9503,37.9141,
37.8477,37.7792,37.7450,37.7450,37.7107,37.7107,37.7107,37.7107,37.4745,
37.4745,37.4422,37.8134,37.7792,37.7107,37.7107,37.4745,37.4745,37.7107,
37.7450,37.7792,37.9141,38.0529]
S5 prediction=(20.4204,20.5002,20.4200,20.4200,20.3906,20.3612,20.3024,
20.2828,20.3122,20.2828,20.2828,20.2632,20.2436,20.2240,20.2240,20.2338,
20.2240,20.1848,20.1848,22.1848)(39.4050,40.7663,40.4532,40.8865,41.3314,
41.4261,41.5871,41.7080,41.7905,41.9236,42.0242,42.0180,42.3150,42.4040,
42.2970,42.3270,42.3700,42.3535,42.1924,51.9218)
The temperatures acquired by the sensor nodes are now used as samples. As shown in Figure 1, the spatial nodes are first divided by the K-means algorithm; assuming nodes 1-5 fall in the same cluster, parameter optimization is then carried out on the node data as shown in Figure 2, and the MEA-BP neural network model is established. A confidence interval is determined from the residual of the neural network model. Afterwards, when a node acquires new data, a prediction is made through the sliding window: if an acquired datum does not fall inside the confidence interval, it is judged to be an outlier and is transmitted to the cluster head for verification; otherwise, the acquired datum is normal.
The initial weights and thresholds of the neural network are random; the optimum individual is obtained by the optimization algorithm and then parsed according to the coding rule, splitting the elements of the individual into four parts: the input-to-hidden-layer weights W, the hidden-to-output-layer weights V, the hidden-layer thresholds θ and the output-layer thresholds β.
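The decoding described above can be sketched as follows. `decode_individual` is a hypothetical helper name, and the dimensions in the usage line are illustrative only; the patent's own network (from the sizes of V, θ and β below) uses a 20-node hidden layer with a single output.

```python
def decode_individual(ind, n_in, n_hid, n_out):
    """Split a flat MEA optimum individual into the four BP parameter
    blocks named in the coding rule: input-to-hidden weights W,
    hidden-to-output weights V, hidden thresholds theta and output
    thresholds beta."""
    i = 0
    # W: one row of n_hid weights per input node
    W = [ind[i + r * n_hid: i + (r + 1) * n_hid] for r in range(n_in)]
    i += n_in * n_hid
    V = ind[i: i + n_hid * n_out]      # hidden-to-output weights
    i += n_hid * n_out
    theta = ind[i: i + n_hid]          # hidden-layer thresholds
    i += n_hid
    beta = ind[i: i + n_out]           # output-layer thresholds
    return W, V, theta, beta

# a toy individual for a 2-input, 3-hidden, 1-output network (13 elements)
ind = list(range(2 * 3 + 3 * 1 + 3 + 1))
W, V, theta, beta = decode_individual(ind, 2, 3, 1)
```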
W=[0.5773, -0.0366, -0.5442,0.6235,1.1156,0.3768,0.3825,0.3136,
0.1267,-0.7622,-0.5179,1.0760,-0.7651,-0.6149,0.3441,0.9941,0.0842,0.9022,-
0.2135,0.0192;
1.0865,0.2526,0.6696,-0.5513,0.2104,0.2332,0.5453,-0.1723,1.3676,
0.3422,-0.3744,0.8603,0.0330,0.3656,-0.2953,0.0479,-0.3960,-0.6000,0.2272,-
0.7731;
-0.5749,0.5627,0.5373,1.0035,-0.9583,1.1374,0.9935,0.5132,0.5842,
0.4948,-0.5388,0.9441,0.3067,-0.5114,0.3832,-0.0877,-0.6106,-0.8577,-0.4521,
0.5040;
-0.1253,-0.9239,0.5082,0.5222,0.0151,0.0033,-0.2452,-0.3727,-0.7296,
1.0824,1.0505,-0.6283,-0.5399,0.4255,0.3935,-0.5921,-0.6411,-0.8547,0.5181,
0.7837;
-0.4296,-0.6532,0.1644,0.4362,1.2707,0.4877,-0.6715,0.9923,1.2893,
0.8293,0.9933,-0.6541,-0.1126,0.5300,-0.7672,0.0751,0.2299,-0.2186,0.8131,-
0.9794;
-0.9365,-0.2176,1.0960,0.0105,1.3308,-0.3746,-1.0590,-0.5488,-0.8550,
0.2305,0.4455,-0.2741,-0.0727,-0.8329,0.6602,0.6955,1.2929,-0.2030,0.6805,-
0.1515;
0.9026,0.4729,0.4579,0.0247,-0.3627,-0.1774,0.3376,0.2647,-0.2081,-
0.9794,0.8114,0.0375,-0.1698,0.1191,-0.1169,0.3339,0.4083,-0.3918,-0.3013,
0.8716;
-0.1600,0.5712,-0.4983,-0.5316,-0.4528,-0.1343,0.6454,-0.6407,
0.9763,-0.3584,-0.5402,0.1329,0.7943,0.3628,-0.3161,0.5328,0.5403,0.7219,-
0.3242,0.4581;
-0.0458,-0.7775,0.1541,-0.0951,-0.5324,0.1833,-0.6400,-0.1086,-
0.8386,-0.1227,-1.1306,0.3747,-1.2653,-0.4936,-0.1682,0.7071,-0.3657,1.0828,
0.5700,1.1866;
-0.5898,-0.7801,0.5664,0.0111,0.3069,0.1700,0.3791,-0.3305,0.3633,
0.3725,0.0423,-0.0695,0.2999,0.9656,0.6382,0.1977,-0.4587,0.2812,-0.5573,-
0.3800;
0.9481,-0.0495,0.6783,-0.5981,-0.4970,0.8797,-0.9796,-0.3040,0.9687,-
1.1954,0.3528,-0.1131,-0.1752,-0.8480,0.0283,-0.1791,1.1680,0.1180,1.0543,-
0.7890;
0.9876,-0.5693,0.2064,1.0028,-0.7505,-0.0915,0.8364,0.0396,1.3324,
0.6910,0.4675,0.2987,-0.7527,-0.5169,-0.8266,1.0537,0.9445,-0.1559,-0.2290,
0.7849;
0.7757,0.0837,0.6734,-1.0255,0.2378,0.4449,0.5047,0.0004,0.4182,
0.1178,0.5781,-0.2788,0.5544,-0.6854,0.2246,-0.9034,-0.5780,-1.0345,-1.0602,-
0.5880;
-0.9348,-0.4230,-0.7827,-0.9307,-0.1429,-0.1718,0.2480,0.8521,-
0.9502,0.8372,0.5072,-0.5041,0.0417,-0.0333,0.1855,-0.1131,0.7978,1.2911,
0.3152,0.5734;
-0.4313,0.3686,-0.4494,-0.2113,-0.8895,-0.6692,0.8727,0.5028,-
1.2660,-0.1608,-1.1177,0.9249,-0.7744,-0.8300,-0.6689,-0.6472,0.5649,-
0.0527,-0.2023,0.7030;
0.8426,0.8520,0.6722,-0.9511,-0.7137,-0.6196,0.1978,0.5044,-0.6855,-
0.3957,-1.1713,0.8359,0.2987,-0.3332,-0.7046,-0.0307,0.7013,-1.0424,-0.2173,-
0.8635;
0.4429,-0.5530,0.2594,-0.0055,-0.2293,-0.1312,0.1046,-1.1803,0.6552,
1.1167,0.2293,0.8949,0.2872,0.0639,-0.7069,-0.6443,-0.5359,-0.7943,0.0749,
0.2322;
0.1580,0.3637,-0.3663,0.3326,0.5618,-0.0908,0.6683,-0.1572,0.0053,-
0.6843,-1.1245,0.6485,-0.4312,-0.7010,-0.1113,0.7301,-0.0746,0.0622,0.1535,-
0.0938;
-1.1087,-0.0046,-0.9751,-0.0265,1.1469,-0.7158,0.6888,-0.5601,0.5212,
0.6409,-0.5567,1.1806,-0.1444,0.5397,-0.5531,0.5221,-0.5053,-1.2739,-0.4797,
0.3794]
V=[-0.8852,-0.4987,-1.0526,0.3578,-0.0436,-0.6403,1.1844,-1.2437,-0.9961,
-0.1289,-0.8779,-0.0577,-0.5220,0.4834,0.3050,0.5123,0.4276,-0.0404,
0.9207,0.0199]
θ=[-0.3454,-0.0724,-0.9978,-1.0797,1.0350,0.2819,0.8036,0.2719,
0.2332,-0.6689,0.9885,-0.7912,0.1800,-0.6851,-0.7595,0.2428,-0.1237,-0.2353,
-0.0445,0.4698]
β=[-1.2285]
The confidence interval is then calculated. A predicted value of node 1 at one moment is chosen at random, with value 20.8692. The last 40 items of the training data are taken and the model residual is computed by formula (1), giving a residual value S = 0.1776; the t value obtained from the t distribution table is 2.02108, and the interval bounds at this moment, computed by formula (2), are [20.5058, 21.2326].
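These bounds can be reproduced numerically. The sketch below assumes formula (2) takes the usual prediction-interval form Spre ± t(a/2,v-1)·S·sqrt(1+1/v); with S = 0.1776, t = 2.02108 and v = 40 this reproduces the quoted interval exactly.

```python
import math

def prediction_interval(pred, s, t, v):
    """Interval bounds assumed for formula (2):
    pred +/- t * s * sqrt(1 + 1/v)."""
    half = t * s * math.sqrt(1 + 1 / v)   # interval half-width
    return pred - half, pred + half

# the worked example: predicted value 20.8692, S = 0.1776, t = 2.02108, v = 40
lo, hi = prediction_interval(20.8692, 0.1776, 2.02108, 40)
print(round(lo, 4), round(hi, 4))  # 20.5058 21.2326
```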
Suppose node S1 has trained the MEA-BP model and new data are added through the sliding window. As shown above, when the sliding window predicts the 1st datum, the predicted temperature is 20.8692 and the confidence interval is [20.5058, 21.2326]; the sensor reading at that moment is 20.7724, which clearly lies inside the interval, so the datum at that moment is considered normal. When the sliding window predicts the 10th datum, the predicted temperature is 20.7062 and the confidence interval is [20.3428, 21.0696]; the sensor reading at that moment is 22.1530, which lies outside the interval, so it is passed to the cluster-head node for spatial verification. If the voting mechanism finds that the temperatures within the same cluster at that moment are not abnormal, the temperature datum acquired by node 1 is abnormal and the node has faulted or raised a false alarm; if similar data are found within the cluster, an event is considered to have occurred. Anomalies in the temperature and humidity data of the other nodes can be accurately recognized in the same way.
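The per-node detection loop in this example can be sketched as follows. `predict` stands in for the trained MEA-BP model (the usage below substitutes a plain window mean, purely for illustration), and the interval half-width is assumed to take the prediction-interval form used above.

```python
import math

def detect_stream(readings, predict, s, t, v, window=10):
    """Slide over incoming readings and flag those falling outside the
    prediction interval built around the model's forecast.

    predict -- maps the current window contents to a forecast of the
               next reading (stand-in for the trained MEA-BP model)
    s, t, v -- model residual, t value and sample count, as in the text
    """
    half = t * s * math.sqrt(1 + 1 / v)     # interval half-width
    win = list(readings[:window])
    flags = []
    for x in readings[window:]:
        pred = predict(win)
        flags.append(abs(x - pred) > half)  # True -> reported to cluster head
        win = win[1:] + [x]                 # advance the sliding window
    return flags

# ten steady readings, then a small fluctuation and an injected outlier
flags = detect_stream([20.0] * 10 + [20.1, 25.0],
                      lambda w: sum(w) / len(w),
                      s=0.1776, t=2.02108, v=40)
```

Only the second new reading (25.0) breaches the interval and would be forwarded for spatial verification.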
Based on the temporal and spatial correlation of the sensor nodes, the method of the present invention divides the nodes with the K-means algorithm and then establishes an MEA-BP neural network model on each node; at each subsequent moment, the model detects whether the data arriving at each distribution node are abnormal, and the detection result is then transmitted to the cluster-head node for verification. Anomaly detection is currently a deeply studied problem in many fields, and the unique characteristics and strict resource constraints of wireless sensor networks make its study all the more challenging. The method proposed by the present invention targets complex anomaly detection in resource-constrained wireless sensor networks: it greatly reduces the communication cost between nodes, detects abnormal data accurately, and adapts well to varied environments.
The above is only a preferred embodiment of the present invention. It should be pointed out that those skilled in the art may make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (6)
1. A WSN anomaly detection method based on an MEA-BP neural network, characterized in that the method comprises the following steps:
S1: initialize each distribution sensor node, and each sensor node starts to acquire data;
let the number of sensor nodes be n and each sensor node be Xtj (j = 1, 2, …, n); the sliding window of sensor node Xtj is Wj, and the sliding window size of every sensor node is m, so that sensor node Xtj holds a sequence of measurement data on its sliding window Wj; the datum acquired by sensor node Xtj at moment tp contains h attribute measurement values;
S2: perform spatial sub-clustering on the sensor nodes with the K-means algorithm to obtain several clusters:
if q+1 sensor nodes form one cluster, each cluster contains 1 cluster-head node Xtc and q distribution nodes (Xt1, Xt2, …, Xtq);
S3: perform parameter optimization on the BP neural network with the mind evolutionary algorithm, optimizing the weights and thresholds of the BP neural network through the similartaxis and dissimilation operations to obtain the optimal weights and optimal thresholds; input the optimal weights and optimal thresholds to establish the MEA-BP neural network model;
S4: using a distributed algorithm, anomaly detection is executed independently on the sensor nodes (Xt1, Xt2, …, Xtq) of each sub-cluster; after anomaly detection, the sensor nodes (Xt1, Xt2, …, Xtq) transmit the detection result to the cluster-head node Xtc of their sub-cluster for further verification.
2. The WSN anomaly detection method based on the MEA-BP neural network according to claim 1, characterized in that step S2 comprises the following steps:
S21: first, arbitrarily select K sensor node objects from the distribution sensor node objects as K cluster centres;
S22: then, for each sensor node object other than the cluster centres, compute the similarity between the sensor node object and each of the K cluster centres, and find the cluster centre most similar to the sensor node object;
S23: assign each sensor node object to the cluster whose centre is most similar to it; K clusters are obtained once all sensor nodes have been assigned;
S24: recompute the cluster centres of these K clusters to obtain new cluster centres;
S25: when the new cluster centres have converged, terminate the operation; otherwise go to S26;
S26: return to step S22 and recompute the similarity of each sensor node to the new cluster centres.
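Steps S21-S26 can be sketched for two-dimensional node data; the function below is an illustrative re-implementation of standard K-means, not the patented code, and uses squared Euclidean distance as the similarity measure.

```python
import random

def kmeans(points, k, iters=20):
    """K-means sub-clustering of 2-D sensor-node data (steps S21-S26):
    pick K centres, assign every node to its most similar (nearest)
    centre, recompute the centres, and repeat until convergence."""
    centres = random.sample(points, k)           # S21: K initial centres
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                         # S22-S23: nearest centre
            idx = min(range(k),
                      key=lambda i: (p[0] - centres[i][0]) ** 2
                                    + (p[1] - centres[i][1]) ** 2)
            clusters[idx].append(p)
        # S24: recompute each cluster centre (keep old centre if empty)
        new = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centres[i]
               for i, cl in enumerate(clusters)]
        if new == centres:                       # S25: converged, stop
            break
        centres = new                            # S26: go round again
    return centres, clusters

centres, clusters = kmeans([(0, 0), (0, 1), (1, 0),
                            (10, 10), (10, 11), (11, 10)], 2)
```

The two spatially separated groups of nodes end up in two different clusters.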
3. The WSN anomaly detection method based on the MEA-BP neural network according to claim 1, characterized in that step S3 comprises the following steps: generate training data; determine the BP neural network topology; set the parameters of the mind evolutionary algorithm; randomly generate the initial population, the winning sub-populations and the temporary sub-populations; perform the similartaxis operation on the sub-populations; perform the dissimilation operation on the sub-populations; judge whether the termination condition is met: if it is, output the optimum individual and obtain the optimal weights and thresholds, otherwise carry out similartaxis and dissimilation again.
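The similartaxis/dissimilation loop of step S3 can be sketched in simplified form. This is a deliberately reduced toy version of the mind evolutionary algorithm: the population sizes and reseeding rule are arbitrary assumptions, and in MEA-BP the fitness function would score a decoded weight/threshold vector by the BP network's training error rather than the toy objective used here.

```python
import random

def mea_optimize(fitness, dim, n_sub=5, sub_size=10, iters=30, sigma=0.3):
    """Toy mind evolutionary algorithm: several sub-populations search
    locally around their best individual (similartaxis), and the weakest
    sub-population is re-seeded at random (dissimilation)."""
    centres = [[random.uniform(-1, 1) for _ in range(dim)]
               for _ in range(n_sub)]
    for _ in range(iters):
        # similartaxis: each sub-population samples around its centre and
        # keeps the winning individual
        for i, c in enumerate(centres):
            group = [[g + random.gauss(0, sigma) for g in c]
                     for _ in range(sub_size)]
            centres[i] = max(group + [c], key=fitness)
        # dissimilation: discard the losing centre and re-seed it globally
        worst = min(range(n_sub), key=lambda i: fitness(centres[i]))
        centres[worst] = [random.uniform(-1, 1) for _ in range(dim)]
    return max(centres, key=fitness)   # the optimum individual

# toy fitness peaking at 0.5 in every dimension
best = mea_optimize(lambda v: -sum((x - 0.5) ** 2 for x in v), dim=3)
```

The returned optimum individual would then be decoded into the network's weights and thresholds as described in the examples above.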
4. The WSN anomaly detection method based on the MEA-BP neural network according to claim 1, characterized in that step S4 comprises the following steps:
S41: anomaly detection is executed independently on the sensor nodes (Xt1, Xt2, …, Xtq) through temporal correlation: at each current moment, the data in the sliding window Wj of sensor node Xtj are used to train the neural network and to forecast the datum of the next moment; the last v sample data of the training data are chosen, and the model residual S of the MEA-BP neural network is calculated by formula (1):
S = sqrt((1/(v-1))·Σ(Er-F)²)  (1)
where the sum runs over r = 1, 2, …, v, Er is the chosen sampled data value, and F is the statistical average of the chosen samples, F = (1/v)·ΣEr;
S42: calculate the confidence interval of the sensor node at the current moment, which is
[Spre - t(a/2,v-1)·S·sqrt(1+1/v), Spre + t(a/2,v-1)·S·sqrt(1+1/v)]  (2)
where Spre is the predicted value of the MEA-BP neural network for the next moment and t(a/2,v-1) is the t distribution with v-1 degrees of freedom; a suitable value of a is chosen in the t distribution table to obtain the t value;
S43: when the next datum Snew enters the sensor node's sliding window, judge whether Snew falls within the confidence interval of the current moment; if so, Snew is judged to be normal data; otherwise, Snew is judged to be abnormal data;
S44: after anomaly detection, the sensor nodes (Xt1, Xt2, …, Xtq) transmit the detection result to the cluster-head node Xtc of the sub-cluster;
S45: the cluster-head node Xtc verifies the cause of the sensor node's abnormal data through spatial-correlation detection and an introduced voting mechanism; the causes include event anomaly, node fault anomaly and misjudgement.
5. The WSN anomaly detection method based on the MEA-BP neural network according to claim 4, characterized in that in step S42, a = 0.05 is chosen.
6. The WSN anomaly detection method based on the MEA-BP neural network according to claim 4, characterized in that the spatial-correlation detection and the introduced voting mechanism in step S45 comprise the following steps:
S451: compare the abnormal datum of the sensor node with the data of the other nodes in the same sub-cluster; if q+1 sensor nodes form one sub-cluster, each sub-cluster contains 1 cluster-head node Xtc and q distribution nodes (Xt1, Xt2, …, Xtq);
S452: preset an error threshold θ; let the abnormal datum of the sensor node be ST and the data of the other nodes in the same sub-cluster be Si (i = 1, 2, …, q-1); whenever |ST-Si| < θ, a counter NN with initial value 0 is incremented by 1, and the final value of NN is counted;
S453: if NN > (q-1)/2, the abnormal datum of the sensor node is judged to be caused by an event anomaly; if NN < (q-1)/2, it is judged to be caused by a node fault or a misjudgement; if NN = (q-1)/2, a reference node in the sub-cluster is chosen; let the datum of the reference node be SCC: if |ST-SCC| ≤ θ, the sensor node's abnormal datum is judged to be caused by an event anomaly, and if |ST-SCC| > θ, by a node fault or a misjudgement; the reference node is the node in the sub-cluster with the smallest Euclidean distance to the cluster centre node;
S454: when step S453 judges that the abnormal datum of the sensor node is caused by a node fault or a misjudgement, the two are further distinguished by the following steps: carry out temporal-correlation detection on the sensor node; if the sensor node produces abnormal data over a continuous period of time, the abnormal datum is judged to be caused by a node fault; if the sensor node is abnormal at only a single moment while the data produced at the other moments are normal, the abnormal datum is judged to be caused by a misjudgement.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710008709.XA CN106714220B (en) | 2017-01-06 | 2017-01-06 | One kind being based on MEA-BP neural network WSN method for detecting abnormality |
PCT/CN2017/119421 WO2018126984A2 (en) | 2017-01-06 | 2017-12-28 | Mea-bp neural network-based wsn abnormality detection method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710008709.XA CN106714220B (en) | 2017-01-06 | 2017-01-06 | One kind being based on MEA-BP neural network WSN method for detecting abnormality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106714220A CN106714220A (en) | 2017-05-24 |
CN106714220B true CN106714220B (en) | 2019-05-17 |
Family
ID=58907069
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710008709.XA Expired - Fee Related CN106714220B (en) | 2017-01-06 | 2017-01-06 | One kind being based on MEA-BP neural network WSN method for detecting abnormality |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106714220B (en) |
WO (1) | WO2018126984A2 (en) |
CN116109176B (en) * | 2022-12-21 | 2024-01-05 | 成都安讯智服科技有限公司 | Alarm anomaly prediction method and system based on collaborative clustering
CN116257892B (en) * | 2023-05-09 | 2023-08-29 | 广东电网有限责任公司佛山供电局 | Data privacy security verification method for digital archives |
CN116405368B (en) * | 2023-06-02 | 2023-08-22 | 南京信息工程大学 | Network fault diagnosis method and system under high-dimensional unbalanced data condition |
CN117093947B (en) * | 2023-10-20 | 2024-02-02 | 深圳特力自动化工程有限公司 | Power generation diesel engine operation anomaly monitoring method and system
CN117349779B (en) * | 2023-12-04 | 2024-02-09 | 水利部交通运输部国家能源局南京水利科学研究院 | Method and system for judging potential sliding surface of deep-excavation expansive soil channel side slope |
CN117892095B (en) * | 2024-03-14 | 2024-05-28 | 山东泰开电力电子有限公司 | Intelligent detection method for faults of heat dissipation system for energy storage system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103916896A (en) * | 2014-03-26 | 2014-07-09 | 浙江农林大学 | Anomaly detection method based on multi-dimensional Epanechnikov kernel density estimation |
CN105791051A (en) * | 2016-03-25 | 2016-07-20 | 中国地质大学(武汉) | WSN (Wireless Sensor Network) anomaly detection method and system based on artificial immunity and k-means clustering
CN106447092A (en) * | 2016-09-12 | 2017-02-22 | 浙江工业大学 | Marine reverse osmosis desalination system performance prediction method based on MEA-BP neural network |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103747537B (en) * | 2014-01-15 | 2017-05-03 | 广东交通职业技术学院 | Adaptive detection method for wireless sensor network outlier data based on entropy measure
CN105764162B (en) * | 2016-05-10 | 2019-05-17 | 江苏大学 | Wireless sensor network anomaly detection method based on multi-attribute association
CN106714220B (en) * | 2017-01-06 | 2019-05-17 | 江南大学 | WSN anomaly detection method based on MEA-BP neural network
- 2017-01-06: CN application CN201710008709.XA filed; granted as CN106714220B (status: not active, Expired - Fee Related)
- 2017-12-28: WO application PCT/CN2017/119421 filed; published as WO2018126984A2 (status: active, Application Filing)
Non-Patent Citations (1)
Title |
---|
Anomaly data detection method for wireless sensor networks based on neural networks; Hu Shi et al.; Computer Science (《计算机科学》); 2014-11-30; Vol. 41, No. 11A; pp. 208-211
Also Published As
Publication number | Publication date |
---|---|
CN106714220A (en) | 2017-05-24 |
WO2018126984A3 (en) | 2018-09-13 |
WO2018126984A2 (en) | 2018-07-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106714220B (en) | WSN anomaly detection method based on MEA-BP neural network | |
CN102340811B (en) | Method for carrying out fault diagnosis on wireless sensor networks | |
CN108400895A (en) | BP neural network security situation assessment algorithm improved by a genetic algorithm | |
CN101478534B (en) | Network exception detecting method based on artificial immunity principle | |
CN109525956B (en) | Energy-saving data collection method based on data-driven clustering in wireless sensor network | |
CN110133610A (en) | Ultra-wideband radar action recognition method based on time-varying range-Doppler maps | |
CN101477374B (en) | Continuous casting bleed-out time sequence spacing combined diagnosis prediction method based on fuzzy neural network | |
CN109714324B (en) | User network abnormal behavior discovery method and system based on machine learning algorithm | |
CN108601026B (en) | Perception data error attack detection method based on random sampling consistency | |
CN109872346A (en) | Target tracking method supporting recurrent neural network adversarial learning | |
CN106407998A (en) | Probability time-varying seawater hydraulic pump fault prediction method | |
CN103942568A (en) | Classification method based on unsupervised feature selection | |
CN104598968A (en) | Fault diagnosis method of transformer | |
CN111191559A (en) | Overhead line early warning system obstacle identification method based on time convolution neural network | |
CN109902564A (en) | Anomaly detection method based on a structural-similarity sparse autoencoder network | |
CN105139029A (en) | Activity recognition method and activity recognition device for persons serving sentences | |
CN108415884B (en) | Real-time tracking method for structural modal parameters | |
CN109327480A (en) | Multi-step attack scenario mining method based on neural networks and Bayesian network attack graphs | |
CN109471847A (en) | I/O congestion control method and control system | |
CN108133090A (en) | Reliability analysis method for high-end complex equipment driven by reliability sensitivity | |
CN107132515A (en) | Plot filtering method based on multidimensional information constraints | |
CN101237357B (en) | Online failure detection method for industrial wireless sensor network | |
CN113780432B (en) | Intelligent detection method for operation and maintenance abnormity of network information system based on reinforcement learning | |
CN110705693A (en) | Unmanned aerial vehicle abnormal behavior recognition module and recognition method thereof | |
Bo et al. | Recognition of control chart patterns in auto-correlated process based on random forest |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20190517; Termination date: 20220106 ||