CN102665294B - Vehicular sensor networks (VSN) event region detection method based on Dempster-Shafer (D-S) evidence theory - Google Patents

Vehicular sensor networks (VSN) event region detection method based on Dempster-Shafer (D-S) evidence theory

Info

Publication number
CN102665294B
CN102665294B (granted); application CN201210125372.8A
Authority
CN
China
Prior art keywords
event
sub cell
node
coordinate
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210125372.8A
Other languages
Chinese (zh)
Other versions
CN102665294A (en)
Inventor
曾园园
李德识
项慨
曾子明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201210125372.8A priority Critical patent/CN102665294B/en
Publication of CN102665294A publication Critical patent/CN102665294A/en
Application granted granted Critical
Publication of CN102665294B publication Critical patent/CN102665294B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention provides a VSN event region detection method based on D-S evidence theory. The method can effectively detect the region in which an event occurs without prior knowledge, under highly mobile network topologies and in complex, changing road traffic scenes. The method comprises a VSN scenario initialization and maintenance module, a road sub-cell event monitoring probability module, a road sub-cell event occurrence probability module, an event occurrence confidence module, an evidence combination conflict computation module, an event region judgment module and an event detection trigger module.

Description

Vehicle-mounted sensor network event region detection method based on D-S evidence theory
Technical field
The present invention relates to the field of cooperative data processing and event monitoring in vehicle-mounted self-organizing wireless sensor networks, and more specifically to a new method that uses D-S evidence theory to describe the inconsistency of fused data in a vehicle-mounted sensor network and thereby detect event regions in a road traffic environment.
Background technology
With the popularization of automobiles and the development of sensing and wireless communication technologies, the vehicles traveling on a road can be equipped with sensor node devices and interconnected by wireless communication to self-organize into a vehicle-mounted (vehicular) sensor network. Such a network enables cooperative sensing, processing and transmission of road traffic information across an urban area, and is an important means of realizing intelligent transportation. Event monitoring is one of the important intelligent-transportation applications of vehicular sensor networks, and event region detection is a key technology within it: through effective cooperation and data fusion between vehicles, the location and extent of a road event can be detected, which directly affects the efficiency and performance of road emergency handling. Event region detection in wireless sensor networks and self-organizing networks is an application-oriented research topic that has been widely discussed in technical literature and research papers in recent years.
Related documents: R. Nowak et al. Boundary Estimation in Sensor Networks: Theory and Methods. In: Proc. IPSN 2003 [C], 2003; K. Ren et al. Secure and Fault-tolerant Event Boundary Detection in Wireless Sensor Networks. IEEE Transactions on Wireless Communications [J], 2008, 7(1); Cao Donglei et al. A fault-tolerant algorithm for event region detection in wireless sensor networks. Chinese Journal of Computers [J], 2007, 30(10); Zhang Shukui et al. A fault-tolerant event region detection algorithm based on fusion trees [J]. Journal on Communications, 2010, (09).
Existing event region detection techniques fall into three categories: statistical methods, classification-based methods, and methods based on image processing. Statistical methods mainly obtain data from neighboring nodes and use statistical computation to distinguish event nodes from non-event nodes. Classification-based methods process the data collected by all nodes; because the readings of nodes inside an event region differ markedly from those outside it, this difference can be used to classify network nodes into event-region nodes and non-event-region nodes. Methods based on image processing adapt image processing techniques to sensor network event region detection. Among the three, classification-based methods are simple to implement and have low complexity, and are therefore the most suitable implementation approach for event region detection.
The implementation of event region detection should be tied to the concrete application. In a vehicular sensor network, the high mobility of vehicles and the complexity of urban road traffic cause the network topology to change rapidly, so event monitoring cannot be realized simply by setting a threshold on sensor data; consequently, event region detection methods designed for general wireless ad hoc networks or wireless sensor networks do not transfer well to this scenario. For vehicular ad hoc networks, some researchers have proposed artificial-intelligence approaches in which local nodes cooperate and machine learning, support vector machines, Bayesian neural networks or hidden Markov models are used to extract and classify event features and to estimate the probability that an event has occurred. These methods can effectively monitor road events in a vehicular ad hoc network environment, but they require prior knowledge of the specific road and vehicle environment; because road traffic and vehicle behavior are strongly affected by the natural environment, the terrain and human factors, the reasonableness of these prior values directly affects system performance. In addition, these studies focus mainly on deciding whether an event has occurred rather than on the extent and position of the event region; in fact, the high mobility of vehicles in an in-vehicle network environment makes detecting the event region itself a technical challenge.
Related documents: J. Eriksson et al. The Pothole Patrol: Using a Mobile Sensor Network for Road Surface Monitoring [C]. In Proc. ACM MobiSys, 2008; V. D. Sanchez et al. Advanced support vector machines and kernel methods [J]. Neurocomputing, 2003, (55); S. Dipti. Evaluation of Adaptive Neural Network Models for Freeway Incident Detection [J]. IEEE Trans. on Intelligent Transportation Systems, 2004, 5(1); He Yi et al. Vehicle monitoring and tracking based on hidden Markov random field models [J]. Journal of Shanghai Jiao Tong University, 2008, (2); Zhang Cunbao et al. Research on automatic freeway traffic incident detection based on floating cars [J]. Journal of Wuhan University of Technology (Transportation Science and Engineering), 2006, (06); Zhang Jinglei et al. Progress in incident detection algorithms [J]. Journal of Wuhan University of Technology (Transportation Science and Engineering), 2005, (02).
Summary of the invention
In view of the problems and shortcomings of the existing methods described above, the present invention proposes a vehicular sensor network event region detection method oriented towards intelligent transportation that requires no prior knowledge and effectively improves monitoring efficiency. Under the dynamic topology of a vehicular ad hoc network, vehicles cooperate and fuse their data, and the evidence conflict between the event region and the surrounding non-event region is used to detect the event region effectively.
The technical scheme of the present invention is a vehicular sensor network event region detection method based on D-S evidence theory, characterized in that: the urban road area under event monitoring is divided into a number of sub-cells; the vehicles equipped with on-board sensors in this area are networked in Ad hoc mode to build a vehicular sensor network graph, in which each vehicle equipped with an on-board sensor forms a vehicle node and adjacent vehicle nodes are connected by an edge. When event region detection is carried out, the following steps are performed:
Step 1: from the distance of each vehicle node to the center of the sub-cell it is located in, calculate the monitoring weight of that node for the sub-cell; from the changes in the node's direction and speed, calculate the behavior factor of the node; from the historical sensed physical-quantity data of the sub-cell, calculate the sensed-data change rate; and from these results obtain the event occurrence probability of the sub-cell. The implementation is as follows.
Step a: let the geometric distance from the coordinates (x_nk, y_nk) of a vehicle node k to the center coordinates (x_ci, y_ci) of its sub-cell c_i be d_{k,i} = ||(x_nk, y_nk) - (x_ci, y_ci)||. The monitoring weight w(n_k, c_i) of vehicle node k for sub-cell c_i is calculated as
w(n_k, c_i) = 1 - d_{k,i}/r, if d_{k,i} <= r;  w(n_k, c_i) = 0, if d_{k,i} > r
where r is the monitoring (sensing) radius of the on-board sensor.
Step b: let the observation time sequence be (t, t'), let v_k be the speed of vehicle node k at time t and v_k' its speed at time t', and let max(v_k, v_k') be the larger of v_k and v_k'. If max(v_k, v_k') = 0, the behavior factor of vehicle node k is μ_k = 0; if max(v_k, v_k') ≠ 0, the behavior factor μ_k is calculated from the direction and speed change of the node, where α is a weight parameter and θ is the angle between the velocity vectors at t and t'.
Step c: vehicle node k obtains the historical sensed physical-quantity data of its sub-cell c_i over the observation time sequence (t, t'), whose length is denoted Δt, and computes the average of a physical quantity p over Δt as ave(p, Δt) = sum(p_data)/N(Δt), where sum(p_data) is the sum of the values of p monitored in sub-cell c_i during Δt and N(Δt) is the number of sensor readings taken during Δt.
Let max(ave(p, Δt), p_data(t')) be the larger of ave(p, Δt) and the value p_data(t') of p measured in real time at the current time t'. If max(ave(p, Δt), p_data(t')) = 0, the data change rate γ_ci = 0; if max(ave(p, Δt), p_data(t')) ≠ 0, the data change rate γ_ci is calculated as
γ_ci = |p_data(t') - ave(p, Δt)| / max(p_data(t'), ave(p, Δt))
Step d: from the monitoring weights w(n_k, c_i) of step a, the behavior factors μ_k of step b and the data change rate γ_ci of step c, the event occurrence probability Pr(c_i) of sub-cell c_i is obtained as
Pr(c_i) = λ · (1/K) · Σ_{k=1}^{K} w(n_k, c_i) · μ_k + (1 - λ) · γ_ci
where λ is an adjustment factor taking a constant value between 0 and 1, and K is the number of vehicle nodes in sub-cell c_i, with k = 1, 2, ..., K.
Step 2: according to D-S evidence theory, set the basic probability assignment function of event occurrence for each sub-cell. The implementation is as follows.
Let T denote that an event occurs in a sub-cell and F denote that no event occurs. The frame of discernment is Θ = {T, F}, and its power set is 2^Θ = {Φ, {T}, {F}, Θ = {T, F}}. m_i(T) denotes the basic probability assignment of the "event" state of sub-cell c_i, and m_i(F) and m_i(Θ) denote the basic probability assignments of the "no event" and "uncertain" states of sub-cell c_i respectively.
The basic probability assignment function m_i(T) of sub-cell c_i is set to the event occurrence probability Pr(c_i) calculated in step 1, i.e. m_i(T) ≡ Pr(c_i).
Step 3: according to D-S evidence theory, calculate the belief function value Bel_i(T) that an event occurs in each sub-cell over the observation time sequence (t, t'). The implementation is as follows.
From the basic probability assignment function m_i(T) obtained in step 2, the belief function value Bel_i(T) of sub-cell c_i over the observation time sequence (t, t') is calculated as
Bel_i(T) = m_i(T)
Step 4: according to D-S evidence theory, combine the belief function values of each sub-cell and its adjacent sub-cells to obtain the conflict value of the combined event evidence. The implementation is as follows.
Let SN be the set of neighboring sub-cells adjacent to sub-cell c_i in all directions. From the belief function values Bel_i(T) obtained in step 3, the combined-evidence conflict function Con(Bel_i, Bel_neigh(i)) is calculated as follows to obtain the conflict value of the combined event evidence:
Con(Bel_i, Bel_neigh(i)) = log(1/η),  η = Σ_{j=1}^{|SN|} Bel_i(T) · Bel_{neigh_j(i)}(T)
where Bel_neigh(i)(T) is the belief function value of each sub-cell neigh(i) adjacent to sub-cell c_i, neigh(i) ∈ SN, and |SN| is the number of sub-cells in the set SN.
Step 5: judge whether each sub-cell belongs to the event region. The implementation is as follows.
If the conflict value Con(Bel_i, Bel_neigh(i)) of the combined event evidence obtained in step 4 is greater than or equal to a preset threshold Con_th, sub-cell c_i is judged to lie inside the event region; otherwise sub-cell c_i is judged to lie in the non-event region.
In addition, event region detection is carried out periodically, or is triggered when the newly collected physical-quantity data from the vehicle sensors exceeds a preset value.
The technical scheme provided by the present invention suits the dynamic topology of vehicular ad hoc networks and the complex, changeable character of vehicular applications. It requires no prior knowledge, makes full use of the spatio-temporal correlation of the monitoring data in the network, and applies the D-S evidence conflict principle to recognize inconsistencies between the monitoring information of different regions; this evidence difference is used to classify sub-cells into event and non-event regions, solving the problem that event regions are difficult to monitor and identify effectively in a vehicular sensor network environment, and improving the accuracy and validity of event monitoring and judgment.
Brief description of the drawings
Fig. 1 is a schematic diagram of the region division of the embodiment of the present invention;
Fig. 2 is a schematic diagram of the vehicular sensor network for road event monitoring of the embodiment;
Fig. 3 is a schematic diagram of the random network topology formed at time t_0 by the 30 vehicle nodes of the embodiment;
Fig. 4 is a schematic diagram of the network topology at time t_1 and the randomly generated event of the embodiment;
Fig. 5 is a schematic diagram of the network topology at time t_2 and the continuing event of the embodiment;
Fig. 6 is the event region detection result of the embodiment;
Fig. 7 is the structure diagram of the embodiment.
Embodiment
The classification-based event region detection method uses the difference between the data collected inside and outside the event region to classify sub-cells and thereby determine the extent of the event. D-S evidence theory is an effective inference method for handling uncertainty. Let T denote that a monitored event occurs in a sub-cell of the covered road map and F denote that no event occurs; the frame of discernment is Θ = {T, F}. The D-S evidence conflict reflects the degree of disagreement between pieces of evidence at a specific place and time: when the conflict weighting function value tends to ∞, the combined evidence is in complete conflict with the evidence of the other region, indicating that the two regions belong to the event region and the non-event region respectively; when the conflict weighting function value is 0, there is no conflict and both regions belong to the non-event region. According to this rule that the evidence of the event region is inconsistent with that of its adjacent non-event region, the evidence conflict is used to identify and judge the extent of the event.
The embodiments of the invention are described in detail below with reference to the accompanying drawings. The present embodiment is carried out on the premise of the method of the invention, and a detailed implementation and a concrete operating process are given, but the embodiments of the invention are not limited to the following example.
The invention provides a vehicular sensor network event region detection method based on D-S evidence conflict. For simplicity of description, 30 vehicle nodes deployed in a specific monitored area are networked, and event region detection is illustrated with an elliptical event region generated at random at time t_1 after the run starts.
First, network initialization is carried out, including event-area initialization and in-vehicle network modeling.
The urban road area under event monitoring is divided into a number of sub-cells; the vehicles equipped with on-board sensors in this area are networked in Ad hoc mode to build a vehicular sensor network graph, in which each vehicle equipped with an on-board sensor forms a vehicle node and adjacent vehicle nodes are connected by an edge.
The event monitoring application monitors urban road conditions through the coverage provided by traveling vehicles. The urban road area under monitoring can first be approximated as a planar rectangular region; the plane map of this area is extracted and the monitored area is divided into multiple small sub-cells of identical size, C = {c_1, c_2, ..., c_n}. As shown in Fig. 1, the sub-cells can simply be divided by rows and columns, and each sub-cell c_i of the monitored area is uniquely identified by (X_ci, Y_ci), i = 1, ..., n, where X_ci is the row index and Y_ci is the column index. The sub-cell size depends on the required monitoring accuracy; it is specified by the upper-layer application and can be adjusted during operation. After the plane map of the road area is extracted, the vehicles traveling in it are networked in Ad hoc mode and modeled as a vehicular sensor network graph: each deployed vehicle carrying sensors is abstracted as a node of the graph, and the opportunistic wireless connections between vehicle nodes are abstracted as its edges. Besides communicating with each other, vehicle nodes can also connect wirelessly to roadside access points and, through them, to the Internet. Each vehicle node is assigned a unique ID at network initialization. Using the GPS installed on the vehicle, a node can obtain vehicle information such as its position in the area, speed and direction (and generally also the time, distance travelled, running time and acceleration); in addition, the various sensors installed inside and outside the vehicle, such as temperature/humidity sensors and 3-axis acceleration sensors, can monitor physical quantities of the road environment, for example pavement humidity and pavement temperature. The information monitored by a vehicle node can be delivered over opportunistic wireless links in a multi-hop manner to a roadside access point and finally to the monitoring center for road traffic conditions; traveling vehicles can also receive global messages issued by the monitoring center through the roadside access points, and the roadside access points and the monitoring center communicate over the Internet. After network initialization is complete, the vehicle nodes update the communication nodes and links of the network through opportunistic message transfer between vehicles and the global messages issued by the roadside access points. Through event region detection, events such as slippery pavement, potholes or road congestion can be identified.
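As an illustration of the row/column sub-cell division just described, the following Python sketch maps a vehicle coordinate to its sub-cell index (X_ci, Y_ci) and to the sub-cell center. The helper names and the cell size of 10 unit lengths are assumptions made for the example, not values fixed by the patent text.

```python
# Minimal sketch of the row/column sub-cell division, assuming square cells
# of side CELL_SIZE; the constant and function names are illustrative only.
CELL_SIZE = 10.0  # side length of one sub-cell, in unit lengths (assumption)

def cell_index(x, y, cell_size=CELL_SIZE):
    """Return the (X_ci, Y_ci) row/column index of the sub-cell containing (x, y)."""
    return int(x // cell_size), int(y // cell_size)

def cell_center(x_ci, y_ci, cell_size=CELL_SIZE):
    """Return the center coordinates of sub-cell (X_ci, Y_ci)."""
    return (x_ci + 0.5) * cell_size, (y_ci + 0.5) * cell_size

# Node 1 of the t1 snapshot lies at (1.77252, 43.788):
print(cell_index(1.77252, 43.788))                 # (0, 4)
print(cell_center(*cell_index(1.77252, 43.788)))   # (5.0, 45.0), matching the sub-cell center (5, 45) listed below
```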
Event region detection can be carried out periodically, or can be triggered when the newly collected data from the vehicle sensors exceeds a preset value.
When event region detection is carried out, the following steps are performed:
Step 1: from the distance of each vehicle node to the center of the sub-cell it is located in, calculate the monitoring weight of that node for the sub-cell; from the changes in the node's direction and speed, calculate the behavior factor of the node; from the historical sensed physical-quantity data of the sub-cell, calculate the sensed-data change rate; and from these results obtain the event occurrence probability of the sub-cell.
The implementation in this embodiment is as follows:
Step a: let the geometric distance from the coordinates (x_nk, y_nk) of a vehicle node k to the center coordinates (x_ci, y_ci) of its sub-cell c_i be d_{k,i} = ||(x_nk, y_nk) - (x_ci, y_ci)||. The monitoring weight w(n_k, c_i) of vehicle node k for sub-cell c_i is calculated as
w(n_k, c_i) = 1 - d_{k,i}/r, if d_{k,i} <= r;  w(n_k, c_i) = 0, if d_{k,i} > r        (formula 1)
where r is the monitoring (sensing) radius of the on-board sensor.
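A direct transcription of formula 1 into Python. The function and parameter names are illustrative; the check against node 4 of the t1 snapshot uses r = 15 unit lengths, which reproduces the monitoring weight listed for that node in the embodiment.

```python
import math

def monitoring_weight(node_xy, center_xy, r):
    """Formula 1: monitoring weight w(n_k, c_i) of vehicle node k for its sub-cell c_i.

    node_xy   -- (x_nk, y_nk), coordinates of the vehicle node
    center_xy -- (x_ci, y_ci), center coordinates of the sub-cell
    r         -- monitoring (sensing) radius of the on-board sensor
    """
    d = math.dist(node_xy, center_xy)       # geometric distance d_{k,i}
    return 1.0 - d / r if d <= r else 0.0

# Node 4 of the t1 snapshot: (55.3273, 58.5284), sub-cell center (55, 55), r = 15:
print(monitoring_weight((55.3273, 58.5284), (55, 55), 15))   # ≈ 0.763764, as listed
```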
Step b: over the observation time sequence (t, t'), let v_k be the speed of vehicle node k at time t and v_k' its speed at time t', and let max(v_k, v_k') be the larger of v_k and v_k'. If max(v_k, v_k') = 0, the behavior factor of vehicle node k is μ_k = 0; if max(v_k, v_k') ≠ 0, the behavior factor μ_k is calculated by formula 2.
The behavior of a vehicle is affected by road events and changes accordingly; for example, most vehicles change their direction of travel to avoid an event region. From the observation time sequence (t, t'), the speed v_k of node k at time t and its speed v_k' at time t', the behavior factor μ_k of each vehicle node is calculated as in formula 2, where α is a weight parameter that adjusts the relative weights of the direction change and the speed change in evaluating the vehicle behavior factor, θ is the angle between the velocity vectors, and max(v_k, v_k') is the larger of the node's speeds at t and t' (and must be non-zero); if max(v_k, v_k') = 0, the vehicle node is stationary and its behavior factor is taken as 0.
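The image carrying formula 2 is not reproduced in this text. The sketch below therefore uses a reconstruction, μ_k = α·θ/π + (1 − α)·|v_k′ − v_k| / max(v_k, v_k′), which matches the behavior factors listed later in the embodiment (α = 0.5; e.g. node 1 turns 90° and changes speed from 3 to 4, giving 0.375). Treat it as an assumption consistent with the worked example, not as the patent's literal formula.

```python
import math

def behavior_factor(v_t, v_t_prime, theta, alpha=0.5):
    """Behavior factor mu_k of a vehicle node over the observation window (t, t').

    v_t, v_t_prime -- speeds of the node at times t and t'
    theta          -- angle (radians) between the velocity vectors at t and t'
    alpha          -- weight balancing direction change against speed change

    Reconstruction of formula 2 (see the note above); not the literal patent formula.
    """
    v_max = max(v_t, v_t_prime)
    if v_max == 0:
        return 0.0                        # stationary node, behavior factor 0
    direction_term = theta / math.pi      # normalized direction change
    speed_term = abs(v_t_prime - v_t) / v_max
    return alpha * direction_term + (1 - alpha) * speed_term

# Node 1 of the embodiment: horizontal-right at speed 3, then vertical-up at speed 4:
print(behavior_factor(3, 4, math.pi / 2))   # 0.375, matching the listed value
```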
Step c: vehicle node k obtains the historical sensed physical-quantity data of its sub-cell c_i over the observation time sequence (t, t'), whose length is denoted Δt, and computes the average of a physical quantity p over Δt as ave(p, Δt) = sum(p_data)/N(Δt), where sum(p_data) is the sum of the values of p monitored in sub-cell c_i during Δt and N(Δt) is the number of sensor readings taken during Δt.
Let max(ave(p, Δt), p_data(t')) be the larger of ave(p, Δt) and the value p_data(t') of p measured in real time at the current time t'. If max(ave(p, Δt), p_data(t')) = 0, the data change rate γ_ci = 0; if max(ave(p, Δt), p_data(t')) ≠ 0, the data change rate γ_ci is calculated as
γ_ci = |p_data(t') - ave(p, Δt)| / max(p_data(t'), ave(p, Δt))        (formula 3)
A vehicle node can obtain the historical sensed data of the neighboring sub-cell c_i from a roadside access point and compute the corresponding average over the time interval Δt of (t, t'). In a concrete implementation, the number of sensor readings during Δt is related to the sampling frequency f of the sensor, N(Δt) = Δt·f. If max(ave(p, Δt), p_data(t')) is 0, no sensed data has been obtained for the region and the corresponding data change rate γ_ci is 0.
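Formula 3 in Python; the list of historical readings stands in for the data gathered over Δt, so its length plays the role of N(Δt) = Δt·f. Names and the sample values are illustrative.

```python
def data_change_rate(history, current_value):
    """Formula 3: data change rate gamma_ci of a sensed physical quantity p in sub-cell c_i.

    history       -- readings of p collected in the sub-cell during the interval Δt
    current_value -- p_data(t'), the real-time reading at the current time t'
    """
    if not history:
        return 0.0
    ave = sum(history) / len(history)        # ave(p, Δt) = sum(p_data) / N(Δt)
    denom = max(current_value, ave)
    if denom == 0:
        return 0.0                           # no sensed data obtained for this region
    return abs(current_value - ave) / denom

# e.g. a jump in pavement temperature relative to its recent average:
print(data_change_rate([20.0, 20.5, 21.0], 30.0))   # ≈ 0.317
```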
Step d: from the monitoring weights w(n_k, c_i) of step a, the behavior factors μ_k of step b and the data change rate γ_ci of step c, the event occurrence probability Pr(c_i) of sub-cell c_i is obtained as
Pr(c_i) = λ · (1/K) · Σ_{k=1}^{K} w(n_k, c_i) · μ_k + (1 - λ) · γ_ci        (formula 4)
where λ is an adjustment factor taking a constant value between 0 and 1, which balances the influence of node behavior and data change rate on the sub-cell event monitoring probability, and K is the number of vehicle nodes in sub-cell c_i, with k = 1, 2, ..., K.
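Formula 4 assembled from the three quantities above. The per-node inputs in the demo call are illustrative numbers, not taken from a specific sub-cell of the embodiment.

```python
def event_probability(weights, behaviors, gamma_ci, lam=0.5):
    """Formula 4: event occurrence probability Pr(c_i) of a sub-cell.

    weights   -- monitoring weights w(n_k, c_i) of the K vehicle nodes in the sub-cell
    behaviors -- behavior factors mu_k of the same K nodes, in the same order
    gamma_ci  -- data change rate of the sub-cell (formula 3)
    lam       -- adjustment factor lambda, a constant between 0 and 1
    """
    K = len(weights)
    node_term = sum(w * mu for w, mu in zip(weights, behaviors)) / K if K else 0.0
    return lam * node_term + (1 - lam) * gamma_ci

# One node with weight 0.76 and behavior factor 0.44, gamma_ci = 0.36, lambda = 0.5:
print(event_probability([0.76], [0.44], 0.36))   # ≈ 0.347
```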
Step 2: according to D-S evidence theory, set the basic probability assignment function of event occurrence for each sub-cell. The implementation is as follows.
Let T denote that an event occurs in a sub-cell and F denote that no event occurs. The frame of discernment is Θ = {T, F}, and its power set is 2^Θ = {Φ, {T}, {F}, Θ = {T, F}}. m_i(T) denotes the basic probability assignment of the "event" state of sub-cell c_i, and m_i(F) and m_i(Θ) denote the basic probability assignments of the "no event" and "uncertain" states of sub-cell c_i respectively.
The basic probability assignment function m_i(T) of sub-cell c_i is set to the event occurrence probability Pr(c_i) calculated in step 1; that is, for each sensor n_k the event occurrence probability function of cell c_i is m_i(T) ≡ Pr(c_i).
Step 3: according to D-S evidence theory, calculate the belief function value Bel_i(T) that an event occurs in each sub-cell over the observation time sequence (t, t'). The implementation is as follows.
From the basic probability assignment function m_i(T) obtained in step 2, the belief function value Bel_i(T) of sub-cell c_i over the observation time sequence (t, t') is calculated as
Bel_i(T) = m_i(T)        (formula 5)
In a concrete implementation, the average of the basic probability assignment functions obtained over an observation sequence spanning several time intervals Δt can also be used as the belief function value, to improve the accuracy of the result.
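A small sketch of the optional refinement just mentioned: when the observation spans several Δt windows, the average of their basic probability assignments is used as the belief value. The function name is illustrative.

```python
def belief_value(bpa_per_window):
    """Bel_i(T): by formula 5 it equals m_i(T) for a single window; over several
    Δt windows the average of the window BPAs can be used instead."""
    if not bpa_per_window:
        return 0.0
    return sum(bpa_per_window) / len(bpa_per_window)

print(belief_value([0.31, 0.35, 0.33]))   # ≈ 0.33
```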
Step 4: according to D-S evidence theory, combine the belief function values of each sub-cell and its adjacent sub-cells to obtain the conflict value of the combined event evidence. The implementation is as follows.
Let SN be the set of neighboring sub-cells adjacent to sub-cell c_i in all directions. From the belief function values Bel_i(T) obtained in step 3, the combined-evidence conflict function Con(Bel_i, Bel_neigh(i)) between adjacent sub-cells is calculated as
Con(Bel_i, Bel_neigh(i)) = log(1/η),  η = Σ_{j=1}^{|SN|} Bel_i(T) · Bel_{neigh_j(i)}(T)        (formula 6)
where Bel_neigh(i)(T) is the belief function value of each sub-cell neigh(i) adjacent to c_i, neigh(i) ∈ SN, and |SN| is the number of sub-cells in the set SN.
D-S evidence theory is an effective inference method for handling uncertainty, and the degree of disagreement between pieces of evidence at a specific place and time is reflected by the conflict value, which lies in the interval (0, ∞). By combining the event occurrence evidence of adjacent sub-cells, the conflicting evidence attributes of the event region and the non-event region are established; this evidence conflict is used to classify sub-cells according to the difference between the inside and the outside of the event region and thereby determine the extent of the event.
Step 5: judge whether each sub-cell belongs to the event region. For a sub-cell with Con >= Con_th, the combined evidence of the boundary region where it lies is in strong conflict with that of the non-event area, so the sub-cell falls within the event region. The implementation is as follows.
If the conflict value Con(Bel_i, Bel_neigh(i)) of the combined event evidence obtained in step 4 is greater than or equal to the preset threshold Con_th, sub-cell c_i is judged to lie inside the event region; otherwise sub-cell c_i is judged to lie in the non-event region.
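A sketch of formula 6 and the step-5 decision over a grid of belief values. Three reading choices here are assumptions rather than details fixed by the text: "adjacent in all directions" is taken as the up-to-8 surrounding sub-cells, the logarithm is the natural logarithm, and a cell whose η is zero is treated as having no conflict.

```python
import math

def conflict_value(bel, row, col):
    """Formula 6: Con(Bel_i, Bel_neigh(i)) = log(1/eta), with eta the sum of
    Bel_i(T) * Bel_neigh(i)(T) over the neighbor set SN of sub-cell (row, col)."""
    rows, cols = len(bel), len(bel[0])
    eta = 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < rows and 0 <= c < cols:
                eta += bel[row][col] * bel[r][c]
    if eta == 0.0:
        return 0.0            # assumption: no evidence product with neighbors -> no conflict
    return math.log(1.0 / eta)

def classify_cells(bel, con_th):
    """Step 5: a sub-cell belongs to the event region iff its conflict value
    reaches the threshold Con_th."""
    return [(r, c)
            for r in range(len(bel))
            for c in range(len(bel[0]))
            if conflict_value(bel, r, c) >= con_th]
```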
Whenever a network detection cycle expires, or whenever newly collected sensor data reaches the set threshold and re-triggers the event region detection process, the above steps are repeated until the road event monitoring application ends, so that real-time monitoring is achieved.
To make the effect of the present invention clear, a concrete test run of the embodiment is described in detail as follows.
After the plane information of the specific monitored road area is extracted and divided as in Fig. 1, the vehicular sensor network is deployed on this basis. Fig. 2 is the networking schematic of the vehicular sensor network for road event monitoring. On the basis of the monitoring plane map and sub-cell division of Fig. 1, with a communication radius of 15 unit lengths, the 30 vehicle nodes in the monitored area randomly form the topology shown in Fig. 3; that is, after the in-vehicle network initialization of the monitored area is complete (time t_0), the nodes and edges formed at random by the 30 sensor-carrying vehicle nodes have the following node coordinates:
Node id: 1, X coordinate: 0.127567, Y coordinate: 23.5572
Node id: 2, X coordinate: 71.2967, Y coordinate: 7.43736
Node id: 3, X coordinate: 29.7266, Y coordinate: 36.0027
Node id: 4, X coordinate: 28.0816, Y coordinate: 98.529
Node id: 5, X coordinate: 35.081, Y coordinate: 93.4233
Node id: 6, X coordinate: 19.6487, Y coordinate: 85.757
Node id: 7, X coordinate: 3.83374, Y coordinate: 0.85757
Node id: 8, X coordinate: 68.9435, Y coordinate: 48.5885
Node id: 9, X coordinate: 30.0287, Y coordinate: 80.2179
Node id: 10, X coordinate: 10.061, Y coordinate: 44.8012
Node id: 11, X coordinate: 70.4172, Y coordinate: 97.3266
Node id: 12, X coordinate: 72.6966, Y coordinate: 79.0612
Node id: 13, X coordinate: 61.9608, Y coordinate: 4.0376
Node id: 14, X coordinate: 96.9546, Y coordinate: 6.83615
Node id: 15, X coordinate: 32.6472, Y coordinate: 98.5076
Node id: 16, X coordinate: 8.21131, Y coordinate: 1.80059
Node id: 17, X coordinate: 2.81991, Y coordinate: 56.7248
Node id: 18, X coordinate: 56.7574, Y coordinate: 64.8122
Node id: 19, X coordinate: 79.263, Y coordinate: 38.5052
Node id: 20, X coordinate: 41.8152, Y coordinate: 2.64595
Node id: 21, X coordinate: 99.1433, Y coordinate: 20.9906
Node id: 22, X coordinate: 94.7456, Y coordinate: 54.5885
Node id: 23, X coordinate: 106.072, Y coordinate: 62.8193
Node id: 24, X coordinate: 40.9961, Y coordinate: 94.9461
Node id: 25, X coordinate: 25.4396, Y coordinate: 52.0829
Node id: 26, X coordinate: 25.6108, Y coordinate: 89.7366
Node id: 27, X coordinate: 41.221, Y coordinate: 41.2152
Node id: 28, X coordinate: 81.8714, Y coordinate: 35.551
Node id: 29, X coordinate: 41.3889, Y coordinate: 21.9001
Node id: 30, X coordinate: 7.82525, Y coordinate: 58.7909
In the present embodiment, vehicle nodes can travel in four directions - horizontally left, horizontally right, vertically up and vertically down - with average speeds in the range of 0 to 10 unit lengths per unit time. After initialization, each vehicle node moves in the monitored area at random according to a set path and a car-travel pattern; when it encounters an event region, the vehicle either detours around the region or slows down.
Suppose that at time t_1 the topology shown in Fig. 4 has formed and an event has occurred in the randomly generated elliptical region. The network topology information at t_1 - vehicle node coordinates, moving directions and speeds - is as follows.
Node id: 1, X coordinate: 1.77252, Y coordinate: 43.788, moving direction: horizontally right, speed: 3
Node id: 2, X coordinate: 46.6225, Y coordinate: 22.7485, moving direction: vertically up, speed: 2
Node id: 3, X coordinate: 53.5279, Y coordinate: 33.5154, moving direction: horizontally right, speed: 3
Node id: 4, X coordinate: 55.3273, Y coordinate: 58.5284, moving direction: vertically down, speed: 8
Node id: 5, X coordinate: 99.3313, Y coordinate: 79.0612, moving direction: vertically down, speed: 9
Node id: 6, X coordinate: 55.1729, Y coordinate: 5.91754, moving direction: vertically down, speed: 5
Node id: 7, X coordinate: 41.6474, Y coordinate: 75.1671, moving direction: horizontally right, speed: 5
Node id: 8, X coordinate: 23.5832, Y coordinate: 85.3359, moving direction: vertically down, speed: 5
Node id: 9, X coordinate: 3.9613, Y coordinate: 14.6123, moving direction: vertically down, speed: 4
Node id: 10, X coordinate: 51.8998, Y coordinate: 13.538, moving direction: vertically down, speed: 3
Node id: 11, X coordinate: 98.5827, Y coordinate: 83.9381, moving direction: vertically up, speed: 3
Node id: 12, X coordinate: 43.2621, Y coordinate: 90.0906, moving direction: vertically down, speed: 0
Node id: 13, X coordinate: 45.1454, Y coordinate: 66.3472, moving direction: horizontally right, speed: 6
Node id: 14, X coordinate: 46.6594, Y coordinate: 48.8083, moving direction: vertically up, speed: 8
Node id: 15, X coordinate: 4.50179, Y coordinate: 30.0485, moving direction: horizontally left, speed: 0
Node id: 16, X coordinate: 89.3106, Y coordinate: 77.4346, moving direction: horizontally right, speed: 7
Node id: 17, X coordinate: 87.6556, Y coordinate: 29.9936, moving direction: vertically down, speed: 9
Node id: 18, X coordinate: 74.5631, Y coordinate: 58.5437, moving direction: vertically up, speed: 8
Node id: 19, X coordinate: 14.4151, Y coordinate: 1.26652, moving direction: vertically up, speed: 1
Node id: 20, X coordinate: 37.5317, Y coordinate: 34.8949, moving direction: horizontally right, speed: 4
Node id: 21, X coordinate: 11.5381, Y coordinate: 72.1641, moving direction: vertically down, speed: 2
Node id: 22, X coordinate: 5.62304, Y coordinate: 52.3179, moving direction: horizontally right, speed: 2
Node id: 23, X coordinate: 24.9596, Y coordinate: 75.161, moving direction: horizontally left, speed: 5
Node id: 24, X coordinate: 86.1382, Y coordinate: 83.2575, moving direction: vertically up, speed: 3
Node id: 25, X coordinate: 95.8232, Y coordinate: 83.8313, moving direction: vertically down, speed: 3
Node id: 26, X coordinate: 28.1689, Y coordinate: 84.106, moving direction: horizontally left, speed: 4
Node id: 27, X coordinate: 75.0298, Y coordinate: 53.8743, moving direction: vertically down, speed: 3
Node id: 28, X coordinate: 90.979, Y coordinate: 89.462, moving direction: vertically up, speed: 9
Node id: 29, X coordinate: 95.9676, Y coordinate: 96.3683, moving direction: vertically down, speed: 6
Node id: 30, X coordinate: 45.9746, Y coordinate: 96.411, moving direction: horizontally right, speed: 6
The monitoring weight of each vehicle node for its sub-cell is calculated according to formula 1, taking the sensing radius r equal to the sub-cell radius; the results are as follows.
Monitoring weight of node 1 to its sub-cell center (5, 45): 0.770162
Monitoring weight of node 2 to its sub-cell center (45, 25): 0.814986
Monitoring weight of node 3 to its sub-cell center (45, 35): 0.42292
Monitoring weight of node 4 to its sub-cell center (55, 55): 0.763764
Monitoring weight of node 5 to its sub-cell center (95, 75): 0.604165
Monitoring weight of node 6 to its sub-cell center (55, 5): 0.937754
Monitoring weight of node 7 to its sub-cell center (35, 75): 0.556701
Monitoring weight of node 8 to its sub-cell center (25, 85): 0.902928
Monitoring weight of node 9 to its sub-cell center (5, 15): 0.926086
Monitoring weight of node 10 to its sub-cell center (45, 15): 0.529802
Monitoring weight of node 11 to its sub-cell center (85, 85): 0.0917224
Monitoring weight of node 12 to its sub-cell center (35, 95): 0.35929
Monitoring weight of node 13 to its sub-cell center (45, 65): 0.909663
Monitoring weight of node 14 to its sub-cell center (45, 45): 0.72306
Monitoring weight of node 15 to its sub-cell center (5, 35): 0.668235
Monitoring weight of node 16 to its sub-cell center (85, 75): 0.66996
Monitoring weight of node 17 to its sub-cell center (75, 25): 0.0929916
Monitoring weight of node 18 to its sub-cell center (65, 55): 0.320095
Monitoring weight of node 19 to its sub-cell center (15, 5): 0.748065
Monitoring weight of node 20 to its sub-cell center (35, 35): 0.831077
Monitoring weight of node 21 to its sub-cell center (15, 75): 0.701656
Monitoring weight of node 22 to its sub-cell center (5, 55): 0.816431
Monitoring weight of node 23 to its sub-cell center (25, 75): 0.988934
Monitoring weight of node 24 to its sub-cell center (75, 85): 0.248423
Monitoring weight of node 25 to its sub-cell center (85, 85): 0.274257
Monitoring weight of node 26 to its sub-cell center (25, 85): 0.780494
Monitoring weight of node 27 to its sub-cell center (65, 55): 0.327152
Monitoring weight of node 28 to its sub-cell center (85, 85): 0.502638
Monitoring weight of node 29 to its sub-cell center (85, 95): 0.263159
Monitoring weight of node 30 to its sub-cell center (45, 95): 0.885674
The elliptical event region generated in this embodiment persists during (t_1, t_2). The topology information at time t_2 - node coordinates, moving directions and moving speeds - is as follows.
Node id: 1, X coordinate: 4.77252, Y coordinate: 43.788, moving direction: vertically up, speed: 4
Node id: 2, X coordinate: 46.6225, Y coordinate: 20.7485, moving direction: vertically up, speed: 2
Node id: 3, X coordinate: 56.5279, Y coordinate: 33.5154, moving direction: horizontally right, speed: 3
Node id: 4, X coordinate: 55.3273, Y coordinate: 66.5284, moving direction: vertically down, speed: 1
Node id: 5, X coordinate: 99.3313, Y coordinate: 88.0612, moving direction: vertically down, speed: 10
Node id: 6, X coordinate: 55.1729, Y coordinate: 10.9175, moving direction: vertically down, speed: 5
Node id: 7, X coordinate: 46.6474, Y coordinate: 75.1671, moving direction: horizontally right, speed: 5
Node id: 8, X coordinate: 23.5832, Y coordinate: 90.3359, moving direction: vertically down, speed: 5
Node id: 9, X coordinate: 3.9613, Y coordinate: 18.6123, moving direction: vertically down, speed: 4
Node id: 10, X coordinate: 51.8998, Y coordinate: 16.538, moving direction: vertically down, speed: 3
Node id: 11, X coordinate: 98.5827, Y coordinate: 80.9381, moving direction: vertically down, speed: 4
Node id: 12, X coordinate: 43.2621, Y coordinate: 90.0906, moving direction: vertically down, speed: 0
Node id: 13, X coordinate: 51.1454, Y coordinate: 66.3472, moving direction: horizontally right, speed: 1
Node id: 14, X coordinate: 46.6594, Y coordinate: 40.8083, moving direction: vertically up, speed: 1
Node id: 15, X coordinate: 4.50179, Y coordinate: 30.0485, moving direction: horizontally left, speed: 1
Node id: 16, X coordinate: 96.3106, Y coordinate: 77.4346, moving direction: horizontally right, speed: 8
Node id: 17, X coordinate: 87.6556, Y coordinate: 38.9936, moving direction: vertically down, speed: 9
Node id: 18, X coordinate: 74.5631, Y coordinate: 50.5437, moving direction: vertically up, speed: 1
Node id: 19, X coordinate: 14.4151, Y coordinate: 0.266518, moving direction: vertically up, speed: 1
Node id: 20, X coordinate: 41.5317, Y coordinate: 34.8949, moving direction: horizontally right, speed: 4
Node id: 21, X coordinate: 11.5381, Y coordinate: 74.1641, moving direction: vertically down, speed: 2
Node id: 22, X coordinate: 7.62304, Y coordinate: 52.3179, moving direction: vertically up, speed: 3
Node id: 23, X coordinate: 19.9596, Y coordinate: 75.161, moving direction: horizontally left, speed: 5
Node id: 24, X coordinate: 86.1382, Y coordinate: 80.2575, moving direction: vertically down, speed: 4
Node id: 25, X coordinate: 95.8232, Y coordinate: 86.8313, moving direction: vertically down, speed: 4
Node id: 26, X coordinate: 24.1689, Y coordinate: 84.106, moving direction: horizontally left, speed: 4
Node id: 27, X coordinate: 75.0298, Y coordinate: 56.8743, moving direction: vertically down, speed: 1
Node id: 28, X coordinate: 90.979, Y coordinate: 80.462, moving direction: vertically down, speed: 10
Node id: 29, X coordinate: 95.9676, Y coordinate: 92, moving direction: vertically down, speed: 6
Node id: 30, X coordinate: 51.9746, Y coordinate: 96.411, moving direction: horizontally right, speed: 6
Take the observation time sequence (t_1, t_2). To determine whether a conflict has occurred during the observation time by computing the D-S conflict value, and hence to infer whether an event exists and where it is located, the node behavior factors are first computed from the t_1 and t_2 topologies according to formula 2; the results (with α = 0.5 in this embodiment) are as follows.
The behavior factor mu of node 1: 0.375
The behavior factor mu of node 2: 0
The behavior factor mu of node 3: 0
The behavior factor mu of node 4: 0.4375
The behavior factor mu of node 5: 0.05
The behavior factor mu of node 6: 0
The behavior factor mu of node 7: 0
The behavior factor mu of node 8: 0
The behavior factor mu of node 9: 0
The behavior factor mu of node 10: 0
The behavior factor mu of node 11: 0.625
The behavior factor mu of node 12: 0.5
The behavior factor mu of node 13: 0.416667
The behavior factor mu of node 14: 0.4375
The behavior factor mu of node 15: 0.5
The behavior factor mu of node 16: 0.0625
The behavior factor mu of node 17: 0
The behavior factor mu of node 18: 0.4375
The behavior factor mu of node 19: 0
The behavior factor mu of node 20: 0
The behavior factor mu of node 21: 0
The behavior factor mu of node 22: 0.416667
The behavior factor mu of node 23: 0
The behavior factor mu of node 24: 0.625
The behavior factor mu of node 25: 0.125
The behavior factor mu of node 26: 0
The behavior factor mu of node 27: 0.333333
The behavior factor mu of node 28: 0.55
The behavior factor mu of node 29: 0
The behavior factor mu of node 30: 0
In this embodiment, the sensed data value at the center of the event occurrence region is set to twice the sensed data value when no event occurs, and it decreases linearly with distance from the event center within the event region. On this basis, the sensed-data change rate γ_ci of each sub-cell c_i(X_ci, Y_ci) is calculated according to formula 3, where X_ci is the sub-cell row index with range 0-9 and Y_ci is the sub-cell column index with range 0-10. The result is the data change rate matrix γ over the sub-cells c(X_ci, Y_ci):
γ = (10 rows × 11 columns)
0       0       0       0       0       0       0       0       0       0       0
0       0       0       0       0       0.0253  0       0       0       0       0
0       0       0       0.0908  0.1888  0.2242  0.2007  0.1160  0       0       0
0       0       0.0651  0.2242  0.3184  0.3556  0.3306  0.2476  0.1035  0       0
0       0       0.1407  0.2945  0.3184  0.4487  0.4117  0.3184  0.1769  0       0
0       0       0.1529  0.3064  0.4117  0.4783  0.4287  0.3306  0.1889  0       0
0       0       0.1035  0.2593  0.3556  0.3965  0.3687  0.2827  0.1407  0       0
0       0       0       0.1529  0.2476  0.2827  0.2593  0.1769  0.0253  0       0
0       0       0       0       0.0780  0.1160  0.0908  0       0       0       0
0       0       0       0       0       0       0       0       0       0       0
Substituting the above results into formula 4 gives the sub-cell event occurrence probabilities; this embodiment takes the adjustment factor λ = 0.5, and the result is the event occurrence probability matrix Pr over the sub-cells c(X_ci, Y_ci).
Pr = (10 rows × 11 columns)
0       0       0       0       0       0       0       0       0       0       0
0       0       0       0       0       0.0253  0       0       0       0       0
0       0       0       0.0908  0.1888  0.2242  0.2007  0.1160  0       0       0
0.1671  0       0.0651  0.2242  0.3184  0.3556  0.3306  0.2476  0.1035  0       0
0.1444  0       0.1407  0.2945  0.3174  0.4487  0.4117  0.3184  0.1769  0       0
0.1701  0       0.1529  0.3064  0.4117  0.4062  0.2766  0.3306  0.1889  0       0
0       0       0.1035  0.2593  0.3673  0.4287  0.3687  0.2827  0.1407  0       0
0       0       0       0.1529  0.2476  0.2827  0.2593  0.1769  0.0336  0.0151  0
0       0       0       0       0.0780  0.1160  0.0908  0.1553  0.1841  0       0
0       0       0       0.0898  0       0       0       0       0       0       0
From the sub-cell event occurrence probabilities computed above, m_i(T) ≡ Pr(c_i) gives the basic probability assignment function of event occurrence for each sub-cell, m_i(T) = Pr(c_i), that is:
m(T) = (10 rows × 11 columns)
0       0       0       0       0       0       0       0       0       0       0
0       0       0       0       0       0.0253  0       0       0       0       0
0       0       0       0.0908  0.1888  0.2242  0.2007  0.1160  0       0       0
0.1671  0       0.0651  0.2242  0.3184  0.3556  0.3306  0.2476  0.1035  0       0
0.14444 0       0.1407  0.2945  0.3174  0.4487  0.4117  0.3184  0.1769  0       0
0.1701  0       0.1529  0.3064  0.4117  0.4062  0.2766  0.3306  0.1889  0       0
0       0       0.1035  0.2593  0.3673  0.4287  0.3687  0.2827  0.1407  0       0
0       0       0       0.1529  0.2476  0.2827  0.2593  0.1769  0.0336  0.0151  0
0       0       0       0       0.0780  0.1160  0.0908  0.1553  0.1841  0       0
0       0       0       0.0898  0       0       0       0       0       0       0
The belief function of event occurrence for each sub-cell over the specified observation time sequence is calculated by formula 5; the event occurrence belief values over the time sequence (t_1, t_2) computed in this embodiment are as follows:
Bel_i(T) = (10 rows × 11 columns)
0       0       0       0       0       0       0       0       0       0       0
0       0       0       0       0       0.0253  0       0       0       0       0
0       0       0       0.0908  0.1888  0.2242  0.2007  0.1160  0       0       0
0.1671  0       0.0651  0.2242  0.3184  0.3556  0.3306  0.2476  0.1035  0       0
0.1444  0       0.1407  0.2945  0.3174  0.4487  0.4117  0.3184  0.1769  0       0
0.1701  0       0.1529  0.3064  0.4177  0.4062  0.2766  0.3306  0.1889  0       0
0       0       0.1035  0.2593  0.3673  0.4287  0.3687  0.2827  0.1407  0       0
0       0       0       0.1529  0.2476  0.2827  0.2593  0.1769  0.0336  0.0151  0
0       0       0       0       0.0780  0.1160  0.0908  0.1553  0.1841  0       0
0       0       0       0.0898  0       0       0       0       0       0       0
In fact, in a practical application the observation period can span several time intervals Δt, to improve the accuracy of the result.
On the basis of the above results, the evidence conflict of event occurrence for each sub-cell is calculated; the per-sub-cell event evidence conflict factors computed by formula 6 form the following Con matrix:
Con = (10 rows × 11 columns)
0       0       0       0       0       0       0       0       0       0       0
0       0       0       0       0.0108  0       0       0       0       0       0
0       0       0       0.0687  0.2067  0.2835  0.2100  0.0622  0       0       0
0.0244  0       0.0439  0.2791  0.7289  1.1508  0.8706  0.3690  0.05757 0       0
0.0499  0       0.1224  0.5649  1.1018  2.7480  1.8859  0.7699  0.1645  0       0
0.0249  0       0.1322  0.6898  2.0637  2.4614  0.9728  0.7662  0.2016  0       0
0       0       0.06027 0.4232  1.2422  2.2878  1.1235  0.4703  0.1275  0       0
0       0       0       0.1112  0.3726  0.5565  0.4939  0.2576  0.0272  0.0026  0
0       0       0       0       0.0411  0.0846  0.0767  0.1170  0.0697  0       0
0       0       0       0       0       0       0       0       0       0       0
By comparing against the set Con threshold, the event occurrence region can be detected: all sub-cells c_i satisfying Con_i > Con_th together form the event occurrence region. In this embodiment Con_th = 0.5, and the detected event region is as shown in Fig. 6.
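The final extraction step of the embodiment, sketched for a conflict matrix like the one above; the constant and function names are illustrative.

```python
CON_TH = 0.5   # threshold used in this embodiment

def detect_event_cells(con_matrix, con_th=CON_TH):
    """Return the (row, column) indices of every sub-cell with Con > con_th;
    together these sub-cells form the detected event region."""
    return [(r, c)
            for r, row in enumerate(con_matrix)
            for c, value in enumerate(row)
            if value > con_th]
```

Applied to the Con matrix above with Con_th = 0.5, this selects the contiguous block of high-conflict sub-cells in the middle of the grid, which corresponds to the elliptical event region of Fig. 6.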
Whenever a monitoring period expires, or the sensed data exceeds the threshold, the above process is triggered to detect and update the event occurrence region in real time, until the monitoring task ends.
In a concrete implementation, the present invention can also be realized with software modularization techniques; as shown in Fig. 7, it can comprise the following parts:
(1) On-board sensor network scenario initialization and maintenance module: models the monitored urban road area, builds and updates the wireless Ad hoc in-vehicle network formed by the vehicles, and produces the sub-cell division of the monitored area and the basic network topology information.
(2) Sub-cell event monitoring probability module: processes the road sub-cell division produced by scenario initialization and the sensor data collected by the vehicle nodes in each sub-cell, computes the behavior-change factor and data change rate of the vehicle nodes, and from these obtains the event monitoring probability of each sub-cell.
(3) Sub-cell event occurrence probability module: from the event monitoring probabilities of the vehicle nodes for their sub-cells and D-S evidence theory, obtains the event occurrence probability of each road sub-cell.
(4) Event occurrence confidence module: from the sub-cell event occurrence probability results over the continuous observation time sequence, obtains the event occurrence belief function result of each sub-cell over that time sequence.
(5) Evidence combination conflict computation module: combines the evidence of each pair of adjacent sub-cells, computes the degree of difference (conflict value) of the combined information evidence used for classification, and outputs the conflict value of the combined evidence.
(6) Event region judgment module: classifies the evidence according to the conflict value computed after combination; sub-cells whose conflict value is greater than or equal to the conflict threshold are judged to contain the event, and the sequence of such sub-cells forms the event region.
(7) Event detection trigger module: according to the network period timer and the monitored sensed data, triggers the event detection process and re-runs event region detection whenever the timing period expires or the sensed data exceeds the set threshold.
The road model and network topology parameter set output by the vehicular sensor network scenario initialization and maintenance module are input to the sub-cell event monitoring probability module and the event detection trigger module. Based on the global information obtained at network initialization, the event detection module decides when to start detecting events and their occurrence regions with the method described herein; detection is triggered either periodically by the timer or by the sensor data collected in the network exceeding a set threshold, i.e. when the sensor data of some area in the network exceeds the predefined safety value, the event detection process for that road area is started. The per-sub-cell event monitoring probabilities of the vehicle nodes computed by the sub-cell event monitoring probability module are input to the sub-cell event occurrence probability module; the sub-cell event occurrence probabilities computed by that module are input to the event occurrence confidence module; the event occurrence belief values over the observation time sequence computed by the confidence module are input to the evidence combination conflict computation module; and the evidence conflict values computed by that module are input to the event region judgment module, which detects the event region. The event detection trigger command output by the event detection trigger module is input to the sub-cell event monitoring probability module to re-run the event detection process.
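For orientation, the sketch below wires the seven modules together as a plain Python driver loop. The network object and all of its methods are hypothetical stand-ins for modules (1)-(7); the trigger handling is simplified, and none of these names come from the patent.

```python
import time

def run_event_monitoring(network, con_th=0.5, period_s=60.0, data_threshold=None):
    """Illustrative driver for the module pipeline (1)-(7); see the note above."""
    scenario = network.init_scenario()                 # module (1): sub-cells + topology
    while not network.monitoring_finished():
        time.sleep(period_s)                           # module (7): periodic trigger
        readings = network.collect(scenario)           # sensor data gathered per sub-cell
        if data_threshold is not None and readings.max_value() < data_threshold:
            continue                                   # module (7): data-driven trigger not hit
        pr = network.monitoring_probability(scenario, readings)   # modules (2)-(3): Pr(c_i)
        bel = network.belief(pr)                       # module (4): Bel_i(T) = m_i(T) = Pr(c_i)
        con = network.conflict(bel)                    # module (5): formula 6 per sub-cell
        region = [cell for cell, value in con.items() if value >= con_th]   # module (6)
        network.report(region)                         # detected event region for this cycle
```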

Claims (2)

1. A vehicular sensor network event region detection method based on D-S evidence theory, characterized in that: the urban road area under event monitoring is divided into a number of sub-cells; the vehicles equipped with on-board sensors in this area are networked in Ad hoc mode to build a vehicular sensor network graph, in which each vehicle equipped with an on-board sensor forms a vehicle node and adjacent vehicle nodes are connected by an edge; when event region detection is carried out, the following steps are performed:
Step 1: from the distance of each vehicle node to the center of the sub-cell it is located in, calculate the monitoring weight of that node for the sub-cell; from the changes in the node's direction and speed, calculate the behavior factor of the node; from the historical sensed physical-quantity data of the sub-cell, calculate the sensed-data change rate; and from these results obtain the event occurrence probability of the sub-cell; the implementation is as follows,
Step a: let the geometric distance from the coordinates (x_nk, y_nk) of a vehicle node k to the center coordinates (x_ci, y_ci) of its sub-cell c_i be d_{k,i} = ||(x_nk, y_nk) - (x_ci, y_ci)||; the monitoring weight w(n_k, c_i) of vehicle node k for sub-cell c_i is calculated as
w(n_k, c_i) = 1 - d_{k,i}/r, if d_{k,i} <= r;  w(n_k, c_i) = 0, if d_{k,i} > r
where r is the monitoring (sensing) radius of the on-board sensor;
Step b: let the observation time sequence be (t, t'), let v_k be the speed of vehicle node k at time t and v_k' its speed at time t', and let max(v_k, v_k') be the larger of v_k and v_k'; if max(v_k, v_k') = 0, the behavior factor of vehicle node k is μ_k = 0; if max(v_k, v_k') ≠ 0, the behavior factor μ_k of vehicle node k is calculated from the direction and speed change of the node,
where α is a weight parameter and θ is the angle between the velocity vectors;
Step c: vehicle node k obtains the historical sensed physical-quantity data of its sub-cell c_i over the observation time sequence (t, t'), whose length is denoted Δt, and computes the average of a physical quantity p over Δt as ave(p, Δt) = sum(p_data)/N(Δt), where sum(p_data) is the sum of the values of p monitored in sub-cell c_i during Δt and N(Δt) is the number of sensor readings taken during Δt;
let max(ave(p, Δt), p_data(t')) be the larger of ave(p, Δt) and the value p_data(t') of p measured in real time at the current time t'; if max(ave(p, Δt), p_data(t')) = 0, the data change rate γ_ci = 0; if max(ave(p, Δt), p_data(t')) ≠ 0, the data change rate γ_ci is calculated as
γ_ci = |p_data(t') - ave(p, Δt)| / max(p_data(t'), ave(p, Δt))
Step d: from the monitoring weights w(n_k, c_i) of step a, the behavior factors μ_k of step b and the data change rate γ_ci of step c, the event occurrence probability Pr(c_i) of sub-cell c_i is obtained as
Pr(c_i) = λ · (1/K) · Σ_{k=1}^{K} w(n_k, c_i) · μ_k + (1 - λ) · γ_ci
where λ is an adjustment factor taking a constant value between 0 and 1, K is the number of vehicle nodes in sub-cell c_i, and k = 1, 2, ..., K;
Step 2: according to D-S evidence theory, set the basic probability assignment function of event occurrence for each sub-cell; the implementation is as follows,
let T denote that an event occurs in a sub-cell and F denote that no event occurs; the frame of discernment is Θ = {T, F}, and its power set is 2^Θ = {Ф, {T}, {F}, Θ = {T, F}}; m_i(T) denotes the basic probability assignment of the "event" state of sub-cell c_i, and m_i(F) and m_i(Θ) denote the basic probability assignments of the "no event" and "uncertain" states of sub-cell c_i respectively;
the basic probability assignment function m_i(T) of sub-cell c_i is set to the event occurrence probability Pr(c_i) calculated in step 1, i.e. m_i(T) ≡ Pr(c_i);
Step 3: according to D-S evidence theory, calculate for each sub-cell the belief function value Bel_i(T) that an event occurs over the observation time sequence (t, t'). The implementation is as follows:

From the basic probability assignment function m_i(T) obtained in step 2, the belief function value Bel_i(T) that an event occurs in sub-cell c_i over the observation time sequence (t, t') is calculated as

Bel_i(T) = m_i(T)
Step 4: according to D-S evidence theory, fuse the belief function values of each sub-cell and its adjacent sub-cells to obtain the conflict value between the fused event evidence. The implementation is as follows:

Let the neighboring sub-cells adjacent to sub-cell c_i in all directions form the set SN. From the belief function values Bel_i(T) obtained in step 3, the fused-evidence conflict function Con(Bel_i, Bel_{neigh(i)}) is calculated as follows, yielding the conflict value between the fused event evidence:

$$\mathrm{Con}(Bel_i, Bel_{neigh(i)}) = \log(1/\eta), \qquad \eta = \sum_{1}^{|SN|} Bel_i(T) \cdot Bel_{neigh(i)}(T)$$

where Bel_{neigh(i)}(T) is the belief function value of each sub-cell neigh(i) adjacent to sub-cell c_i, neigh(i) ∈ SN, and |SN| is the number of sub-cells in the set SN;
Step 5: judge whether each sub-cell belongs to the event region. The implementation is as follows:

When the conflict value Con(Bel_i, Bel_{neigh(i)}) between the fused event evidence obtained in step 4 is greater than or equal to a preset threshold Con_th, sub-cell c_i is judged to lie in the event region; otherwise, sub-cell c_i is judged to lie in a non-event region (an illustrative computational sketch of steps 1 through 5 is given after the claims).
2. The vehicle-mounted sensor network event region detection method based on D-S evidence theory as claimed in claim 1, characterized in that: event region detection is carried out periodically, or is triggered when the update of a physical quantity collected by the vehicle-mounted sensors exceeds a certain preset value.
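The following is a minimal, non-authoritative Python sketch of the detection pipeline in claim 1 (steps 1 through 5). The class names VehicleNode and SubCell, the function names, and all parameter values (sensing radius r, adjustment factor λ, weight α, threshold Con_th) are illustrative assumptions rather than elements of the patent. In particular, because the exact behavior-factor formula is not reproduced in the text above, behavior_factor() uses an assumed form that combines the relative speed change and the normalized direction-change angle θ with weight α.

```python
import math
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class VehicleNode:
    x: float            # node coordinate x_nk
    y: float            # node coordinate y_nk
    v_t: float          # speed at time t
    v_t1: float         # speed at time t'
    theta: float        # direction-change angle between velocity vectors, radians


@dataclass
class SubCell:
    cx: float                                              # sub-cell center x_ci
    cy: float                                              # sub-cell center y_ci
    nodes: List[VehicleNode] = field(default_factory=list)
    p_history: List[float] = field(default_factory=list)   # values of p during Δt
    p_now: float = 0.0                                      # real-time value p_data(t')


def monitoring_weight(node: VehicleNode, cell: SubCell, r: float) -> float:
    """Step a: w(n_k, c_i) = 1 - d_{k,i}/r if d_{k,i} <= r, else 0."""
    d = math.hypot(node.x - cell.cx, node.y - cell.cy)
    return 1.0 - d / r if d <= r else 0.0


def behavior_factor(node: VehicleNode, alpha: float = 0.5) -> float:
    """Step b, assumed form: α·|v' - v|/max(v, v') + (1 - α)·θ/π.
    The patent's exact expression is not reproduced in the claim text."""
    v_max = max(node.v_t, node.v_t1)
    if v_max == 0:
        return 0.0
    return alpha * abs(node.v_t1 - node.v_t) / v_max + (1.0 - alpha) * node.theta / math.pi


def data_change_rate(cell: SubCell) -> float:
    """Step c: γ_ci = |p_data(t') - ave(p, Δt)| / max(p_data(t'), ave(p, Δt))."""
    if not cell.p_history:
        return 0.0
    ave = sum(cell.p_history) / len(cell.p_history)
    denom = max(cell.p_now, ave)
    return abs(cell.p_now - ave) / denom if denom != 0 else 0.0


def event_probability(cell: SubCell, r: float, lam: float = 0.6, alpha: float = 0.5) -> float:
    """Step d: Pr(c_i) = λ·(1/K)·Σ_k w(n_k, c_i)·μ_k + (1 - λ)·γ_ci."""
    K = len(cell.nodes)
    weighted = (sum(monitoring_weight(n, cell, r) * behavior_factor(n, alpha)
                    for n in cell.nodes) / K) if K else 0.0
    return lam * weighted + (1.0 - lam) * data_change_rate(cell)


def detect_event_region(cells: Dict[int, SubCell],
                        neighbors: Dict[int, List[int]],
                        r: float, con_th: float,
                        lam: float = 0.6, alpha: float = 0.5) -> Dict[int, bool]:
    """Steps 2-5: set m_i(T) = Pr(c_i), take Bel_i(T) = m_i(T), fuse over the
    neighbor set SN via Con = log(1/η) with η = Σ Bel_i(T)·Bel_neigh(i)(T),
    and flag sub-cell c_i as event region when Con >= Con_th (claim 1, step 5)."""
    bel = {cid: event_probability(cell, r, lam, alpha) for cid, cell in cells.items()}
    region = {}
    for cid in cells:
        eta = sum(bel[cid] * bel[nid] for nid in neighbors.get(cid, []))
        con = math.log(1.0 / eta) if eta > 0 else float("inf")
        region[cid] = con >= con_th
    return region
```

As a usage illustration with placeholder values: for two adjacent sub-cells, cells = {0: SubCell(0.0, 0.0, nodes=[VehicleNode(10.0, 5.0, 12.0, 2.0, 0.8)], p_history=[1.0, 1.1], p_now=3.0), 1: SubCell(100.0, 0.0)} and neighbors = {0: [1], 1: [0]}, the call detect_event_region(cells, neighbors, r=100.0, con_th=1.0) returns a mapping from each sub-cell identifier to a boolean event-region flag; the radius, threshold, λ, and α here are not values prescribed by the patent.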
CN201210125372.8A 2012-04-25 2012-04-25 Vehicular sensor networks (VSN) event region detection method based on Dempster-Shafer (D-S) evidence theory Expired - Fee Related CN102665294B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210125372.8A CN102665294B (en) 2012-04-25 2012-04-25 Vehicular sensor networks (VSN) event region detection method based on Dempster-Shafer (D-S) evidence theory

Publications (2)

Publication Number Publication Date
CN102665294A CN102665294A (en) 2012-09-12
CN102665294B true CN102665294B (en) 2014-09-03

Family

ID=46774678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210125372.8A Expired - Fee Related CN102665294B (en) 2012-04-25 2012-04-25 Vehicular sensor networks (VSN) event region detection method based on Dempster-Shafer (D-S) evidence theory

Country Status (1)

Country Link
CN (1) CN102665294B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102932812B (en) * 2012-11-06 2014-11-19 武汉大学 Vehicle sensor concurrent monitoring method facing road conditions
CN104427505B (en) * 2013-09-11 2018-05-11 中国移动通信集团设计院有限公司 A kind of method and device of cell scenario division
CN103824456B (en) * 2014-02-28 2016-03-30 武汉大学 A kind of vehicle mounted sensor network real-time road event recommendation method
CN107368069B (en) * 2014-11-25 2020-11-13 浙江吉利汽车研究院有限公司 Automatic driving control strategy generation method and device based on Internet of vehicles
CN106650785B (en) * 2016-11-09 2019-05-03 河南大学 Weighted evidence fusion method based on the classification of evidence and measure method for conflict
CN108777064A (en) * 2018-05-24 2018-11-09 深圳市益鑫智能科技有限公司 A kind of traffic behavior assessment system based on information fusion
CN109272745B (en) * 2018-08-20 2020-10-27 浙江工业大学 Vehicle track prediction method based on deep neural network
CN109087511B (en) * 2018-10-18 2019-07-30 长安大学 A kind of road safety message method for evaluating trust merging Dynamic Traffic Flow feature
CN109543746B (en) * 2018-11-20 2019-09-10 河海大学 A kind of sensor network Events Fusion and decision-making technique based on node reliability
CN110717511A (en) * 2019-09-04 2020-01-21 中国科学院国家空间科学中心 Mobile mode classification method for mobile self-organizing network node

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7970574B2 (en) * 2005-06-22 2011-06-28 The Board Of Trustees Of The Leland Stanford Jr. University Scalable sensor localization for wireless sensor networks

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102143547A (en) * 2011-01-24 2011-08-03 中国人民大学 Continuous Top-k region query method in wireless sensor network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zeng Yuanyuan. "Applying behavior recognition in road detection using vehicle sensor network." Computing, Networking and Communications (ICNC), 2012 International Conference on, Jan. 30, 2012 - Feb. 2, 2012. IEEE, 2012. Full text. *

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20140903; termination date: 20150425)
EXPY Termination of patent right or utility model