CN112651506A - Data pre-deployment method based on edge equipment rule inference in intelligent environment - Google Patents


Info

Publication number: CN112651506A
Application number: CN202011545218.7A
Authority: CN (China)
Prior art keywords: rule, window, feature, basic, sliding window
Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Other languages: Chinese (zh)
Inventors: Wang Chengliang (汪成亮), Zhao Kai (赵凯)
Current and original assignee: Chongqing University
Application filed by Chongqing University
Priority to CN202011545218.7A
Publication of CN112651506A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 — Computing arrangements using knowledge-based models
    • G06N5/04 — Inference or reasoning models
    • G06N5/046 — Forward inferencing; Production systems
    • G06N5/047 — Pattern matching networks; Rete networks
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 — Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/22 — Indexing; Data structures therefor; Storage structures
    • G06F16/2282 — Tablespace storage structures; Management thereof
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 — Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 — Querying
    • G06F16/245 — Query processing
    • G06F16/2458 — Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462 — Approximate or statistical queries
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 — Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 — Querying
    • G06F16/245 — Query processing
    • G06F16/2458 — Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2474 — Sequence data queries, e.g. querying versioned data
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 — Computing arrangements using knowledge-based models
    • G06N5/02 — Knowledge representation; Symbolic representation
    • G06N5/022 — Knowledge engineering; Knowledge acquisition
    • G06N5/025 — Extracting rules from data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Fuzzy Systems (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Complex Calculations (AREA)

Abstract

The invention discloses a data pre-deployment method based on edge-device rule inference in an intelligent environment, which comprises the following steps. S1: extract the feature computations in the rules as statistical units and build a rule network. S2: parse and preprocess the rules. S3: build a lightweight characteristic table from the statistical units, and compute and store feature values in advance. S4: establish a direct reference relation between the rule network and the feature table, monitor the memory usage of the lightweight characteristic table in real time, and design LCT pre-deployment. S1 specifically comprises: S101, acquiring a time-series data stream; S102, performing rule inference on the acquired data stream. Because feature values are computed and stored in advance in the lightweight characteristic table built from the statistical units, and the rule network references those values directly, real-time feature computation over most of the raw data during rule matching is avoided.

Description

Data pre-deployment method based on edge equipment rule inference in intelligent environment
Technical Field
The invention relates to the technical field of intelligent equipment data preprocessing, in particular to a data pre-deployment method based on edge equipment rule reasoning in an intelligent environment.
Background
In recent years, with the development of Wireless Sensor Networks (WSNs), intelligent environment systems have been widely studied. An intelligent environment is an interactive space containing a large number of edge devices; its goal is to perform tasks such as activity recognition and sleep monitoring based on the environment information acquired by the sensors in those devices. The relationship between an intelligent environment system and its edge devices is shown in fig. 1. Multiple edge devices are deployed in the environment, each with an independent data acquisition module and rule inference module; the sensors embedded in an edge device acquire environment information and periodically transmit it to the rule inference module. In the rule inference module, the rules stored in the rule base are generally converted from expert knowledge in the relevant field; their structure resembles "IF-THEN", where IF is followed by a condition (the rule antecedent) and THEN by a response (the rule consequent). The working memory is a global database of data elements: it generally contains both the raw data collected at specific frequencies by the sensors deployed in the intelligent environment and the temporary data generated during inference, and all data in the working memory can be referenced by rule conditions. The runtime system matches the data in the current working memory against the rules in the rule base, and during matching a computing engine performs the inference computations in the rule antecedents, including feature computation, comparison computation, and logical computation. If the result of the inference computation is true, the rule is activated; the activated rule triggers the corresponding response and may generate temporary data that participates in subsequent rule inference. However, the greatest defect of existing rule inference modules is low matching efficiency: more than 90% of system time is spent in the matching process, which cannot meet the real-time requirements of an intelligent environment system. In an intelligent environment system, the low matching efficiency mainly has the following two causes:
First, rules are matched repeatedly. In most applications of an intelligent environment system, the rules in a rule set may overlap in their antecedents, or in finer-grained computation units. As the intelligent environment system develops, the scale of the rule set grows rapidly; if the data in the working memory is matched against every rule one by one, repeated matching of the overlapping parts cannot be avoided, and the matching time complexity grows exponentially with the number of rules and the data scale.
Second, the amount of inference computation is excessive. With the development of wireless sensor networks, more and more data are generated in an intelligent environment system, and the amount of feature computation over raw data during rule matching multiplies as the data volume and the number of rules grow. Existing inference systems perform a large amount of real-time feature computation on raw data during rule matching, which greatly reduces matching efficiency.
For repeated matching of rules, the RETE algorithm proposed by Forgy in 1979 offers a good solution. RETE is an incremental matching inference algorithm that builds a rule network to share patterns among rules and thus accelerate matching. Many RETE-based inference systems (such as CLIPS, JESS, Drools, and BizTalk) are in commercial use; however, these systems are usually too heavy and complex, and, more importantly, RETE lacks a solution to the problem of excessive inference computation, so such systems cannot be applied directly to inference in an intelligent environment. In addition, before rule matching completes, a large amount of raw data used for inference computation must be cached in memory, and the contradiction between the limited memory resources of edge devices and this huge memory requirement becomes a major bottleneck for real-time inference.
disclosure of Invention
In order to solve the above technical problems, the invention provides a data pre-deployment method based on edge-device rule inference in an intelligent environment, which pre-parses and preprocesses the rules. First, a rule network is built to avoid repeated matching of rules, and at the same time the feature computations in the rule antecedents are extracted as statistical units. Then, feature values are computed and stored in advance in a Light-weight Characteristic Table (LCT) built from the statistical units; by letting the rule network reference the feature values in the LCT directly, real-time feature computation over most of the raw data during rule matching is avoided. Finally, based on the fact that the system has a certain static predictability of the LCT's memory usage, an LCT pre-deployment strategy is designed to cope with limited system memory.
The invention is realized by the following technical scheme:
a data pre-deployment method based on edge equipment rule inference in an intelligent environment comprises the following steps:
s1: extracting feature calculation in the rule as a statistical unit and establishing a rule network;
s2: analyzing and preprocessing the rule;
s3: establishing a lightweight feature table according to a statistical unit, and calculating and storing feature values in advance;
s4: establishing a direct reference relation between a rule network and a feature table, monitoring the memory usage amount of a lightweight feature table in real time, and designing LCT pre-deployment;
S1 specifically includes the following steps:
S101: acquiring a time-series data stream;
S102: performing rule inference on the acquired data stream.
As a preferable scheme, S101 specifically includes the following steps: in the intelligent environment system, input data is regarded as a sequence of triples (streamID, timestamp, value) arranged in time-increasing order, and one data stream consists of the triples with the same streamID. Let the data stream be the infinite time series S = <e0,t0>, <e1,t1>, …, <en,tn> arranged in time-increasing order, where ei is the sequence element appearing at time ti. The data stream periodically appends new tuples at a specific time interval; the growth period and the flow rate of the data stream are acquired.
As a preferable scheme, S102 specifically includes the following steps: a rule whose condition involves only logical AND (∧) connections is an atomic rule;
wherein: the minimal condition in a rule that cannot be further split by logical operators is an atomic condition; a condition formed by connecting one or more atomic conditions and zero or more combined conditions through logical operators is a combined condition. The atomic conditions are merged into combined conditions to construct the rule network.
As a preferable scheme, S2 specifically includes the following steps: let RS[i] denote the rule subset formed by all rules whose inference period is period_i; the rule subset is converted into a rule network through rule parsing and preprocessing. In the rule network, each atomic condition is assigned an index number; the atomic conditions are parsed, the features in them are extracted, and the statistical unit sequence SU[1], SU[2], …, SU[m] of the rule subset is computed.
As a preferable scheme, S3 specifically includes the following steps. Let stat1(S,T1), stat2(S,T2), …, statn(S,Tn) be the feature computations, operating on data stream S, that correspond to the sequence of preprocessed statistical units extracted from a rule subset; the sliding windows of these feature computations are SW[t−T1:t], SW[t−T2:t], …, SW[t−Tm:t], where m < n. First, the largest sliding window is divided into multiple contiguous, smaller basic windows of equal length, so that every sliding window can be composed of several contiguous basic windows. Let the data-stream subsequence in the largest sliding window be S[t−w:t] and let w = kb, where k denotes the number of basic windows in the largest sliding window and b the length of a basic window. Let BW[0], BW[1], …, BW[k−1] denote the basic-window sequence in the largest sliding window SW, where BW[i] = S[t−w+ib : t−w+(i+1)b]. As the sliding window slides forward, a new basic window BW[k] is generated and the basic window BW[0] expires; before basic window BW[i+1] is generated, the data digest of BW[i] is obtained. The j-th element of basic window BW[i] is denoted BW[i; j]. The greatest common divisor of the inference period of the rule subset and the sizes of all sliding windows is taken as the length of the basic window, called the basic time unit and denoted btu:

btu = GCD(period, T1, T2, …, Tn)

Let TS = {ts1, ts2, …} be the ordered set of the sliding-window sizes of the statistical units, TSA the ordered set of the sliding-window sizes of the statistical units whose statistical operators are incrementally computable, and TSB the ordered set of the sliding-window sizes of the statistical units whose statistical operators are not incrementally computable, with TSA ∪ TSB = TS. For the statistical units that are incrementally computable, the differences between adjacent elements of TSA form the set ND = {diff_i | diff_i = TSA[i] − TSA[i−1]}. The feature value on the sliding window SW[t−ts_i:t] is computed incrementally, on the basis of the value on SW[t−ts_{i−1}:t], from the feature value on SW[t−ts_{i−1}−diff_i : t−ts_{i−1}], where diff_i = ts_i − ts_{i−1}. For a sliding window SW[t−ts_i:t], if stat_j is not incrementally computable and the sizes of all sliding windows that stat_j operates on are not larger than ts_i, the digest stat_j(SW[t−ts_i:t]) should be maintained on that window, and the statistical units that cannot be computed incrementally are computed directly. For all statistical units operating on the same sliding-window size, if one feature is a referenceable feature of another, the referenced feature is referenced during computation.
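The basic time unit and the basic-window partition described above can be sketched as follows (a minimal illustration; the function and variable names are ours, not the patent's):

```python
from math import gcd
from functools import reduce

def basic_time_unit(period, window_sizes):
    """btu = GCD of the rule subset's inference period and all sliding-window sizes."""
    return reduce(gcd, window_sizes, period)

def basic_windows(t, w, b):
    """Split the largest sliding window S[t-w:t] into k = w // b contiguous
    basic windows BW[i] = S[t-w+i*b : t-w+(i+1)*b], returned as (start, end) bounds."""
    assert w % b == 0, "window length must be a multiple of the basic time unit"
    return [(t - w + i * b, t - w + (i + 1) * b) for i in range(w // b)]

btu = basic_time_unit(10, [30, 60, 20])   # GCD(10, 30, 60, 20) = 10
bounds = basic_windows(t=100, w=60, b=btu)
print(btu, bounds)
```

Because every sliding-window size is a multiple of btu, each sliding window aligns exactly with a run of consecutive basic windows.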
the lightweight characteristic table is composed of columns of each element grouping in a sliding window with the size of TS, U, ND and U, wherein the characteristic value of original data is directly calculated according to the columns of the window grouping with the size of the window of the btu, and the abstract of a basic window is stored; obtaining intermediate abstracts for incremental calculation by summarizing the abstracts of the basic windows according to the columns of which the window size is the size of each element ND; obtaining characteristic values which can be calculated in an incremental manner by summarizing the summaries of the columns grouped by TSA [ i ] and ND [ i ] according to the window size of the TSA [ i ] grouping columns; and obtaining the characteristic value which can not be subjected to incremental calculation by adopting a mode of directly calculating the original data according to the column of which the window size is grouped for the size of each element of the TSB, wherein the lightweight characteristic table only caches the original data required by the characteristic calculation which can not be subjected to incremental calculation, and only keeps the abstract and does not keep the original data for the basic window.
In conclusion, the invention has the following beneficial effects:
the invention designs a rule analysis and pretreatment module which constructs a rule network by analyzing a rule set and extracts a statistical unit, then, the invention uses the obtained statistical unit to construct an online data structure lightweight feature table, which adopts an incremental mode to calculate the feature and store the feature value, the reasoning work is completed by leading the rule network to directly refer to the characteristic value stored in the lightweight characteristic table and carrying out network screening and network propagation, thereby avoiding most real-time feature calculation during rule matching, and finally, the LCT pre-deployment scheme designed based on the characteristic that the memory occupancy of the system has certain static predictability solves the problem of system memory limitation, even under the condition of large data volume, rule set and maximum window size, the scheme can still remarkably improve the performance of the inference system.
Drawings
FIG. 1 is a schematic diagram of an intelligent environment system and edge device configuration in the background of the invention;
FIG. 2 is a diagram of the DPDS architecture in an embodiment of the present invention;
FIG. 3 is a diagram of a rule network translation in an embodiment of the invention;
FIG. 4 is a schematic structural diagram of a sliding window and a basic window in an embodiment of the present invention;
FIG. 5 is a schematic diagram of the structure of an LCT in an embodiment of the invention;
fig. 6 is a schematic diagram of LCT deployment in an embodiment of the present invention.
Detailed Description
This specification and the claims do not intend to distinguish between components that differ in name but not in function. In the following description and in the claims, the terms "include" and "comprise" are used in an open-ended fashion and should therefore be interpreted to mean "including, but not limited to". "Substantially" means within an acceptable error range: a person skilled in the art can solve the technical problem within a certain error range and substantially achieve the technical effect.
Terms such as "upper", "lower", "left", and "right" in the description and the claims are used in combination with the drawings to facilitate explanation and understanding, and do not limit the application.
The present invention will be described in further detail with reference to the accompanying drawings.
The structure of the DPDS is shown in fig. 2. A rule set can be divided into rule subsets with different inference periods according to the inference periods of its rules. For each rule subset, the rule parsing and preprocessing module constructs a rule network and a lightweight characteristic table. Inference is completed by letting the rule network directly reference the feature values in the lightweight characteristic table and by performing the screening and propagation of the rule network; the combined conditions that match then trigger the corresponding responses.
In an intelligent environment system, intelligence relies on each edge device feeding the time-series data streams collected from its sensors into a rule inference module for rule inference. Rule inference comprises feature computation on the raw data within a time limit, comparison of feature values with thresholds, connection of the comparison results with logical operators, and so on; the comparison and logical operations are cheap compared with the feature computation. To avoid the delay caused by real-time feature computation over raw data while matching a large number of rules, a rule parsing and preprocessing module is first designed: it parses the rule set, constructs the rule network, and obtains the statistical unit corresponding to each atomic condition in the network. The statistical units are then used to design an online data structure, the Light-weight Characteristic Table (LCT), for computing and storing the feature values referenced by the atomic conditions. In addition, in the DPDS the system has a certain static predictability of the LCT's memory usage; the invention therefore provides an LCT pre-deployment scheme that solves the problem of limited system memory by assigning each lightweight characteristic table a priority and taking the tables with lower priority offline when the memory budget is insufficient.
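The priority-based pre-deployment idea above can be sketched as follows (a minimal illustration; the priority scheme, numbers, and names are our assumptions, not the patent's strategy in detail):

```python
def predeploy(lcts, budget):
    """lcts: list of (name, priority, mem_cost) tuples; higher priority = keep first.
    Returns the names of the tables kept online within the memory budget;
    the lower-priority tables are the ones taken offline."""
    online, used = [], 0
    for name, prio, cost in sorted(lcts, key=lambda x: -x[1]):
        if used + cost <= budget:
            online.append(name)
            used += cost
    return online

tables = [("LCT_A", 3, 40), ("LCT_B", 1, 50), ("LCT_C", 2, 30)]
print(predeploy(tables, budget=80))   # → ['LCT_A', 'LCT_C']
```

Because the window sizes and operators of the statistical units are known ahead of time, the memory cost of each table can be estimated statically, which is what makes this offline selection possible before runtime.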
Time series data stream
In the intelligent environment system, the input data is regarded as a sequence of triples (streamID, timestamp, value) arranged in time-increasing order, and one data stream consists of the triples with the same streamID; such a stream may, for example, represent vibration data in a sleep-monitoring application.
Definition 1 (data stream): a data stream is an infinite time series S = <e0,t0>, <e1,t1>, …, <en,tn> arranged in time-increasing order, where ei is the sequence element occurring at time ti.
New tuples are periodically appended to the data stream at a specific time interval (e.g., one second), which is called the growth period of the data stream; the reciprocal of the growth period is called the flow rate of the data stream. For example, if the growth period of data stream S is 1 second (flow rate 1/second) and the current time is t, then after 1 second the data stream appends a new element S[t+1] with timestamp t+1. If no new tuple arrives at a growth-period boundary, interpolation is used for padding; if several new tuples arrive within one growth period, they are replaced by the result of aggregating them. S[t] denotes the value of data stream S at time t, Sk denotes the stream with streamID k, and Sk[i:j] denotes the subsequence of data stream Sk from time i to time j.
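The growth-period handling described above can be sketched as follows (a minimal illustration; mean aggregation for multiple tuples and hold-last padding standing in for interpolation are our assumed policies):

```python
def regularize(samples, t0, t1, period=1):
    """samples: dict mapping a period tick to the list of values observed in it.
    Returns one value per tick in [t0, t1]: the mean if several tuples arrived
    in that growth period, a padded value if none did."""
    out, last = {}, None
    for t in range(t0, t1 + 1, period):
        vs = samples.get(t)
        if vs:                                  # aggregate multiple tuples
            out[t] = sum(vs) / len(vs)
        else:                                   # pad a missing tick
            out[t] = last if last is not None else 0.0
        last = out[t]
    return out

stream = {0: [1.0], 1: [2.0, 4.0], 3: [5.0]}    # tick 2 is missing
print(regularize(stream, 0, 3))
```

After regularization the stream has exactly one value per growth period, which is what the sliding-window and basic-window machinery later in the document assumes.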
Rule reasoning
In an intelligent environment system, a rule reflects the behavior characteristics of human problem solving and is a knowledge representation method. A rule has attributes (RID, period), where RID is the ID of the rule and period is its inference period; different rules may have different inference periods. The structure of a rule can be expressed in the form C1 ∧ C2 ∨ (C3 ∧ C4) → A1, where Ci and Ai represent atomic conditions and responses, respectively. The normal-form existence theorem states that any propositional formula has an equivalent disjunctive normal form (DNF); for example, the propositional formula (x1 ∨ y1) ∧ (x2 ∨ y2) ∧ … ∧ (xn ∨ yn) is equivalent to (x1 ∧ … ∧ xn−1 ∧ xn) ∨ (x1 ∧ … ∧ xn−1 ∧ yn) ∨ … ∨ (y1 ∧ … ∧ yn−1 ∧ yn). Based on this theorem, the rule C1 ∧ C2 ∨ (C3 ∧ C4) → A1 can be translated into the following two atomic rules:
C1 ∧ C2 → A1
C3 ∧ C4 → A1
definition 2 (atomic rules) the rule that the condition only participates in the connection with a logical and (Λ) is called an atomic rule.
Each atomic rule can be abstracted as a tree structure in which the atomic conditions are the leaf nodes and the response is the root node. By merging the atomic conditions shared among the trees formed by rules with the same inference period into combined conditions, a rule network can be constructed. Each rule network is built from a rule subset consisting of rules with the same inference period, which is also the inference period of the rule subset. The construction of the rule network for the rule C1 ∧ C2 ∨ (C3 ∧ C4) → A1 is shown in FIG. 3: FIG. 3(a) shows the rule split into two atomic rules forming two rule trees, and FIG. 3(b) constructs the rule network by further merging the atomic conditions into combined conditions. Typical examples of the predicates and operators that compose atomic and combined conditions are as follows:
and (3) time predicate: a time limit (e.g., a sliding window) or the like,
and (3) data predicate: comparing operators, such as <, > and ≧ and the like; statistical operators, such as MAX, MIN, AVG, SD (standard definition), VAR (variance), etc., logical operators: logical AND (. lamda.), logical OR (. V.), logical No-).
Definition 3 (atomic conditions) refers to the minimum condition in a rule that cannot be continued to be segmented by a logical operator as an atomic condition.
Definition 4 (combination condition) a condition consisting of one or more atomic conditions, and zero or more combination conditions connected by logical operators is referred to as a combination condition.
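The DNF translation of a rule into atomic rules described above can be sketched as follows (a minimal illustration; the formula encoding is our assumption, not the patent's representation):

```python
from itertools import product

def to_dnf(f):
    """Convert a formula tree to DNF: returns a list of conjunctions, each a
    frozenset of atomic-condition names.
    Nodes are ("and", ...), ("or", ...), or an atom given as a string."""
    if isinstance(f, str):
        return [frozenset([f])]
    op, *args = f
    branches = [to_dnf(a) for a in args]
    if op == "or":                       # union of the branch disjuncts
        return [c for b in branches for c in b]
    if op == "and":                      # distribute AND over OR
        return [frozenset().union(*combo) for combo in product(*branches)]
    raise ValueError(op)

def to_atomic_rules(antecedent, response):
    """Each DNF disjunct becomes one atomic rule (conjunction -> response)."""
    return [(sorted(conj), response) for conj in to_dnf(antecedent)]

rule = ("or", ("and", "C1", "C2"), ("and", "C3", "C4"))
print(to_atomic_rules(rule, "A1"))   # → [(['C1', 'C2'], 'A1'), (['C3', 'C4'], 'A1')]
```

This reproduces the split of C1 ∧ C2 ∨ (C3 ∧ C4) → A1 into the two atomic rules shown earlier; shared conjuncts across atomic rules are exactly the conditions a rule network would merge.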
In a conventional inference system, the inference process takes a data stream as the input of the rule network, screens and propagates the raw data through the network, and finally the matched combined conditions trigger the corresponding responses. Given a time length T and a current time t, stat(Sk, T) denotes the feature computation with operator stat on the subsequence Sk[t−T:t] of data stream Sk, where the subsequence Sk[t−T:t] is the time series defined on Sk by a sliding window with time interval T. In an intelligent environment system, users care more about the state of the environment over the most recent period of time, which means that recently arrived data is more important than older data; the sliding process of the sliding window therefore removes the data that entered the window earliest and receives the most recently arrived data.
Definition 5 (sliding window): let T be a time length and t > T a changing time; then SW[t−T:t] is a sliding window of S with time interval T, where t and T are in the same units and t is the time distance from the starting observation time of S.
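The sliding-window semantics above (evict the oldest data, receive the newest) can be sketched as follows (a minimal illustration; names are ours):

```python
from collections import deque

class SlidingWindow:
    """SW[t-T:t]: keeps only the elements whose timestamp lies in (t-T, t]."""
    def __init__(self, T):
        self.T = T
        self.buf = deque()              # (timestamp, value) pairs, time-increasing

    def slide(self, t, value):
        self.buf.append((t, value))     # receive the most recently arrived data
        while self.buf and self.buf[0][0] <= t - self.T:
            self.buf.popleft()          # remove data that entered the window earliest

    def values(self):
        return [v for _, v in self.buf]

sw = SlidingWindow(T=3)
for t, v in enumerate([10, 20, 30, 40, 50]):
    sw.slide(t, v)
print(sw.values())   # → [30, 40, 50]
```

Note that a real LCT would not keep these raw values for incrementally computable operators; this sketch only illustrates the window semantics of Definition 5.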
Rule parsing and preprocessing
Let RS[i] denote the rule subset formed by all rules whose inference period is period_i. Using the rule parsing and preprocessing module, RS[i] can be converted into a rule network whose network screening and propagation period, i.e. the inference period of the rule network, is also period_i. In the rule network, each atomic condition is assigned an index number, abbreviated CID. An atomic condition generally involves two computations: a feature computation on the raw data within a time limit, and a comparison of the feature value with a threshold; the former is far more expensive than the latter. With the rule parsing and preprocessing module, the atomic conditions can be further parsed and their features extracted to compute the statistical unit sequence SU[1], SU[2], …, SU[m] of the rule subset. A statistical unit is defined as:
<CID,stat,wSize,streamID,period,value>
where CID is the index number of the atomic condition corresponding to the statistical unit; stat is the statistical operator, carrying the attributes (isInc, ref), in which isInc indicates whether incremental computation is possible and ref is the referenceable feature; wSize is the sliding-window size of the feature-computation operation; streamID is the ID of the data stream the feature computation operates on; period is the computation cycle of the feature computation; and value is the feature value, initialized to 0. Incremental computation means: given a set of historical sample values h1, h2, …, hM and a set of incremental sample values a1, a2, …, aN, the statistic of the full sample h1, h2, …, hM, a1, a2, …, aN can be computed directly from the indexes of the historical and incremental samples. For example, the mean of the full sample can be computed as

mean = (M·mean_h + N·mean_a) / (M + N)

and the variance of the full sample can be computed as

var = (M·(var_h + mean_h²) + N·(var_a + mean_a²)) / (M + N) − mean²

where mean_h and var_h denote the mean and variance of the historical samples, and mean_a and var_a denote the mean and variance of the incremental samples. The referenceable feature of a feature denotes the other features of the sample that may be referenced when computing it; for example, the sample kurtosis kurt = m4 / var² can reference the sample variance, and the sample variance can reference the sample mean, where m4 denotes the fourth-order central moment of the sample. The reference relation of features is transitive: when the referenceable feature of feature a is b and the referenceable feature of b is c, c is also a referenceable feature of a. Typical statistical features in intelligent-environment applications are summarized in Table 1.
TABLE 1
[Table 1, summarizing the typical statistical features, is rendered as an image in the original publication.]
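The full-sample mean and variance composition above can be sketched in a few lines of Python (the helper name `merge_mean_var` is illustrative, not part of the original disclosure):

```python
def merge_mean_var(m, mean_h, var_h, n, mean_a, var_a):
    """Combine the (population) mean and variance of M historical samples
    with those of N incremental samples, using only the per-set statistics
    -- no raw data -- as in the incremental computation defined above."""
    total = m + n
    mean = (m * mean_h + n * mean_a) / total
    var = (m * (var_h + (mean_h - mean) ** 2)
           + n * (var_a + (mean_a - mean) ** 2)) / total
    return mean, var

# Historical sample [1, 2, 3, 4] and incremental sample [5, 6, 7]:
mean, var = merge_mean_var(4, 2.5, 1.25, 3, 6.0, 2.0 / 3.0)
# mean == 4.0 and var == 4.0, matching the full sample [1, 2, 3, 4, 5, 6, 7]
```

This is exactly the benefit of incrementally computable operators: the full-window statistic is obtained without revisiting the historical raw data.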
Assume that the features corresponding to the statistical unit sequence obtained when the rule parsing and preprocessing module parses a rule subset are stat_1(S_1, T_1), stat_1(S_1, T_2), stat_2(S_1, T_1), stat_2(S_1, T_3), where the sliding window sizes satisfy T_1 ≠ T_2 ≠ T_3 and T_1 < T_2; stat_1 is incrementally computable and has no referenceable feature, while stat_2 is not incrementally computable and its referenceable feature is stat_1. Because stat_1 is incrementally computable, its value on the full sample S_1[t−T_2:t] can be obtained from the sample S_1[t−T_1:t] by incrementally computing over the sample S_1[t−T_2:t−T_1], avoiding repeated computation over S_1[t−T_1:t]. For stat_2(S_1, T_1): since stat_1 is the referenceable feature of stat_2, and the sliding windows of the operations stat_2(S_1, T_1) and stat_1(S_1, T_1) are mutually aligned, the computation of stat_2(S_1, T_1) can directly reference the feature value of stat_1(S_1, T_1), avoiding recomputation of the referenceable feature stat_1(S_1, T_1). For stat_2(S_1, T_3): although its referenceable feature is stat_1, the sliding window SW[t−T_3:t] on which it operates is not mutually aligned with the sliding windows SW[t−T_1:t] and SW[t−T_2:t] of the stat_1 operations, so it cannot directly reference their feature values. If the referenceable feature of a feature stat_1 is stat_2, stat_1 is called the referencing feature and stat_2 the referenced feature. When the statistical unit sequence contains a referencing feature whose referenced feature is also an incrementally computable feature, and the sliding windows on which they operate are non-aligned, the statistical unit sequence must be preprocessed in order to fully exploit the advantages of referenceable features and incremental computation.
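The alignment condition above — a feature value may be reused only when the reference relationship holds and the two operations' sliding windows coincide — can be sketched as follows (all names are illustrative; "alignment" is simplified to equal window sizes on the same stream, both windows ending at the current time t):

```python
def can_reference(unit, other, refs):
    """Return True when `unit` may directly reuse `other`'s feature value:
    `other`'s operator must be the referenceable feature of `unit`'s operator
    (per the `refs` map), and both must operate on the same data stream with
    mutually aligned sliding windows."""
    stat_u, stream_u, w_u = unit
    stat_o, stream_o, w_o = other
    return refs.get(stat_u) == stat_o and stream_u == stream_o and w_u == w_o

refs = {"stat2": "stat1"}   # stat1 is the referenceable feature of stat2
u1 = ("stat1", "S1", 4)     # stat1(S1, T1)
u2 = ("stat2", "S1", 4)     # stat2(S1, T1) -- window aligned with u1
u3 = ("stat2", "S1", 9)     # stat2(S1, T3) -- non-aligned window
```

Here `can_reference(u2, u1, refs)` holds, while `can_reference(u3, u1, refs)` does not, mirroring the two cases discussed above.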
Lightweight Characteristic Table (Light-Weight Characteristic Table, LCT)
In DPDS, the rule network must be able to screen the feature values of the input data streams over sliding windows, so the system needs to maintain the state of these feature values at certain historical moments. In the present invention, this function is provided by the lightweight characteristic table (LCT), which computes, and stores in memory, the feature values of the statistical units corresponding to the atomic conditions. One lightweight characteristic table corresponds to one rule network, and it is defined as a set of columns grouped by the same sliding window size. For convenience of description, only the case where the numbers of rule subsets and data streams are both 1 is discussed below.
Let stat_1(S_1, T_1), stat_2(S_1, T_2), …, stat_n(S_1, T_n) be the feature computations corresponding to the preprocessed statistical unit sequence extracted from a rule subset, and let the sliding windows on which they operate over the data stream S be SW[t−T_1:t], SW[t−T_2:t], …, SW[t−T_m:t], where m < n. For the incrementally computable feature computations, digests must be maintained over the sliding windows. First, the present invention divides the largest sliding window into a number of consecutive, smaller basic windows of equal length; each sliding window can then be composed of several consecutive basic windows. Fig. 4 shows the relationship between sliding windows and basic windows.
Let the data stream subsequence within the maximum sliding window be S[t−w:t], and assume w = kb, where k denotes the number of basic windows in the maximum sliding window and b denotes the length of a basic window. Let BW[0], …, BW[k−1] denote the basic window sequence within the maximum sliding window SW, where BW[i] = S[t−w+ib : t−w+(i+1)b]. As the sliding window slides forward, when a new basic window BW[k] is generated, the basic window BW[0] expires. The choice of the basic window length is very important, because the data digest in BW[i] must be acquired before the basic window BW[i+1] is generated. The jth element of basic window BW[i] is denoted BW[i; j]. The invention takes the greatest common divisor of the inference period of the rule subset and all sliding window sizes as the basic window length, called the basic time unit and denoted btu:

btu = GCD(period, T_1, T_2, …, T_n)
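A sketch of the btu computation, assuming the inference period and window sizes are integers expressed in a common unit such as minutes (the function name is illustrative):

```python
from functools import reduce
from math import gcd

def basic_time_unit(period, window_sizes):
    """btu = GCD of the rule subset's inference period and all sliding
    window sizes, all expressed in the same time unit."""
    return reduce(gcd, window_sizes, period)

# The elderly sleep monitoring rule subset used later in the text:
# inference period 5 MIN, windows of 4, 7, 9 and 11 MIN -> btu = 1 MIN.
print(basic_time_unit(5, [4, 7, 9, 11]))   # -> 1
```

For a subset whose period and windows share a larger common divisor, the basic windows become correspondingly coarser, e.g. `basic_time_unit(10, [20, 30])` yields 10.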
To obtain the incrementally computable feature values in a sliding window, a digest must first be maintained for each basic window. Taking the computation of the sliding window average as an example, the digest to be maintained for basic window BW[i] is its element sum

sum(BW[i]) = Σ_{j=0}^{b−1} BW[i; j]

By aggregating the digests maintained in the basic windows, the digest of the sliding window SW[t−T_j:t] can be obtained as

sum(SW[t−T_j:t]) = Σ_{i=k−T_j/b}^{k−1} sum(BW[i]),  avg(SW[t−T_j:t]) = sum(SW[t−T_j:t]) / |SW[t−T_j:t]|

When the new basic window BW[k] is generated, the digest of the sliding window is updated to

sum(SW[t−T_j:t]) ← sum(SW[t−T_j:t]) + sum(BW[k]) − sum(BW[k−T_j/b])
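The digest maintenance above — per-basic-window sums aggregated into a sliding-window sum, with the oldest basic window retired as each new one is generated — can be sketched as follows (stream values and window sizes are illustrative):

```python
from collections import deque

b = 3   # basic window length, in samples
k = 4   # number of basic windows in the sliding window
stream = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]

# Digest of each basic window BW[i]: its element sum (sufficient for AVG).
digests = deque(maxlen=k)
window_sum = 0
for start in range(0, len(stream) - len(stream) % b, b):
    bw_sum = sum(stream[start:start + b])   # digest of the newly closed BW
    if len(digests) == digests.maxlen:
        window_sum -= digests[0]            # oldest basic window expires
    digests.append(bw_sum)
    window_sum += bw_sum                    # incremental sliding-window update

# window_sum now covers the latest k*b = 12 elements of the stream.
```

After the loop, `window_sum` equals the sum of the last 12 stream elements, obtained without ever re-summing the whole window.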
Let TS = {ts_1, ts_2, …} be the set consisting of the sliding window sizes of the statistical units; let TSA = (tsa_1, tsa_2, …, tsa_α) be the ordered set consisting of the sliding window sizes of the statistical units whose statistical operators are incrementally computable; and let TSB = (tsb_1, tsb_2, …, tsb_β) be the ordered set consisting of the sliding window sizes of the statistical units whose statistical operators are not incrementally computable, with TSA ∪ TSB = TS. Let SS be the set formed by the incrementally computable statistical operators of the statistical units over the data stream S. For the incrementally computable statistical units, let the differences between adjacent elements of TSA form the set

ND = {diff_1, diff_2, …, diff_α}, where diff_i = tsa_i − tsa_{i−1} and tsa_0 = 0

Then the feature value on the sliding window SW[t−ts_i:t] can be obtained on the basis of SW[t−ts_{i−1}:t] by incrementally computing the feature value of SW[t−ts_{i−1}−diff_i : t−ts_{i−1}], where ts_{i−1} and ts_i are adjacent elements of TSA. For a sliding window SW[t−ts_i:t], if stat_j ∈ SS and the sizes of all sliding windows on which stat_j operates are not larger than ts_i, then the digest stat_j(SW[t−ts_i:t]) should be maintained on that window. Statistical units that cannot be computed incrementally are computed directly; and for all statistical units operating on the same sliding window size, if one feature is a referenceable feature of another, the referenced feature is referenced during the computation.
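The set ND of adjacent-element differences of TSA, which drives the chained incremental computation above, can be sketched as follows (the helper name is illustrative):

```python
def nd_differences(tsa):
    """ND: differences between adjacent elements of the ordered set TSA,
    with an implicit leading 0 (so ND[0] == TSA[0]).  The feature value on
    SW[t - tsa[i] : t] is then obtained from SW[t - tsa[i-1] : t] plus the
    increment over a span of length ND[i]."""
    return [t - prev for prev, t in zip([0] + tsa[:-1], tsa)]

# For the rule subset analysed later in the text: TSA = {4, 7, 9, 11} MIN.
print(nd_differences([4, 7, 9, 11]))   # -> [4, 3, 2, 2]
```

The output matches the set ND = {4MIN, 3MIN, 2MIN, 2MIN} derived for the elderly sleep monitoring example below.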
The lightweight characteristic table is composed of columns grouped by each element of TS ∪ ND ∪ {btu} as the sliding window size. For the columns grouped by window size btu, the feature values are computed directly from the raw data and the digests of the basic windows are stored. For the columns grouped by the size of each ND element, intermediate digests for incremental computation are obtained by aggregating the digests of the basic windows. For the columns grouped by window size TSA[i], the incrementally computable feature values are obtained by aggregating the digests of the columns grouped by TSA[i−1] and ND[i]. For the columns grouped by the size of each TSB element, the non-incrementally computable feature values are obtained by computing directly on the raw data. The lightweight characteristic table caches only the raw data required by the non-incrementally computable feature computations; for the basic windows, only the digests are kept, not the raw data.
The target column stores the feature values referenced by the atomic conditions. To illustrate the lightweight characteristic table, FIG. 5 shows one constructed from a simple rule subset, with an inference period of 5 minutes, in an elderly sleep monitoring application in which geophones are mounted in the bed to monitor body movements and postures during sleep. The simple rule subset is as follows:

R1: IF AVG(S1, 4MIN) <= 0.632 AND VAR(S1, 11MIN) <= 0.221 THEN A1 = 1

R2: IF MAX(S1, 7MIN) > 1.112 AND AVG(S1, 9MIN) > 0.436 AND KURTOSIS(S1, 9MIN) > 0 THEN A2 = 1
This rule subset corresponds to TSA = {4MIN, 7MIN, 9MIN, 11MIN}, ND = {4MIN, 3MIN, 2MIN, 2MIN}, TSB = {11MIN}, SS = {MAX, AVG, VAR} and btu = 1MIN.

In the lightweight characteristic table shown in FIG. 5, the basic window digests of all historical moments from the current moment back to the maximum sliding window size are stored; the digests of the columns grouped by the ND element sliding window sizes are obtained by aggregating the basic window digests, and the feature value of each statistical unit is then obtained step by step through incremental computation.
LCT pre-deployment strategy
Since the lightweight characteristic table stores feature values, it should be kept in memory as far as possible to reduce system I/O. In a smart environment system, however, the available resources of the edge devices are limited. For a smart environment system with a large-scale rule set, the number of lightweight characteristic tables and their memory occupancy grow greatly as the rule scale multiplies, which in turn increases the number of data exchanges between memory and disk; the timing and strategy of these data exchanges must therefore be arranged properly to suit the limited memory of the smart environment system. In a smart environment system, the memory occupancy of the system consists mainly of the rule networks, the lightweight characteristic tables and the cached raw data. Based on the fact that this memory occupancy has a certain static predictability — i.e. once the rule set is determined, the number and memory occupancy of the lightweight characteristic tables can be known — an LCT pre-deployment strategy based on rule set pre-analysis and preprocessing is designed.
Assume that the set of sliding window sizes in the statistical units of a rule subset is TS, that TSA and TSB respectively denote the sets of sliding window sizes of statistical units whose operators are incrementally computable and not incrementally computable, with TSA ∪ TSB = TS, and that their numbers of elements are α and β; let the number of elements of the set SS be γ, and let the basic time unit be btu. In the lightweight characteristic table constructed from this rule subset, the number of entries of the column grouped by the basic window size equals the number of basic windows in the maximum sliding window, i.e. max(TS)/btu; the number of entries of the columns grouped by the ND element sliding window sizes is bounded by a quantity determined by TSA and ND; and the number of entries of each target column is 1, so the total number of LCT entries is bounded accordingly. Since the memory occupied by each entry in the LCT does not exceed γ × A, the memory occupied by the lightweight characteristic table does not exceed the product of the maximum entry count and γ × A. (The explicit bounding formulas are rendered as images in the original publication.)
For a data stream with flow rate λ, the amount of raw data to be cached is λ × TSB[β], where TSB[β] is the maximum sliding window operated on by an operator that cannot be computed incrementally. Let N denote the number of rules in the rule set and n the number of atomic conditions. Since the rule network is a network structure organized as a tree, its total memory occupancy is (2N − 1 + n) × A, where A denotes the memory occupancy constant of each element (digest, atomic condition, response). Considering the limited system memory, a lightweight characteristic table with a high update frequency should be kept in memory as far as possible; it is therefore given a high priority, while a lightweight characteristic table with a low update frequency is given a low priority. Assuming the available memory resource budget of the system is M, when the memory occupancy of the rule networks, lightweight characteristic tables and raw data of all rule subsets exceeds M, lightweight characteristic tables and raw data are taken offline in priority order, and a cache block is opened in memory, until the new total memory occupancy is less than M.
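The priority-based offlining step can be sketched as follows (names are illustrative; only the table-offlining loop is shown — offlining of raw data and opening of the cache block are omitted):

```python
def offline_low_priority(tables, rule_network_mem, budget):
    """Take lightweight characteristic tables offline, lowest update
    frequency (= lowest priority) first, until the total memory occupancy
    fits within the budget M.  `tables` maps a table id to a tuple
    (update_frequency, memory_usage).  Returns the ids moved offline."""
    usage = rule_network_mem + sum(mem for _, mem in tables.values())
    offline = set()
    for tid, (freq, mem) in sorted(tables.items(), key=lambda kv: kv[1][0]):
        if usage <= budget:
            break                 # remaining tables stay in memory
        usage -= mem              # this table's entries move to disk
        offline.add(tid)
    return offline

tables = {"lct_a": (10, 40), "lct_b": (2, 30), "lct_c": (5, 30)}
print(offline_low_priority(tables, 20, 70))   # -> {'lct_b', 'lct_c'}
```

With a 20-unit rule network and a budget of 70, the two least frequently updated tables are offlined, leaving 60 units resident.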
For an offline lightweight characteristic table, information such as TSA, TSB and SS is kept in memory; when each basic window is created, the raw data and digest it contains are written to disk, each basic window having its own index information:

<streamID, BWstart, BWend>

Before the inference moment of the rule network corresponding to an offline lightweight characteristic table arrives, all basic window digests between t − TSA[α] and t and all raw data between t − TSB[β] and t are prefetched, and the offline lightweight characteristic table is constructed in advance in the cache block.
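The prefetch step can be sketched as follows (helper names and the interval-overlap test are illustrative; the index entries follow the <streamID, BWstart, BWend> tuple above):

```python
def prefetch_ranges(t, tsa_max, tsa_raw_max):
    """Before the inference moment t of an offline LCT's rule network,
    prefetch all basic-window digests in [t - TSA[alpha], t] and all raw
    data in [t - TSB[beta], t]."""
    return (t - tsa_max, t), (t - tsa_raw_max, t)

def select_windows(index, stream_id, lo, hi):
    """Pick the on-disk basic windows of `stream_id` whose [BWstart, BWend)
    interval overlaps the prefetch range [lo, hi)."""
    return [e for e in index
            if e[0] == stream_id and e[2] > lo and e[1] < hi]

index = [("S1", 0, 5), ("S1", 5, 10), ("S1", 10, 15), ("S2", 0, 5)]
digest_rng, raw_rng = prefetch_ranges(t=15, tsa_max=11, tsa_raw_max=11)
```

Here `digest_rng` is (4, 15), so all three S1 basic windows overlap it and would be loaded into the cache block before inference.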
Again taking the foregoing rule subset of the elderly sleep monitoring application as an example, FIG. 6 illustrates the construction of an offline lightweight characteristic table.
The present embodiment serves only to illustrate the present invention and does not limit it; after reading this specification, those skilled in the art may modify the embodiment as needed without inventive contribution, and all such modifications are protected by patent law within the scope of the claims of the present invention.

Claims (5)

1. A data pre-deployment method based on edge device rule inference in a smart environment, characterized by comprising the following steps:

S1: extracting the feature computations in the rules as statistical units and establishing a rule network;

S2: parsing and preprocessing the rules;

S3: establishing a lightweight characteristic table according to the statistical units, and computing and storing the feature values in advance;

S4: establishing a direct reference relationship between the rule network and the characteristic table, monitoring the memory usage of the lightweight characteristic tables in real time, and designing the LCT pre-deployment strategy;

S1 specifically comprises the following steps:

S101: acquiring a time-series data stream;

S102: performing rule inference on the acquired data stream.
2. The data pre-deployment method based on edge device rule inference in a smart environment according to claim 1, characterized in that S101 specifically comprises the following steps: in the smart environment system, the input data are regarded as a sequence of triples (streamID, timestamp, value) arranged in order of increasing time, and one data stream consists of the triples with the same streamID; let the infinite time sequence arranged in order of increasing time, S = <e_0, t_0>, <e_1, t_1>, …, <e_n, t_n>, where e_i is the sequence element appearing at time t_i, be a data stream; the data stream periodically appends new tuples at specific time intervals, and the growth period and flow rate of the data stream are acquired.
3. The data pre-deployment method based on edge device rule inference in a smart environment according to claim 1, characterized in that S102 specifically comprises the following steps: let a rule whose condition involves only logical AND (∧) connections be an atomic rule;

wherein: the minimal condition that cannot be further split by logical operators in a rule is an atomic condition; a condition formed by connecting one or more atomic conditions and zero or more combined conditions through logical operators is a combined condition; and the atomic conditions are combined into combined conditions to construct the rule network.
4. The data pre-deployment method based on edge device rule inference in a smart environment according to claim 1, characterized in that S2 specifically comprises the following steps: let RS[i] denote the rule subset formed by all rules whose inference period is period_i; the rule subset is converted into a rule network through rule parsing and preprocessing; in the rule network, each atomic condition is assigned an index number, the atomic conditions are parsed, the features in the atomic conditions are extracted, and the statistical unit sequence SU[1], SU[2], …, SU[m] of the rule subset is computed.
5. The data pre-deployment method based on edge device rule inference in a smart environment according to claim 1, characterized in that S3 specifically comprises the following steps: let stat_1(S_1, T_1), stat_2(S_1, T_2), …, stat_n(S_1, T_n) be the feature computations corresponding to the preprocessed statistical unit sequence extracted from a rule subset, with sliding windows SW[t−T_1:t], SW[t−T_2:t], …, SW[t−T_m:t] over the data stream S, where m < n; first, the largest sliding window is divided into a number of consecutive, smaller basic windows of equal length, each sliding window being composable from several consecutive basic windows; let the data stream subsequence within the maximum sliding window be S[t−w:t], and assume w = kb, where k denotes the number of basic windows in the maximum sliding window and b denotes the length of a basic window; let BW[0], …, BW[k−1] denote the basic window sequence within the maximum sliding window SW, where BW[i] = S[t−w+ib : t−w+(i+1)b]; as the sliding window slides forward, when a new basic window BW[k] is generated, the basic window BW[0] expires, and the data digest of BW[i] is obtained before BW[i+1] is generated; the jth element of basic window BW[i] is denoted BW[i; j]; the greatest common divisor of the inference period of the rule subset and all sliding window sizes is taken as the basic window length, called the basic time unit and denoted btu,

btu = GCD(period, T_1, T_2, …, T_n)
let TS = {ts_1, ts_2, …} be the set consisting of the sliding window sizes of the statistical units; let TSA = (tsa_1, …, tsa_α) be the ordered set consisting of the sliding window sizes of the statistical units whose statistical operators are incrementally computable; let TSB = (tsb_1, …, tsb_β) be the ordered set consisting of the sliding window sizes of the statistical units whose statistical operators are not incrementally computable, with TSA ∪ TSB = TS; let SS be the set formed by the incrementally computable statistical operators of the statistical units over the data stream S; for the incrementally computable statistical units, let the differences between adjacent elements of TSA form the set ND = {diff_1, …, diff_α}, where diff_i = tsa_i − tsa_{i−1} and tsa_0 = 0; the feature value on the sliding window SW[t−ts_i:t] is then obtained on the basis of SW[t−ts_{i−1}:t] by incrementally computing the feature value of SW[t−ts_{i−1}−diff_i : t−ts_{i−1}], where ts_{i−1} and ts_i are adjacent elements of TSA; for a sliding window SW[t−ts_i:t], if stat_j ∈ SS and the sizes of all sliding windows on which stat_j operates are not larger than ts_i, the digest stat_j(SW[t−ts_i:t]) should be maintained on that window; statistical units that cannot be computed incrementally are computed directly, and for all statistical units operating on the same sliding window size, if one feature is a referenceable feature of another, the referenced feature is referenced during the computation;
the lightweight characteristic table is composed of columns grouped by each element of TS ∪ ND ∪ {btu} as the sliding window size: for the columns grouped by window size btu, the feature values are computed directly from the raw data and the digests of the basic windows are stored; for the columns grouped by the size of each ND element, intermediate digests for incremental computation are obtained by aggregating the digests of the basic windows; for the columns grouped by window size TSA[i], the incrementally computable feature values are obtained by aggregating the digests of the columns grouped by TSA[i−1] and ND[i]; for the columns grouped by the size of each TSB element, the non-incrementally computable feature values are obtained by computing directly on the raw data; the lightweight characteristic table caches only the raw data required by the non-incrementally computable feature computations, and for the basic windows only the digests are kept, not the raw data.
CN202011545218.7A 2020-12-24 2020-12-24 Data pre-deployment method based on edge equipment rule inference in intelligent environment Pending CN112651506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011545218.7A CN112651506A (en) 2020-12-24 2020-12-24 Data pre-deployment method based on edge equipment rule inference in intelligent environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011545218.7A CN112651506A (en) 2020-12-24 2020-12-24 Data pre-deployment method based on edge equipment rule inference in intelligent environment

Publications (1)

Publication Number Publication Date
CN112651506A true CN112651506A (en) 2021-04-13

Family

ID=75359765

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011545218.7A Pending CN112651506A (en) 2020-12-24 2020-12-24 Data pre-deployment method based on edge equipment rule inference in intelligent environment

Country Status (1)

Country Link
CN (1) CN112651506A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113627613A (en) * 2021-08-17 2021-11-09 北京计算机技术及应用研究所 Rule reasoning method for realizing edge-side cooperation
CN117992804A (en) * 2024-04-07 2024-05-07 东海实验室 Time sequence data stream pattern recognition method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100138698A1 (en) * 2007-05-10 2010-06-03 Onotprise Gmbh Reasoning architecture
CN106777029A (en) * 2016-12-08 2017-05-31 中国科学技术大学 A kind of distributed rule automotive engine system and its construction method
CN108199900A (en) * 2018-01-23 2018-06-22 重庆大学 For the distributing inference node optimization distribution method of intelligent environment
CN110298601A (en) * 2019-07-05 2019-10-01 上海观安信息技术股份有限公司 A kind of real time business air control system of rule-based engine
CN111814981A (en) * 2020-06-23 2020-10-23 中国科学院软件研究所 Distributed real-time rule inference scheduling method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100138698A1 (en) * 2007-05-10 2010-06-03 Onotprise Gmbh Reasoning architecture
CN106777029A (en) * 2016-12-08 2017-05-31 中国科学技术大学 A kind of distributed rule automotive engine system and its construction method
CN108199900A (en) * 2018-01-23 2018-06-22 重庆大学 For the distributing inference node optimization distribution method of intelligent environment
CN110298601A (en) * 2019-07-05 2019-10-01 上海观安信息技术股份有限公司 A kind of real time business air control system of rule-based engine
CN111814981A (en) * 2020-06-23 2020-10-23 中国科学院软件研究所 Distributed real-time rule inference scheduling method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHENG-LIANG WANG 等: "Study on Optimal Allocation of Inference Nodes for Distributed Inference in Smart Environment", 《2019 IEEE SMARTWORLD,UBIQUITOUS INTELLIGENCE & COMPUTING,ADVANCED & TRUSTED COMPUTING, SCALABLE COMPUTING & COMMUNICATIONS, CLOUD & BIG DATA COMPUTING, INTERNET OF PEOPLE AND SMART CITY INNOVATION》, 9 April 2020 (2020-04-09), pages 508 - 513 *
ERWIN ADI 等: "Machine learning and data analytics for the IoT", 《NEURAL COMPUTING AND APPLICATIONS》, 11 May 2020 (2020-05-11), pages 16205 - 16233, XP037262596, DOI: 10.1007/s00521-020-04874-y *
WANG Chengliang et al.: "Research on Data Pre-deployment Based on Edge Device Rule Inference in Smart Environments", Acta Electronica Sinica (《电子学报》), vol. 50, no. 10, 31 October 2022 (2022-10-31), pages 2347 - 2360 *
WANG Chengliang; HUANG Xintian: "Research on Optimal Allocation of Inference Nodes Based on Fog Computing in Smart Environments", Acta Electronica Sinica (《电子学报》), vol. 48, no. 01, 15 January 2020 (2020-01-15), pages 35 - 43 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113627613A (en) * 2021-08-17 2021-11-09 北京计算机技术及应用研究所 Rule reasoning method for realizing edge-side cooperation
CN113627613B (en) * 2021-08-17 2024-02-06 北京计算机技术及应用研究所 Rule reasoning method for realizing edge-end coordination
CN117992804A (en) * 2024-04-07 2024-05-07 东海实验室 Time sequence data stream pattern recognition method and device

Similar Documents

Publication Publication Date Title
Tian et al. The discrete-time GI/Geo/1 queue with multiple vacations
Chen et al. Large-scale behavioral targeting
CN111832825A (en) Wind power prediction method and system integrating long-term and short-term memory network and extreme learning machine
Narvekar et al. Predicting user's web navigation behavior using hybrid approach
Chen et al. A survey of approximate quantile computation on large-scale data
CN112651506A (en) Data pre-deployment method based on edge equipment rule inference in intelligent environment
CN110322693A (en) A kind of traffic data complementing method, system, equipment and medium
Du et al. An EMD-and GRU-based hybrid network traffic prediction model with data reconstruction
Wang et al. TATCN: time series prediction model based on time attention mechanism and TCN
Akdere et al. Database-support for continuous prediction queries over streaming data
Gaber et al. A holistic approach for resource-aware adaptive data stream mining
Lan et al. An N-policy discrete-time Geo/G/1 queue with modified multiple server vacations and Bernoulli feedback
CN111771195A (en) Stream processing apparatus and data stream processing method
Devagiri et al. Split-merge evolutionary clustering for multi-view streaming data
Nagendra et al. Layered processing of skyline-window-join (SWJ) queries using iteration-fabric
Harth et al. Convey intelligence to edge aggregation analytics
Park et al. Adaptive optimization for multiple continuous queries
Patil et al. LATEST: learning-assisted selectivity estimation over spatio-textual streams
Huang et al. Estimating missing data for sparsely sensed time series with exogenous variables using bidirectional-feedback echo state networks
Dell’Aquila et al. Accuracy estimation in approximate query processing
Angelini et al. Time-Varying Poisson Autoregression
CN116170351B (en) Network flow prediction method based on space-time diagram attention mechanism
Xiao et al. Improving Neural Network Time Series Prediction with a GA-BFGS Grown Dynamic Architecture
Shanshan et al. A deadline-sensitive approach for real-time processing of sliding windows
Durbeck et al. Kalman filter driven estimation of community structure in time varying graphs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination