CN101551789A - Interval propagation reasoning method of Ising graphical model - Google Patents


Info

Publication number
CN101551789A
CN101551789A CNA2009100688415A CN200910068841A
Authority
CN
China
Prior art keywords
theta
alpha
prime
variable
ising
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2009100688415A
Other languages
Chinese (zh)
Inventor
廖士中
殷霞
陈亚瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CNA2009100688415A priority Critical patent/CN101551789A/en
Publication of CN101551789A publication Critical patent/CN101551789A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/29Graphical models, e.g. Bayesian networks

Abstract

The present invention discloses an interval propagation reasoning method based on the Ising graphical model. The method mainly comprises the following steps: first, an Ising mean-field pruned computation tree is built from the Ising graphical model; then a mean-field interval propagation algorithm is run on the pruned computation tree to compute bounds on the expectations of the root-node variables. The reasoning method of the invention has low computational complexity, and the bounds on the variable expectations provide an effective measure of the reasoning precision.

Description

Interval propagation reasoning method for the Ising graphical model
Technical field
The present invention relates to approximate probabilistic inference methods on graphical models, and in particular to approximate variational inference methods on the Ising graphical model.
Background technology
1. The Ising graphical model
The Ising graphical model originates from statistical physics. It is a Markov random field over a binary random vector and provides an important modelling tool for fields such as image analysis and natural language processing. It is a probability model built on a graph structure G = (V, E), where the node set V corresponds to a Bernoulli random vector x = {x_1, ..., x_n} ∈ {0, 1}^n and the edge set E corresponds to the conditional independence relations between the variables. The exponential-family probability density p(x; θ) of the Ising graphical model is
p(x; θ) = exp{ψ(x; θ) − A(θ)},  A(θ) = log Σ_x exp{ψ(x; θ)},  ψ(x; θ) = Σ_{i∈V} θ_i x_i + Σ_{(i,j)∈E} θ_ij x_i x_j,
where θ_i and θ_ij are the model parameters, θ_ij = 0 for all (i, j) ∉ E, and A(θ) is the log partition function of the model.
On the Ising graphical model, the key problems of probabilistic inference are computing the log partition function A(θ) and the marginal distributions p(x_i). For a general Ising graphical model, the computational complexity of exact inference for A(θ) grows exponentially with the model size, so various approximate inference methods have been developed, such as sampling methods and variational methods.
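To make the notation above concrete, the following is a minimal Python sketch (an illustration, not part of the patent) of ψ(x; θ) and of a brute-force evaluation of the log partition function A(θ); exhaustive enumeration over the 2^n configurations is of course only feasible for very small n.

import itertools
import math

def psi(x, theta_node, theta_edge):
    """psi(x; theta) = sum_i theta_i x_i + sum_{(i,j) in E} theta_ij x_i x_j for binary x."""
    s = sum(theta_node[i] * x[i] for i in range(len(x)))
    s += sum(t * x[i] * x[j] for (i, j), t in theta_edge.items())
    return s

def log_partition(theta_node, theta_edge):
    """A(theta) = log sum_x exp(psi(x; theta)), by enumerating all 2^n binary states."""
    n = len(theta_node)
    return math.log(sum(math.exp(psi(x, theta_node, theta_edge))
                        for x in itertools.product((0, 1), repeat=n)))

# Example: a 3-node chain 0 - 1 - 2; theta_ij = 0 for any (i, j) not in E.
theta_node = [0.1, -0.2, 0.25]
theta_edge = {(0, 1): 0.5, (1, 2): -0.3}
print(log_partition(theta_node, theta_edge))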
2. The Ising mean field
The Ising mean field is a basic variational inference method on the Ising graphical model. Its basic idea is to convert the probabilistic inference problem into a functional extremum problem by a variational transformation, and to obtain a lower bound on the partition function and approximate variable expectations by solving for the functional extremum. The method has a simple variational form and good approximation quality, and is an important tool for handling large-scale complex data.
Variational inference converts the probabilistic inference problem into a functional extremum problem by minimizing the KL divergence between a free distribution q(x) and the original distribution p(x), and computes the log partition function A(θ) and the variable expectations by solving for the functional extremum. The KL divergence between q(x) and p(x) is
KL(q(x) || p(x)) = Σ_x q(x) log [q(x) / p(x)].
Minimizing the KL divergence yields the variational transformation
A(θ) = max_{q(x)} { Σ_x q(x) ψ(x; θ) + H(q(x)) },    (1)
where the entropy function is H(q(x)) = − Σ_x q(x) log q(x).
Because solving the variational formula (1) exactly has high computational complexity, the Ising mean field restricts the free distribution to a tractable subset M_tract ⊆ M, on which a lower bound on A(θ) and approximate variable expectations are computed. A free distribution q(x) in M_tract is fully factorized over disjoint variable clusters {c_1, ..., c_m}, i.e. q(x) = Π_{α=1}^m q_α(c_α), so that
M_tract = { q(x) | q(x) = Π_{α=1}^m q_α(c_α) }.
The Ising mean-field variational formula is then
A(θ) ≥ max_{q(x)∈M_tract} { Σ_x q(x) ψ(x; θ) + H(q(x)) }.    (2)
From the Euler equations one obtains the m mean-field fixed-point equations of the variational formula (2); the fixed-point equation corresponding to variable cluster c_α is
q_α(c_α) ∝ exp{ Σ_{i∈V_{c_α}} θ̂_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j },    (3)
where
1. V_{c_α} denotes the node set of variable cluster c_α;
2. E_{c_α} denotes the edge set within variable cluster c_α;
3. the distribution parameter is θ̂_i = θ_i + Σ_{t∈N(c_α,i)} θ_it μ_t, with N(c_α, i) = { t | (t, i) ∈ E, i ∈ V_{c_α}, t ∉ V_{c_α} }, and the variable expectation is μ_t = Σ_{c_β} q_β(c_β) x_t, where x_t ∈ c_β.
Approximate variable expectations can be computed from the fixed-point equations (3), and substituting these expectations into the variational formula (2) yields a lower bound on the log partition function.
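As an illustration only, the following Python sketch iterates the fixed-point update (3) in the simplified case of singleton clusters c_α = {x_α} (the naive mean field); with x_i ∈ {0, 1} the update reduces to μ_i = σ(θ_i + Σ_t θ_it μ_t), where σ is the logistic function. The singleton-cluster simplification is an assumption of this sketch, not the clustered update used by the invention.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def naive_mean_field(theta_node, theta_edge, n_iter=100):
    """Iterate the singleton-cluster mean-field update and return the expectations mu_i."""
    n = len(theta_node)
    nbrs = {i: {} for i in range(n)}          # symmetric neighbour lookup
    for (i, j), t in theta_edge.items():
        nbrs[i][j] = t
        nbrs[j][i] = t
    mu = [0.5] * n                            # initial expectations
    for _ in range(n_iter):
        for i in range(n):
            field = theta_node[i] + sum(t * mu[j] for j, t in nbrs[i].items())
            mu[i] = sigmoid(field)            # update (3) for a singleton cluster
    return mu

print(naive_mean_field([0.1, -0.2, 0.25], {(0, 1): 0.5, (1, 2): -0.3}))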
Solving for the functional extremum is the key step of variational inference and the computational core of the inference process. The converged value of the functional can be obtained by iterating the fixed-point equations directly, but running the iteration to completion deeply entangles the information of the whole model: the computational complexity is high, and it is inconvenient to incorporate new information. Approximate variational inference methods under incomplete iteration have therefore been developed.
Existing approximate variational inference methods include the local training method based on belief propagation and the BP-SAW method based on belief propagation. The local training method uses a few preliminary iterations of belief messages to compute the model partition function; it can train model parameters separately from local information, which reduces the computational complexity and makes it convenient to add new information, but it is difficult to quantify the computational accuracy of the approximate belief propagation. The BP-SAW method performs a limited number of belief propagation steps on the self-avoiding-walk (SAW) computation tree of the graphical model to compute approximate marginal distributions, and gives error bounds for the approximate distributions based on the concept of message error; however, this method has to propagate messages over the whole model, so its computational complexity is high.
These existing approximate variational inference methods mainly study approximate computation based on belief propagation, and seldom consider other variational inference methods such as mean-field inference. At the same time, computational accuracy is an important criterion in research on approximate variational inference, and these methods either cannot analyse their accuracy, as with the local training method, or have high computational complexity, as with the BP-SAW algorithm.
Summary of the invention
The object of the present invention is to overcome the above shortcomings of the prior art and to provide an inference method for the Ising graphical model that gives bounds on the variable expectations at low computational complexity. The technical solution adopted by the present invention is as follows:
An interval propagation reasoning method based on the Ising graphical model, which defines an Ising mean-field computation tree and carries out the expectation interval propagation reasoning process on the computation tree, comprising the following steps:
(1) Define the Ising mean-field computation tree: under Ising mean-field inference, the computation tree model of variable cluster c_γ is a four-tuple T(D_γ, R, M, Q), where:
1) D_γ: the set of variable-cluster nodes with c_γ as root, D_γ = {c_γ} ∪ Ch(c_γ) ∪ Ch(Ch(c_γ)) ∪ ..., where Ch(c_γ) denotes the child set of c_γ, Ch(c_γ) = { c_β | x_i ∈ c_γ, x_j ∈ c_β, (i, j) ∈ E, γ ≠ β }, and Ch(Ch(c_γ)) denotes the child sets of the clusters in Ch(c_γ): Ch(Ch(c_γ)) = ∪_{c_β∈Ch(c_γ)} Ch(c_β);
2) R: the relation set R = { ⟨c_α, c_β⟩ | c_α ∈ D_γ, c_β ∈ Ch(c_α) }, where the relation ⟨c_α, c_β⟩ means that c_β is a child node of c_α;
3) M: the messages propagated bottom-up in a single pass on the computation tree, where M_{α-out} = { μ_i | i ∈ V_{c_α} } denotes the output message set of c_α, M_{α-in} = ∪_{c_β∈Ch(c_α)} M_{β-out} denotes the input message set of c_α, and M = { M_{α-out} | c_α ∈ D_γ };
4) Q: the probability distribution set Q = { q_α(c_α; θ'_α) | c_α ∈ D_γ }, with q_α(c_α; θ'_α) ∝ exp{ Σ_{i∈V_{c_α}} θ'_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j }, where θ'_i = θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t;
(2) Define the Ising mean-field pruned computation tree: under Ising mean-field inference, the pruned computation tree model of variable cluster c_γ is a four-tuple T_c(D_γ, R, M, Q), where:
1) D_γ: the set of variable-cluster nodes with c_γ as root, D_γ = {c_γ} ∪ CCh(c_γ) ∪ CCh(CCh(c_γ)) ∪ ..., where CCh(c_γ) denotes the pruned child set of c_γ, CCh(c_γ) = Ch(c_γ) \ An(c_γ), with An(c_γ) the set of ancestor clusters of c_γ in the tree, and CCh(CCh(c_γ)) denotes the pruned child sets of the clusters in CCh(c_γ): CCh(CCh(c_γ)) = ∪_{c_β∈CCh(c_γ)} CCh(c_β);
2) R: the relation set R = { ⟨c_α, c_β⟩ | c_α ∈ D_γ, c_β ∈ CCh(c_α) }, where the relation ⟨c_α, c_β⟩ means that c_β is a child node of c_α;
3) M: the messages propagated bottom-up in a single pass on the computation tree, where M_{α-out} = { μ_i | i ∈ V_{c_α} } denotes the output message set of c_α, M_{α-in} = ∪_{c_β∈CCh(c_α)} M_{β-out} denotes the input message set of c_α, and M = { M_{α-out} | c_α ∈ D_γ };
4) Q: the probability distribution set Q = { q_α(c_α; θ'_α) | c_α ∈ D_γ }, with q_α(c_α; θ'_α) ∝ exp{ Σ_{i∈V_{c_α}} θ'_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j }, where θ'_i = θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t;
(3) On the pruned computation tree, compute the probability distribution interval of each variable cluster c_α from its input message intervals. Let M_{α-in}^{int.} denote the set of input message intervals of variable cluster c_α, i.e. M_{α-in}^{int.} = { [μ_t^l, μ_t^u] | t ∈ V_{c_β}, c_β ∈ Ch(c_α) }. The parameter intervals of the probability distribution of variable cluster c_α are
θ'^l_i = min{ θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t | μ_t^l ≤ μ_t ≤ μ_t^u },
θ'^u_i = max{ θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t | μ_t^l ≤ μ_t ≤ μ_t^u },
θ'^l_α = { θ'^l_i, θ_ij | i, j ∈ V_{c_α}, (i, j) ∈ E_{c_α} },
θ'^u_α = { θ'^u_i, θ_ij | i, j ∈ V_{c_α}, (i, j) ∈ E_{c_α} }.
Let A = {a_1, a_2, ...} and B = {b_1, b_2, ...} denote equipotent sets whose elements are in one-to-one correspondence, and define A ≤ B as {a_i ≤ b_i | i = 1, 2, ...}. The probability distribution interval of variable-cluster node c_α is then { q_α(c_α; θ'_α) | θ'^l_α ≤ θ'_α ≤ θ'^u_α };
(4) Based on the probability distribution interval of the variable cluster, compute the output message interval of variable cluster c_α with the sum-product algorithm: running the sum-product algorithm on the probability distribution q_α(c_α; θ'_α) gives the expectation μ_i of variable x_i ∈ c_α, i.e. μ_i = Sum-Prod(q_α(c_α; θ'_α)); when the parameters θ'_i are given as intervals, the extreme values of μ_i are attained at the endpoints of the parameter intervals, that is
μ_i^l = min{ Sum-Prod(q_α(c_α; θ'_α)) | (θ'_1, θ'_2, ...) ∈ {θ'^l_1, θ'^u_1} × {θ'^l_2, θ'^u_2} × ... },
μ_i^u = max{ Sum-Prod(q_α(c_α; θ'_α)) | (θ'_1, θ'_2, ...) ∈ {θ'^l_1, θ'^u_1} × {θ'^l_2, θ'^u_2} × ... }.
The output message interval set of variable-cluster node c_α is then M_{α-out}^{int.} = { [μ_i^l, μ_i^u] | i ∈ V_{c_α} }. During the computation, starting from the input message intervals of the bottom-layer variable-cluster nodes, message intervals are propagated bottom-up layer by layer to compute the expectation intervals of the root variable cluster.
In the interval propagation reasoning method for the Ising graphical model of the present invention, an Ising mean-field computation tree is defined and, starting from the input message interval [0, 1] of the bottom-layer variable-cluster nodes, message intervals are propagated bottom-up layer by layer to compute the expectation bounds of the root variable cluster. The present invention computes these bounds by running the mean-field interval propagation algorithm on the Ising mean-field pruned computation tree. The inference method has low computational complexity, and the variable expectation bounds provide an effective measure of the reasoning precision.
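For orientation only, the overall flow of steps (1)-(4) could be organized as below; build_pruned_tree and mfip are hypothetical placeholder names (sketched in the detailed description that follows), not functions defined by the patent.

# Hypothetical end-to-end pipeline mirroring steps (1)-(4); all names are placeholders.
# tree = build_pruned_tree(graph, clusters, root_cluster, k)    # steps (1)-(2)
# bounds = mfip(tree, theta_node, theta_edge)                   # steps (3)-(4)
# bounds[s] -> (mu_s_lower, mu_s_upper) for each variable s in the root cluster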
Description of drawings
Fig. 1: an Ising graphical model and its 3-layer mean-field computation tree;
Fig. 1(a): a 3 × 3 two-dimensional grid Ising graphical model;
Fig. 1(b): the mean-field free distribution structure with 3 × 1 blocks;
Fig. 1(c): based on the structure of Fig. 1(b), the k = 3-layer mean-field computation tree rooted at variable cluster c_1, together with the bottom-up message propagation process;
Fig. 2: on the 3 × 3 two-dimensional graphical model, the k = 3-layer computation tree rooted at node 1, based on the 3 × 3 free distribution structure;
Fig. 3: on the 3 × 3 two-dimensional graphical model, the k = 3-layer pruned computation tree rooted at node 1, based on the 3 × 3 free distribution structure;
Fig. 4: based on the 3 × 1 free distribution structure, a comparison of the variable expectation bounds given by the MFIP algorithm on the k-layer (k = 2, 3, 4) pruned computation trees, where μ denotes the variable expectation, the solid lines at each variable denote, from left to right, the expectation bounds for k = 2, 3, 4, and the dots denote the exact expectation values;
Fig. 5: a comparison of variable expectation bounds on the attractive Ising graphical model G_2, where μ denotes the variable expectation value, the solid lines denote the variable expectation bounds given by the MFIP algorithm on the 2-layer pruned computation tree, the dashed lines denote the bounds given by the BP-SAW algorithm, and the dots denote the exact expectation values;
Fig. 6: a comparison of variable expectation bounds on the repulsive Ising graphical model G_3, where μ, the solid lines, the dashed lines and the dots have the same meaning as in Fig. 5.
Embodiment
The inference method of the present invention is first described in detail below.
1. The Ising mean-field computation tree
The Ising mean-field computation tree represents the mean-field iterative computation process on the Ising graphical model in the form of a tree.
Definition 1: Under Ising mean-field inference, the computation tree model of variable cluster c_γ is a four-tuple T(D_γ, R, M, Q), where:
1) D_γ: the set of variable-cluster nodes with c_γ as root, D_γ = {c_γ} ∪ Ch(c_γ) ∪ Ch(Ch(c_γ)) ∪ ..., where Ch(c_γ) denotes the child set of c_γ, Ch(c_γ) = { c_β | x_i ∈ c_γ, x_j ∈ c_β, (i, j) ∈ E, γ ≠ β }, and Ch(Ch(c_γ)) denotes the child sets of the clusters in Ch(c_γ): Ch(Ch(c_γ)) = ∪_{c_β∈Ch(c_γ)} Ch(c_β).
2) R: the relation set R = { ⟨c_α, c_β⟩ | c_α ∈ D_γ, c_β ∈ Ch(c_α) }, where the relation ⟨c_α, c_β⟩ means that c_β is a child node of c_α.
3) M: the messages propagated bottom-up in a single pass on the computation tree, where M_{α-out} = { μ_i | i ∈ V_{c_α} } denotes the output message set of c_α, M_{α-in} = ∪_{c_β∈Ch(c_α)} M_{β-out} denotes the input message set of c_α, and M = { M_{α-out} | c_α ∈ D_γ }.
4) Q: the probability distribution set Q = { q_α(c_α; θ'_α) | c_α ∈ D_γ }, with q_α(c_α; θ'_α) ∝ exp{ Σ_{i∈V_{c_α}} θ'_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j }, where θ'_i = θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t.
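The following minimal Python sketch (an illustration under Definition 1, not the patent's implementation) builds the child relation Ch(·) and the layers of the unpruned computation tree: a cluster c_β is a child of c_α whenever some model edge joins a variable of c_α to a variable of c_β, and the same cluster may reappear on deeper layers because Definition 1 does not prune backtracking nodes.

def children(alpha, clusters, edges):
    """Ch(c_alpha): indices of clusters joined to cluster `alpha` by at least one model edge."""
    in_alpha = set(clusters[alpha])
    ch = set()
    for beta, c_beta in enumerate(clusters):
        if beta == alpha:
            continue
        members = set(c_beta)
        for (i, j) in edges:
            if (i in in_alpha and j in members) or (j in in_alpha and i in members):
                ch.add(beta)
                break
    return ch

def computation_tree_layers(gamma, clusters, edges, k):
    """Layers 1..k of the (unpruned) computation tree rooted at cluster `gamma`."""
    layers = [[gamma]]
    for _ in range(k - 1):
        layers.append([beta for alpha in layers[-1]
                       for beta in sorted(children(alpha, clusters, edges))])
    return layers

# Example: 3 x 3 grid variables 0..8 clustered into the columns {0,3,6}, {1,4,7}, {2,5,8}.
clusters = [[0, 3, 6], [1, 4, 7], [2, 5, 8]]
edges = [(0, 1), (1, 2), (3, 4), (4, 5), (6, 7), (7, 8),    # row edges
         (0, 3), (3, 6), (1, 4), (4, 7), (2, 5), (5, 8)]    # column edges
print(computation_tree_layers(0, clusters, edges, 3))       # -> [[0], [1], [0, 2]]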
The Ising mean-field pruned computation tree is a special form of the Ising computation tree, obtained by pruning the backtracking nodes of the mean-field computation tree.
Definition 2: Under Ising mean-field inference, the pruned computation tree model of variable cluster c_γ is a four-tuple T_c(D_γ, R, M, Q), where:
1) D_γ: the set of variable-cluster nodes with c_γ as root, D_γ = {c_γ} ∪ CCh(c_γ) ∪ CCh(CCh(c_γ)) ∪ ..., where CCh(c_γ) denotes the pruned child set of c_γ, CCh(c_γ) = Ch(c_γ) \ An(c_γ), with An(c_γ) the set of ancestor clusters of c_γ in the tree, and CCh(CCh(c_γ)) denotes the pruned child sets of the clusters in CCh(c_γ): CCh(CCh(c_γ)) = ∪_{c_β∈CCh(c_γ)} CCh(c_β).
2) R: the relation set R = { ⟨c_α, c_β⟩ | c_α ∈ D_γ, c_β ∈ CCh(c_α) }, where the relation ⟨c_α, c_β⟩ means that c_β is a child node of c_α.
3) M: the messages propagated bottom-up in a single pass on the computation tree, where M_{α-out} = { μ_i | i ∈ V_{c_α} } denotes the output message set of c_α, M_{α-in} = ∪_{c_β∈CCh(c_α)} M_{β-out} denotes the input message set of c_α, and M = { M_{α-out} | c_α ∈ D_γ }.
4) Q: the probability distribution set Q = { q_α(c_α; θ'_α) | c_α ∈ D_γ }, with q_α(c_α; θ'_α) ∝ exp{ Σ_{i∈V_{c_α}} θ'_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j }, where θ'_i = θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t.
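Continuing the sketch above, and assuming An(·) denotes the ancestor clusters on the current branch, the pruned computation tree of Definition 2 can be built by excluding a node's ancestors from its children; this is only an illustration of the definition, not the patent's implementation.

def pruned_tree_layers(gamma, clusters, edges, k):
    """Layers of the pruned tree; each node is (cluster index, ancestors on its branch)."""
    layers = [[(gamma, frozenset())]]
    for _ in range(k - 1):
        nxt = []
        for alpha, anc in layers[-1]:
            banned = anc | {alpha}
            # CCh(c_alpha) = Ch(c_alpha) \ An(c_alpha): drop backtracking clusters
            for beta in sorted(children(alpha, clusters, edges) - banned):
                nxt.append((beta, banned))
        layers.append(nxt)
    return layers

# On the column clustering above, the k = 3 pruned tree rooted at cluster 0 keeps only
# cluster 2 on the third layer, since cluster 0 is an ancestor of cluster 1.
print(pruned_tree_layers(0, clusters, edges, 3))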
Theorem 1: Under Ising mean-field inference, the probability distribution of the root node c_γ on the k-layer mean-field computation tree T(D_γ, R, M, Q) is equivalent to the probability distribution of c_γ after k mean-field iterations.
2. The mean-field interval propagation algorithm
The MFIP algorithm is designed on the basis of the Ising mean-field computation tree.
The basic idea of MFIP is that, on the Ising mean-field computation tree T, starting from the input message interval [0, 1] of the bottom-layer variable-cluster nodes, message intervals are propagated bottom-up layer by layer to compute the expectation intervals of the root variable cluster. The basic computational unit of the MFIP algorithm is the interval propagation process on a variable cluster. For a variable-cluster node c_α on the computation tree, let q_α(c_α; θ'_α) denote the node's probability distribution, [μ_i^l, μ_i^u] the expectation interval of variable x_i ∈ c_α, and [θ'^l_i, θ'^u_i] the interval of the corresponding distribution parameter. The interval propagation process on variable-cluster node c_α comprises a message interval input process and a message interval output process.
Step 1, the message interval input (MII) process, computes the probability distribution interval of the variable cluster from the input message intervals of c_α. Let M_{α-in}^{int.} denote the set of input message intervals of variable cluster c_α, i.e.
M_{α-in}^{int.} = { [μ_t^l, μ_t^u] | t ∈ V_{c_β}, c_β ∈ Ch(c_α) }.
From the probability distribution of a variable-cluster node on the computation tree, the parameter intervals of the probability distribution of variable cluster c_α are
θ'^l_i = min{ θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t | μ_t^l ≤ μ_t ≤ μ_t^u },
θ'^u_i = max{ θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t | μ_t^l ≤ μ_t ≤ μ_t^u },
θ'^l_α = { θ'^l_i, θ_ij | i, j ∈ V_{c_α}, (i, j) ∈ E_{c_α} },
θ'^u_α = { θ'^u_i, θ_ij | i, j ∈ V_{c_α}, (i, j) ∈ E_{c_α} }.
Let A = {a_1, a_2, ...} and B = {b_1, b_2, ...} denote equipotent sets whose elements are in one-to-one correspondence, and define A ≤ B as {a_i ≤ b_i | i = 1, 2, ...}. The probability distribution interval of variable-cluster node c_α is then { q_α(c_α; θ'_α) | θ'^l_α ≤ θ'_α ≤ θ'^u_α }.
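A minimal Python sketch of the MII step under the formulas above: because θ'_i = θ_i + Σ_t θ_it μ_t is linear in each μ_t, its minimum and maximum over the box μ_t ∈ [μ_t^l, μ_t^u] are attained at interval endpoints chosen by the sign of θ_it. The argument names are illustrative, not fixed by the patent.

def mii(theta_i, theta_edge, msg_in):
    """Parameter interval [theta'_i^l, theta'_i^u]; theta_edge maps t -> theta_it and
    msg_in maps t -> (mu_t^l, mu_t^u) for the neighbours t outside the cluster."""
    lo = hi = theta_i
    for t, (mu_l, mu_u) in msg_in.items():
        w = theta_edge[t]
        lo += w * (mu_l if w >= 0 else mu_u)
        hi += w * (mu_u if w >= 0 else mu_l)
    return lo, hi

# Example: theta_i = 0.1, one incoming interval [0, 1] on an edge with theta_it = -0.7.
print(mii(0.1, {5: -0.7}, {5: (0.0, 1.0)}))    # approximately (-0.6, 0.1)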
Step 2, the message interval output (MIO) process, computes the output message interval of variable cluster c_α with the sum-product algorithm, based on the probability distribution interval of the variable cluster. Running the sum-product algorithm on the probability distribution q_α(c_α; θ'_α) gives the expectation μ_i of variable x_i ∈ c_α, i.e. μ_i = Sum-Prod(q_α(c_α; θ'_α)). From the properties of the Ising graphical model, μ_i is strictly monotone in each of the coefficients θ'_1, θ'_2, ..., so when the parameters θ'_i are given as intervals, the extreme values of μ_i are attained at the endpoints of the parameter intervals, i.e.
μ_i^l = min{ Sum-Prod(q_α(c_α; θ'_α)) | (θ'_1, θ'_2, ...) ∈ {θ'^l_1, θ'^u_1} × {θ'^l_2, θ'^u_2} × ... },
μ_i^u = max{ Sum-Prod(q_α(c_α; θ'_α)) | (θ'_1, θ'_2, ...) ∈ {θ'^l_1, θ'^u_1} × {θ'^l_2, θ'^u_2} × ... }.
The output message interval set of variable-cluster node c_α is then M_{α-out}^{int.} = { [μ_i^l, μ_i^u] | i ∈ V_{c_α} }.
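The following Python sketch illustrates the MIO step. For a small cluster, the expectation μ_i is computed here by brute-force enumeration rather than by the sum-product algorithm (an assumption made only to keep the sketch short); what it shows is the enumeration of endpoint combinations, which by the monotonicity argument above yields the extreme values of μ_i.

import itertools
import math

def cluster_expectation(i, nodes, theta_node, theta_edge):
    """E[x_i] under q(c) proportional to exp(sum_v theta_node[v] x_v + sum_(u,v) theta_edge[(u,v)] x_u x_v)."""
    num = den = 0.0
    for x in itertools.product((0, 1), repeat=len(nodes)):
        val = dict(zip(nodes, x))
        e = sum(theta_node[v] * val[v] for v in nodes)
        e += sum(t * val[u] * val[v] for (u, v), t in theta_edge.items())
        w = math.exp(e)
        den += w
        num += w * val[i]
    return num / den

def mio(nodes, theta_lo, theta_hi, theta_edge):
    """Output interval [mu_i^l, mu_i^u] for every i in the cluster, by corner enumeration."""
    corners = list(itertools.product(*[(theta_lo[v], theta_hi[v]) for v in nodes]))
    out = {}
    for i in nodes:
        vals = [cluster_expectation(i, nodes, dict(zip(nodes, c)), theta_edge) for c in corners]
        out[i] = (min(vals), max(vals))
    return out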
On the k-layer Ising mean-field computation tree T, the bottom-up layer-by-layer message interval propagation computes the expectation intervals of the root variable cluster. On the k-layer Ising computation tree T, the formal description of the MFIP algorithm is as follows:
Data: T(D_γ, R, M, Q), k
Result: { [μ_s^l, μ_s^u] | s ∈ V_{c_γ} }
Begin
  for layer l ← k to 1 do
    for all c_α of layer l do
      if l = k then M_{α-in}^{int.} ← { [0, 1], [0, 1], ... }
      else M_{α-in}^{int.} ← ∪_{c_β∈Ch(c_α)} M_{β-out}^{int.}
      end
      (θ'^l_α, θ'^u_α) ← MII(M_{α-in}^{int.})
      { q_α(c_α; θ'_α) | θ'^l_α ≤ θ'_α ≤ θ'^u_α }
      M_{α-out}^{int.} ← MIO({ q_α(c_α; θ'_α) | θ'^l_α ≤ θ'_α ≤ θ'^u_α })
    end
  end
  { q_γ(c_γ; θ'_γ) | θ'^l_γ ≤ θ'_γ ≤ θ'^u_γ }
  { [μ_s^l, μ_s^u] | s ∈ V_{c_γ} }
End
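Below is a Python sketch of the same bottom-up recursion, reusing the mii and mio helpers sketched above. A tree node is represented as a dict {'cluster': [...], 'children': [...]}; following the layer-k initialisation in the pseudocode, any neighbour of a cluster for which no child supplies a message interval is treated as unconstrained in [0, 1]. That default, and the node representation, are assumptions of this sketch rather than details fixed by the patent.

def neighbours_outside(cluster, edges):
    """Map each i in the cluster to {t: theta_it} for neighbours t outside the cluster."""
    inside = set(cluster)
    out = {i: {} for i in cluster}
    for (u, v), t in edges.items():
        if u in inside and v not in inside:
            out[u][v] = t
        elif v in inside and u not in inside:
            out[v][u] = t
    return out

def mfip(node, theta_node, edges):
    """Bottom-up interval propagation; returns {i: (mu_i^l, mu_i^u)} for the node's cluster."""
    incoming = {}
    for child in node['children']:              # recurse and pool the children's intervals
        incoming.update(mfip(child, theta_node, edges))
    cluster = node['cluster']
    nbrs = neighbours_outside(cluster, edges)
    lo, hi = {}, {}
    for i in cluster:                           # MII: parameter interval per variable
        msg = {t: incoming.get(t, (0.0, 1.0)) for t in nbrs[i]}    # [0, 1] default
        lo[i], hi[i] = mii(theta_node[i], nbrs[i], msg)
    inside = set(cluster)                       # MIO: expectation interval per variable
    inner_edges = {(u, v): t for (u, v), t in edges.items() if u in inside and v in inside}
    return mio(cluster, lo, hi, inner_edges)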
3. Variable expectation bounds from the MFIP algorithm on the pruned computation tree
Theorem 2: Let { [μ_s^l, μ_s^u] | s ∈ V_{c_γ} } denote the set of expectation intervals given by the MFIP algorithm, and let μ_s^* denote the exact expectation of variable x_s on the Ising graphical model. Then, on the mean-field pruned computation tree T_c(D_γ, R, M, Q), the expectation intervals given by the MFIP algorithm bound the variable expectations, i.e. μ_s^l ≤ μ_s^* ≤ μ_s^u for all x_s ∈ c_γ.
The present invention is further described below in conjunction with embodiments.
1. Building the Ising mean-field pruned computation tree from the Ising graphical model
The 3 × 3 two-dimensional grid Ising graphical model is shown in Fig. 1(a).
Model parameters are specified to generate at random a general Ising graphical model G_1 (θ_i ∈ (−0.25, 0.25), θ_ij ∈ (−2, 2)), an attractive Ising graphical model G_2 (θ_i ∈ (−0.25, 0.25), θ_ij ∈ (0, 2)) and a repulsive Ising graphical model G_3 (θ_i ∈ (−0.25, 0.25), θ_ij ∈ (−2, 0)). For graphical model G_1, pruned computation trees with k = 2, 3, 4 layers are built on the free distribution with 3 × 1 blocks (k = 3 is shown in Fig. 1(c)). For graphical models G_2 and G_3, 2-layer pruned computation tree models are built on the free distribution with 3 × 1 blocks.
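A minimal Python sketch of generating such 3 × 3 grid models; drawing the parameters uniformly from the stated ranges (and fixing a seed) is an assumption of this sketch, since the text specifies only the parameter ranges.

import random

def grid_ising(n, edge_range, node_range=(-0.25, 0.25), seed=0):
    """n x n grid Ising model: returns (theta_node dict, theta_edge dict keyed by (i, j))."""
    rng = random.Random(seed)
    idx = lambda r, c: r * n + c
    theta_node = {idx(r, c): rng.uniform(*node_range) for r in range(n) for c in range(n)}
    theta_edge = {}
    for r in range(n):
        for c in range(n):
            if c + 1 < n:
                theta_edge[(idx(r, c), idx(r, c + 1))] = rng.uniform(*edge_range)
            if r + 1 < n:
                theta_edge[(idx(r, c), idx(r + 1, c))] = rng.uniform(*edge_range)
    return theta_node, theta_edge

G1 = grid_ising(3, (-2.0, 2.0))    # general model
G2 = grid_ising(3, (0.0, 2.0))     # attractive model
G3 = grid_ising(3, (-2.0, 0.0))    # repulsive model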
2. Computing the expectation bounds of the root-cluster variables
The basic idea of the mean-field interval propagation (MFIP) algorithm is that, on the Ising mean-field pruned computation tree, starting from the input message interval [0, 1] of the bottom-layer variable-cluster nodes, message intervals are propagated bottom-up layer by layer to compute the expectation bounds of the root variable cluster.
For the different graphical models, the MFIP algorithm is compared with the junction tree (JT) algorithm and the BP-SAW (self-avoiding walk) algorithm, comparing the efficiency of the algorithms and the tightness of the variable expectation bounds.
For G_1, the MFIP algorithm is run on the pruned computation trees to compute variable expectation bounds, and the junction tree algorithm is run on G_1 to compute the exact variable expectations; the results are shown in Fig. 4. The results show that the expectation intervals given by the MFIP algorithm shrink rapidly as the number of layers k of the pruned computation tree increases, and that the MFIP algorithm already gives variable expectation bounds at k = 2.
For G_2 and G_3, the MFIP algorithm is run on the pruned computation trees to compute variable expectation bounds; the BP-SAW algorithm is run on G_2 and G_3 to compute variable expectation bounds; the JT algorithm is then run on G_2 and G_3 to compute the exact variable expectations. The results are shown in Fig. 5, Fig. 6 and Table 1 respectively. The results show that, for the attractive graphical model G_2, the variable expectation bounds given by the MFIP algorithm on the 2-layer pruned computation tree are tighter than those given by the BP-SAW algorithm; for the repulsive graphical model G_3, among the 9 groups of data for the Ising graphical model, the MFIP algorithm on the 2-layer pruned computation tree gives tighter bounds than the BP-SAW algorithm in 5 groups and looser bounds in 2 groups, while in the remaining 2 groups the BP-SAW algorithm gives no expectation bounds at all.
Finally, as shown in Table 1, the MFIP algorithm is more efficient than the BP-SAW algorithm. MFIP is thus an efficient and convenient approximate variational inference method, and the variable expectation bounds it gives on the 2-layer pruned computation tree are comparatively tight.
Table 1: comparison of variable expectation bounds, where μ denotes the variable expectation and t denotes the average running time of the algorithm.

Claims (1)

1. An interval propagation reasoning method for an Ising graphical model, which defines an Ising mean-field computation tree and carries out the expectation interval propagation reasoning process on the computation tree, characterized in that it comprises the following steps:
(1) Define the Ising mean-field computation tree: under Ising mean-field inference, the computation tree model of variable cluster c_γ is a four-tuple T(D_γ, R, M, Q), where:
1) D_γ: the set of variable-cluster nodes with c_γ as root, D_γ = {c_γ} ∪ Ch(c_γ) ∪ Ch(Ch(c_γ)) ∪ ..., where Ch(c_γ) denotes the child set of c_γ, Ch(c_γ) = { c_β | x_i ∈ c_γ, x_j ∈ c_β, (i, j) ∈ E, γ ≠ β }, and Ch(Ch(c_γ)) denotes the child sets of the clusters in Ch(c_γ): Ch(Ch(c_γ)) = ∪_{c_β∈Ch(c_γ)} Ch(c_β);
2) R: the relation set R = { ⟨c_α, c_β⟩ | c_α ∈ D_γ, c_β ∈ Ch(c_α) }, where the relation ⟨c_α, c_β⟩ means that c_β is a child node of c_α;
3) M: the messages propagated bottom-up in a single pass on the computation tree, where M_{α-out} = { μ_i | i ∈ V_{c_α} } denotes the output message set of c_α, M_{α-in} = ∪_{c_β∈Ch(c_α)} M_{β-out} denotes the input message set of c_α, and M = { M_{α-out} | c_α ∈ D_γ };
4) Q: the probability distribution set Q = { q_α(c_α; θ'_α) | c_α ∈ D_γ }, with q_α(c_α; θ'_α) ∝ exp{ Σ_{i∈V_{c_α}} θ'_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j }, where θ'_i = θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t;
(2) Define the Ising mean-field pruned computation tree: under Ising mean-field inference, the pruned computation tree model of variable cluster c_γ is a four-tuple T_c(D_γ, R, M, Q), where:
1) D_γ: the set of variable-cluster nodes with c_γ as root, D_γ = {c_γ} ∪ CCh(c_γ) ∪ CCh(CCh(c_γ)) ∪ ..., where CCh(c_γ) denotes the pruned child set of c_γ, CCh(c_γ) = Ch(c_γ) \ An(c_γ), with An(c_γ) the set of ancestor clusters of c_γ in the tree, and CCh(CCh(c_γ)) denotes the pruned child sets of the clusters in CCh(c_γ): CCh(CCh(c_γ)) = ∪_{c_β∈CCh(c_γ)} CCh(c_β);
2) R: the relation set R = { ⟨c_α, c_β⟩ | c_α ∈ D_γ, c_β ∈ CCh(c_α) }, where the relation ⟨c_α, c_β⟩ means that c_β is a child node of c_α;
3) M: the messages propagated bottom-up in a single pass on the computation tree, where M_{α-out} = { μ_i | i ∈ V_{c_α} } denotes the output message set of c_α, M_{α-in} = ∪_{c_β∈CCh(c_α)} M_{β-out} denotes the input message set of c_α, and M = { M_{α-out} | c_α ∈ D_γ };
4) Q: the probability distribution set Q = { q_α(c_α; θ'_α) | c_α ∈ D_γ }, with q_α(c_α; θ'_α) ∝ exp{ Σ_{i∈V_{c_α}} θ'_i x_i + Σ_{(i,j)∈E_{c_α}} θ_ij x_i x_j }, where θ'_i = θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t;
(3) On the pruned computation tree, compute the probability distribution interval of each variable cluster c_α from its input message intervals. Let M_{α-in}^{int.} denote the set of input message intervals of variable cluster c_α, i.e. M_{α-in}^{int.} = { [μ_t^l, μ_t^u] | t ∈ V_{c_β}, c_β ∈ Ch(c_α) }. The parameter intervals of the probability distribution of variable cluster c_α are
θ'^l_i = min{ θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t | μ_t^l ≤ μ_t ≤ μ_t^u },
θ'^u_i = max{ θ_i + Σ_{μ_t∈M_{α-in}} θ_it μ_t | μ_t^l ≤ μ_t ≤ μ_t^u },
θ'^l_α = { θ'^l_i, θ_ij | i, j ∈ V_{c_α}, (i, j) ∈ E_{c_α} },
θ'^u_α = { θ'^u_i, θ_ij | i, j ∈ V_{c_α}, (i, j) ∈ E_{c_α} }.
Let A = {a_1, a_2, ...} and B = {b_1, b_2, ...} denote equipotent sets whose elements are in one-to-one correspondence, and define A ≤ B as {a_i ≤ b_i | i = 1, 2, ...}. The probability distribution interval of variable-cluster node c_α is then
{ q_α(c_α; θ'_α) | θ'^l_α ≤ θ'_α ≤ θ'^u_α };
(4) Based on the probability distribution interval of the variable cluster, compute the output message interval of variable cluster c_α with the sum-product algorithm: running the sum-product algorithm on the probability distribution q_α(c_α; θ'_α) gives the expectation μ_i of variable x_i ∈ c_α, i.e. μ_i = Sum-Prod(q_α(c_α; θ'_α)); when the parameters θ'_i are given as intervals, the extreme values of μ_i are attained at the endpoints of the parameter intervals, that is
μ_i^l = min{ Sum-Prod(q_α(c_α; θ'_α)) | (θ'_1, θ'_2, ...) ∈ {θ'^l_1, θ'^u_1} × {θ'^l_2, θ'^u_2} × ... },
μ_i^u = max{ Sum-Prod(q_α(c_α; θ'_α)) | (θ'_1, θ'_2, ...) ∈ {θ'^l_1, θ'^u_1} × {θ'^l_2, θ'^u_2} × ... }.
The output message interval set of variable-cluster node c_α is then M_{α-out}^{int.} = { [μ_i^l, μ_i^u] | i ∈ V_{c_α} }. During the computation, starting from the input message intervals of the bottom-layer variable-cluster nodes, message intervals are propagated bottom-up layer by layer to compute the expectation intervals of the root variable cluster.
CNA2009100688415A 2009-05-14 2009-05-14 Interval propagation reasoning method of Ising graphical model Pending CN101551789A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2009100688415A CN101551789A (en) 2009-05-14 2009-05-14 Interval propagation reasoning method of Ising graphical model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2009100688415A CN101551789A (en) 2009-05-14 2009-05-14 Interval propagation reasoning method of Ising graphical model

Publications (1)

Publication Number Publication Date
CN101551789A true CN101551789A (en) 2009-10-07

Family

ID=41156037

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2009100688415A Pending CN101551789A (en) 2009-05-14 2009-05-14 Interval propagation reasoning method of Ising graphical model

Country Status (1)

Country Link
CN (1) CN101551789A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104951531A (en) * 2015-06-17 2015-09-30 深圳大学 Method and device for estimating user influences in social networking services based on graph simplification technology
WO2016202209A1 (en) * 2015-06-17 2016-12-22 深圳大学 Method and device for estimating user influence in social network using graph simplification technique
CN104951531B (en) * 2015-06-17 2018-10-19 深圳大学 Simplify the user influence in social network evaluation method and device of technology based on figure


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20091007