# New Technology

Trending technology: Machine Learning, Artificial Intelligence, Blockchain, IoT, DevOps, Data Science

## Friday, 27 July 2018

A Bayesian network is a type of graphical model that represents independence, or conditional independence, relationships between the different variables in a domain.

It is a graphical model that efficiently encodes the joint probability distribution over a large set of variables.

A Bayesian network is defined for a set of variables (nodes)
X = {X1, ..., Xn}

- Arcs represent probabilistic dependence among variables.
- The lack of an arc denotes a conditional independence.
- The network structure S is a directed acyclic graph (DAG).
- A set P of local probability distributions, one at each node (its Conditional Probability Table).

Together, these let a Bayesian network represent the joint probability distribution of the variables efficiently.

Representation of a Bayesian Belief Network

The conditional probability table associated with each node specifies the conditional distribution for that variable given its immediate parents in the graph.

Each node is asserted to be conditionally independent of its non-descendants, given its immediate parents.
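The graph-plus-CPT representation described above can be sketched directly with plain Python dicts. This is a minimal sketch using a made-up three-node network (Rain → WetGrass ← Sprinkler) with illustrative probabilities, not an example from the post:

```python
# Hypothetical network: Rain -> WetGrass <- Sprinkler.
# parents[v] lists the immediate parents of node v.
parents = {
    "Rain": [],
    "Sprinkler": [],
    "WetGrass": ["Rain", "Sprinkler"],
}

# CPTs: key = tuple of parent values, value = P(variable = True | parents).
# All numbers are made up for illustration.
cpt = {
    "Rain": {(): 0.2},
    "Sprinkler": {(): 0.1},
    "WetGrass": {
        (True, True): 0.99,
        (True, False): 0.9,
        (False, True): 0.8,
        (False, False): 0.0,
    },
}

def prob(var, value, assignment):
    """Look up P(var = value | parents(var)) in the CPT above."""
    key = tuple(assignment[p] for p in parents[var])
    p_true = cpt[var][key]
    return p_true if value else 1.0 - p_true

print(prob("WetGrass", True, {"Rain": True, "Sprinkler": False}))  # 0.9
```

Each node only stores a distribution over its own values given its parents, which is exactly the locality that makes the representation compact.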

Inference in Bayesian Networks

Inference computes posterior probabilities given evidence about some nodes.

Exploits probabilistic independence for efficient computation.

Unfortunately, exact inference of probabilities in an arbitrary Bayesian network is known to be NP-hard.

In theory, approximate techniques (such as Monte Carlo methods) can also be NP-hard, though in practice many such methods have been shown to be useful.

Efficient algorithms leverage the structure of the graph.
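The Monte Carlo idea mentioned above can be sketched with rejection sampling: draw full assignments by sampling each node in topological order, keep only the samples consistent with the evidence, and read off the posterior as a frequency. The three-node network and all numbers are illustrative assumptions, not from the post:

```python
import random

# Hypothetical network Rain -> WetGrass <- Sprinkler with made-up CPTs.
parents = {"Rain": [], "Sprinkler": [], "WetGrass": ["Rain", "Sprinkler"]}
cpt = {
    "Rain": {(): 0.2},
    "Sprinkler": {(): 0.1},
    "WetGrass": {(True, True): 0.99, (True, False): 0.9,
                 (False, True): 0.8, (False, False): 0.0},
}
order = ["Rain", "Sprinkler", "WetGrass"]  # a topological order

def sample_one(rng):
    """Ancestral sampling: draw each node given its already-sampled parents."""
    a = {}
    for var in order:
        key = tuple(a[p] for p in parents[var])
        a[var] = rng.random() < cpt[var][key]
    return a

def rejection_query(var, evidence, n=100_000, seed=0):
    """Estimate P(var = True | evidence) from samples that match the evidence."""
    rng = random.Random(seed)
    kept = hits = 0
    for _ in range(n):
        s = sample_one(rng)
        if all(s[e] == v for e, v in evidence.items()):
            kept += 1
            hits += s[var]
    return hits / kept

# Estimate of P(Rain | WetGrass=True); the exact posterior here is about 0.74.
print(rejection_query("Rain", {"WetGrass": True}))
```

Rejection sampling is simple but wasteful when the evidence is unlikely, which is one reason more structured algorithms (likelihood weighting, variable elimination) are preferred in practice.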

Applications of Bayesian Networks

• Diagnosis: P(cause | symptom) = ?
• Prediction: P(symptom | cause) = ?
• Classification: P(class | data)
• Decision-making (given a cost function)

- Structure of the graph ⇔ Conditional independence relations
In general,
p(X1, X2, ..., Xn) = ∏ p(Xi | parents(Xi))
- Requires that the graph is acyclic (no directed cycles)
- Two components of a Bayesian network:
• The graph structure (conditional independence assumptions)
• The numerical probabilities (for each variable given its parents)
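The factorization above can be checked numerically: the joint probability of a full assignment is the product of one CPT entry per variable. A minimal sketch, again with a made-up three-node network Rain → WetGrass ← Sprinkler:

```python
# Hypothetical network and CPTs; all numbers are illustrative.
parents = {"Rain": [], "Sprinkler": [], "WetGrass": ["Rain", "Sprinkler"]}
cpt = {
    "Rain": {(): 0.2},
    "Sprinkler": {(): 0.1},
    "WetGrass": {(True, True): 0.99, (True, False): 0.9,
                 (False, True): 0.8, (False, False): 0.0},
}
order = ["Rain", "Sprinkler", "WetGrass"]

def joint(assignment):
    """p(x1, ..., xn) = product over i of p(xi | parents(xi))."""
    p = 1.0
    for var in order:
        key = tuple(assignment[q] for q in parents[var])
        p_true = cpt[var][key]
        p *= p_true if assignment[var] else 1.0 - p_true
    return p

# P(Rain=T) * P(Sprinkler=F) * P(WetGrass=T | T, F) = 0.2 * 0.9 * 0.9
print(round(joint({"Rain": True, "Sprinkler": False, "WetGrass": True}), 3))  # 0.162
```

Three small tables (2 + 2 + 8 entries here, and fewer if you drop the redundant complements) replace a full joint table over all 2^3 assignments; the savings grow dramatically with more variables.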
Examples:

Hidden Markov Model (HMM)

Assumptions:
1. The hidden state sequence is Markov.
2. Each observation Yt is conditionally independent of all other variables given the state St.
Widely used in sequence learning, e.g., speech recognition and part-of-speech tagging.
Inference is linear in the sequence length n.
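The linear-in-n claim corresponds to the forward algorithm, which carries a belief over hidden states through the sequence in a single pass (O(n · states²)). A minimal sketch with two hidden states and two observation symbols; all probabilities are illustrative assumptions:

```python
# Hypothetical two-state HMM with made-up parameters.
# transition[i][j] = P(S_t = j | S_{t-1} = i)
transition = [[0.7, 0.3],
              [0.4, 0.6]]
# emission[i][k] = P(Y_t = k | S_t = i)
emission = [[0.9, 0.1],
            [0.2, 0.8]]
initial = [0.5, 0.5]  # P(S_1)

def forward(observations):
    """Return P(Y_1, ..., Y_n) in one left-to-right pass."""
    # alpha[j] = P(Y_1..Y_t, S_t = j)
    alpha = [initial[j] * emission[j][observations[0]] for j in range(2)]
    for y in observations[1:]:
        alpha = [sum(alpha[i] * transition[i][j] for i in range(2)) * emission[j][y]
                 for j in range(2)]
    return sum(alpha)

print(round(forward([0, 1, 0]), 4))
```

Each step only combines the previous belief with one transition and one emission table, which is why the cost grows linearly with the sequence length rather than exponentially.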

Learning Bayesian Belief Networks

1. The network structure is given in advance and all the variables are fully observable in the training examples.
- Estimate the conditional probabilities directly from the observed data.

2. The network structure is given in advance but only some of the variables are observable in the training data.
- Similar to learning the weights for the hidden units of a Neural Net: Gradient Ascent Procedure

3. The network structure is not known in advance.
- Use heuristic search or constraint-based techniques to search through potential structures.
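Case 1 above (known structure, fully observable data) reduces to counting. A minimal maximum-likelihood sketch for a made-up two-node network Rain → WetGrass, with invented training examples:

```python
from collections import Counter

# Fully observed training examples for a hypothetical network
# Rain -> WetGrass; each row is (rain, wet_grass). Data is made up.
data = [(True, True), (True, True), (True, False),
        (False, False), (False, False), (False, True),
        (False, False), (True, True)]

def estimate_cpt(data):
    """Maximum-likelihood estimate of P(WetGrass = True | Rain) by counting."""
    totals, wet = Counter(), Counter()
    for rain, grass in data:
        totals[rain] += 1   # rows with this parent value
        wet[rain] += grass  # of those, rows where the grass was wet
    return {rain: wet[rain] / totals[rain] for rain in totals}

print(estimate_cpt(data))  # {True: 0.75, False: 0.25}
```

In practice one would add smoothing (e.g. Laplace counts) so that a parent configuration never seen in the data does not get a hard zero probability.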

#### 1 comment:

1. How can we generate some data using prior information?