# Bayesian Networks - APL JHU

Artificial Intelligence and Robotics

7 Nov 2013

## Bayesian Networks

J. M. Akinpelu

## Conditional Probability

1. P(E F | G) = P(E | F G) P(F | G)

2. If E and F are independent, then P(E | F) = P(E).

3. Law of Total Probability: if F_1, …, F_n partition the sample space, then

P(E) = Σ_i P(E F_i) = Σ_i P(E | F_i) P(F_i)
## Conditional Probability

4. Bayes' Theorem:

P(E | F) = P(E F) / P(F) = P(F | E) P(E) / P(F)

5. Chain Rule:

P(E_1 E_2 ⋯ E_n) = P(E_1) P(E_2 | E_1) P(E_3 | E_1 E_2) ⋯ P(E_n | E_1 ⋯ E_{n-1})

## Conditional Independence

Let E, F, and G be events. E and F are conditionally independent given G if

P(E F | G) = P(E | G) P(F | G).

An equivalent definition is:

P(E | F G) = P(E | G).

## Bayesian Networks

A Bayesian network (BN) is a probabilistic graphical model that represents a set of variables and their independencies.

Formally, a BN is a directed acyclic graph (DAG) whose nodes represent variables and whose arcs encode the conditional independencies between the variables.

## Bayesian Network - Example

From Charniak:

family-out (fo)

bowel-problem (bp)

dog-out (do)

light-on (lo)

hear-bark (hb)

## Bayesian Networks

"Over the last few years, a method of reasoning using probabilities, variously called belief networks, Bayesian networks, knowledge maps, probabilistic causal networks, and so on, has become popular within the AI community" - from Charniak

Applications include medical diagnosis, map learning, language understanding, vision, and heuristic search. In particular, this method is playing an increasingly important role in the design and analysis of machine learning algorithms.

## Bayesian Networks

Two interpretations:

Causal: BNs are used to model situations where causality plays a role, but our understanding is incomplete, so that we must describe things probabilistically.

Probabilistic: BNs allow us to calculate the conditional probabilities of the nodes in a network given that some of the values have been observed.

## Probabilities in BNs

Specifying the probability distribution for a BN requires:

The prior probabilities of all the root nodes (nodes without parents)

The conditional probabilities of all non-root nodes given all possible combinations of their direct parents

The BN representation can yield significant savings in the number of values needed to specify the probability distribution.

If variables are binary, then 2^n − 1 values are required for the complete distribution, where n is the number of variables.

## Probabilities in BNs - Example

From Charniak: family-out, bowel-problem, dog-out, light-on, hear-bark

P(fo) = .15

P(bp) = .01

P(do | fo bp) = .99

P(do | fo ¬bp) = .90

P(do | ¬fo bp) = .97

P(do | ¬fo ¬bp) = .3

P(lo | fo) = .6

P(lo | ¬fo) = .05

P(hb | do) = .7

P(hb | ¬do) = .01

## Calculating Probabilities - Example

What is the probability that the light is on?

P(lo) = P(lo | fo) P(fo) + P(lo | ¬fo) P(¬fo)

= .6 (.15) + .05 (.85)

= 0.1325
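As a quick sanity check, the same law-of-total-probability computation can be scripted in a few lines of Python (variable names are mine; the values are the CPT entries from the Charniak example):

```python
# P(lo) = P(lo | fo) P(fo) + P(lo | ~fo) P(~fo), values from the example above.
p_fo = 0.15
p_lo_given_fo, p_lo_given_not_fo = 0.6, 0.05

p_lo = p_lo_given_fo * p_fo + p_lo_given_not_fo * (1 - p_fo)
print(round(p_lo, 4))  # 0.1325
```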

## Calculating Probabilities - Example

What is the probability that the dog is out?

P(do) = P(do | fo bp) P(fo bp) + P(do | fo ¬bp) P(fo ¬bp) + P(do | ¬fo bp) P(¬fo bp) + P(do | ¬fo ¬bp) P(¬fo ¬bp)

= P(do | fo bp) P(fo) P(bp) + P(do | fo ¬bp) P(fo) P(¬bp) + P(do | ¬fo bp) P(¬fo) P(bp) + P(do | ¬fo ¬bp) P(¬fo) P(¬bp)

(the root nodes fo and bp are independent, so P(fo bp) = P(fo) P(bp))

= .99 (.15)(.01) + .90 (.15)(.99) + .97 (.85)(.01) + .3 (.85)(.99)

≈ 0.4
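The same marginalization can be written as a loop over the parent values — a sketch; the CPT dictionary and variable names are my own:

```python
# P(do) = sum over parent values of P(do | fo, bp) P(fo) P(bp),
# using the independence of the root nodes fo and bp.
p_fo, p_bp = 0.15, 0.01
p_do_given = {(True, True): 0.99, (True, False): 0.90,
              (False, True): 0.97, (False, False): 0.30}  # keyed by (fo, bp)

p_do = sum(p_do_given[(fo, bp)]
           * (p_fo if fo else 1 - p_fo)
           * (p_bp if bp else 1 - p_bp)
           for fo in (True, False) for bp in (True, False))
print(round(p_do, 4))  # 0.3958
```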

## Types of Connections in BNs

Linear: a → b → c (e.g., family-out → dog-out → hear-bark)

Converging: a → c ← b (e.g., family-out → dog-out ← bowel-problem)

Diverging: b ← a → c (e.g., light-on ← family-out → dog-out)

## Independence Assumptions

Linear connection: The two end variables are usually dependent on each other. The middle variable renders them independent (e.g., family-out → dog-out → hear-bark).

Converging connection: The two end variables are usually independent of each other. The middle variable renders them dependent (e.g., family-out → dog-out ← bowel-problem).

Diverging connection: The two end variables are usually dependent on each other. The middle variable renders them independent (e.g., light-on ← family-out → dog-out).
## Inference in Bayesian Networks

A basic task for BNs is to compute the posterior probability distribution for a set of query variables, given values for some evidence variables. This is called inference or belief updating.

The input to a BN inference evaluation is a set of evidence values, e.g.,

E = { hear-bark = true, light-on = true }

The outputs of the BN inference evaluation are conditional probabilities

P(X_i = v | E)

where X_i is a variable in the network.

## Inference in Bayesian Networks

Types of inference:

Diagnostic

Causal

Intercausal (Explaining Away)

Mixed

## Diagnostic Inference

Inferring the probability of a cause based on evidence of an effect

Also known as "bottom up" reasoning

[Network fragment: query node Q (cause) → evidence node E (effect)]

## Example: Diagnostic Inference

Given that the dog is out, what's the probability that the family is out? That the dog has a bowel problem? What's the probable cause of the dog being out?

P(fo | do) = P(do | fo) P(fo) / P(do) = .90 (.15) / 0.4 ≈ .34

P(bp | do) = P(do | bp) P(bp) / P(do) = .973 (.01) / 0.4 ≈ .024

So the family being out is the more probable cause.

[Network: family-out → dog-out ← bowel-problem]
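Both diagnostic queries are one application of Bayes' theorem each. A short Python check, using the rounded values from the deck (P(do) ≈ 0.4 and P(do | fo) ≈ .90 are carried over from the other examples; variable names are mine):

```python
p_fo, p_bp, p_do = 0.15, 0.01, 0.4       # P(do) ≈ 0.4 from the earlier example
p_do_given_fo, p_do_given_bp = 0.90, 0.973

# Bayes' theorem: P(cause | effect) = P(effect | cause) P(cause) / P(effect)
p_fo_given_do = p_do_given_fo * p_fo / p_do
p_bp_given_do = p_do_given_bp * p_bp / p_do
print(round(p_fo_given_do, 2), round(p_bp_given_do, 3))  # 0.34 0.024
```

The family being out is the far more likely explanation, even though the dog is more reliably out when it has a bowel problem — the prior P(bp) = .01 is simply much smaller than P(fo) = .15.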

## Causal Inference

Inferring the probability of an effect based on evidence of a cause

Also known as "top down" reasoning

[Network fragment: evidence node E (cause) → query node Q (effect)]

## Example: Causal Inference

What is the probability that the dog is out given that the family is out?

P(do | fo) = P(do | fo bp) P(bp) + P(do | fo ¬bp) P(¬bp)

= .99 (.01) + .90 (.99)

≈ 0.90

What is the probability that the dog is out given that he has a bowel problem?

P(do | bp) = P(do | bp fo) P(fo) + P(do | bp ¬fo) P(¬fo)

= .99 (.15) + .97 (.85)

= 0.973

[Network: family-out → dog-out ← bowel-problem]
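Each causal query fixes one parent and marginalizes the other (valid because fo and bp are independent). A sketch with my own variable names:

```python
p_fo, p_bp = 0.15, 0.01
p_do_given = {(True, True): 0.99, (True, False): 0.90,
              (False, True): 0.97, (False, False): 0.30}  # keyed by (fo, bp)

# Fix one parent, marginalize over the other:
p_do_given_fo = (p_do_given[(True, True)] * p_bp
                 + p_do_given[(True, False)] * (1 - p_bp))
p_do_given_bp = (p_do_given[(True, True)] * p_fo
                 + p_do_given[(False, True)] * (1 - p_fo))
print(round(p_do_given_fo, 4), round(p_do_given_bp, 3))  # 0.9009 0.973
```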

## Intercausal Inference (Explaining Away)

Involves two causes that "compete" to "explain" an effect

The causes become conditionally dependent given that their common effect is observed, even though they are marginally independent.

[Network fragment: query cause Q and a second, observed cause both point to the observed effect E]

## Explaining Away - Example

What is the probability that the family is out given that the dog is out and has a bowel problem?

P(fo | do bp) = P(fo do | bp) / P(do | bp)

= P(do | fo bp) P(fo | bp) / P(do | bp)

= P(do | fo bp) P(fo) / P(do | bp)     (fo and bp are marginally independent)

= .99 (.15) / 0.973

≈ .153

Evidence of the bowel problem "explains away" the fact that the dog is out.

[Network: family-out → dog-out ← bowel-problem]
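Numerically, the explaining-away effect is visible by comparing against the diagnostic example: belief in family-out drops once the bowel problem is observed. A sketch (values carried over from the earlier examples; names are mine):

```python
p_fo = 0.15
p_do_given_fo_bp = 0.99
p_do_given_bp = 0.973        # from the causal-inference example
p_fo_given_do = 0.34         # from the diagnostic-inference example (rounded)

# fo and bp are marginally independent, so P(fo | bp) = P(fo):
p_fo_given_do_bp = p_do_given_fo_bp * p_fo / p_do_given_bp
print(round(p_fo_given_do_bp, 3))  # 0.153
```

Observing the bowel problem lowers P(family-out) from ≈ .34 to ≈ .153, even though fo and bp are independent a priori.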

## Mixed Inference

Combines two or more diagnostic, causal, or intercausal inferences

[Network fragment: evidence both above and below the query node Q]

## Example: A Lecturer's Life

Dr. Ann Nicholson spends 60% of her work time in her office. The rest of her work time is spent elsewhere. When Ann is in her office, half the time her light is off (when she is trying to hide from students and get some real work done). When she is not in her office, she leaves her light on only 5% of the time. 80% of the time she is in her office, Ann is logged onto the computer. Because she sometimes logs onto the computer from home, 10% of the time she is not in her office, she is still logged onto the computer.

1. Draw the corresponding BN.

2. Specify the conditional probabilities associated with each node.

3. Suppose a student checks Dr. Nicholson's login status and sees that she is logged on. What effect does this have on the student's belief that Dr. Nicholson's light is on?

## Example: A Lecturer's Life

[Network: light on ← in office → computer on]

P(io) = .6

P(lo | io) = .5

P(lo | ¬io) = .05

P(co | io) = .8

P(co | ¬io) = .1

## Example: A Lecturer's Life

P(co) = P(co | io) P(io) + P(co | ¬io) P(¬io) = .8 (.6) + .1 (.4) = .52

P(io | co) = P(co | io) P(io) / P(co) = .8 (.6) / .52 ≈ 0.923
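The two steps — total probability for P(co), then Bayes' theorem for P(io | co) — in Python (variable names are mine):

```python
p_io = 0.6
p_co_given_io, p_co_given_not_io = 0.8, 0.1

# Total probability, then Bayes' theorem:
p_co = p_co_given_io * p_io + p_co_given_not_io * (1 - p_io)
p_io_given_co = p_co_given_io * p_io / p_co
print(round(p_co, 2), round(p_io_given_co, 3))  # 0.52 0.923
```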
## Example: A Lecturer's Life

P(lo | co) = P(lo, co) / P(co)

= [ P(lo, co | io) P(io) + P(lo, co | ¬io) P(¬io) ] / P(co)

= [ P(lo | io) P(co | io) P(io) + P(lo | ¬io) P(co | ¬io) P(¬io) ] / P(co)

(lo and co are conditionally independent given io)

= [ .5 (.8)(.6) + .05 (.1)(.4) ] / .52

≈ .465
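The same marginalization scripted out; the key step is that lo and co are conditionally independent given io (a diverging connection), so the joint P(lo, co | io) factors. Names and the dictionary layout are mine:

```python
p_io = 0.6
p_lo_given = {True: 0.5, False: 0.05}   # P(lo | io)
p_co_given = {True: 0.8, False: 0.1}    # P(co | io)

num = sum(p_lo_given[io] * p_co_given[io] * (p_io if io else 1 - p_io)
          for io in (True, False))                       # P(lo, co)
den = sum(p_co_given[io] * (p_io if io else 1 - p_io)
          for io in (True, False))                       # P(co)
p_lo_given_co = num / den
print(round(p_lo_given_co, 3))  # 0.465
```

Compare with the prior P(lo) = .5(.6) + .05(.4) = .32: seeing that she is logged on raises the student's belief that the light is on from .32 to ≈ .47.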
## Example: Medical Diagnosis

A patient presents to a doctor with shortness of breath. The doctor considers that possible causes are tuberculosis, lung cancer, and bronchitis. Additional relevant information includes whether the patient has recently visited Asia (where tuberculosis is more prevalent) and whether the patient is a smoker (which increases the chances of cancer and bronchitis). A positive X-ray would indicate either TB or lung cancer. (Example from Lauritzen, 1988.)

1. Draw the corresponding BN.

2. Compute the probability of tuberculosis given that the patient has shortness of breath and a positive X-ray.

3. Compute the probability that the patient has a positive X-ray given he is a smoker.

## Example: Medical Diagnosis

Nodes: visited Asia, smoker, tuberculosis, lung cancer, bronchitis, positive X-ray, shortness of breath

## Example: Medical Diagnosis

P(TB | PX, SB) = P(TB, PX, SB) / P(PX, SB)

= P(PX | TB, SB) P(SB | TB) P(TB) / [ P(PX | TB, SB) P(SB | TB) P(TB) + P(PX | ¬TB, SB) P(SB | ¬TB) P(¬TB) ]

= P(PX | TB) P(SB | TB) P(TB) / [ P(PX | TB) P(SB | TB) P(TB) + P(PX | ¬TB) P(SB | ¬TB) P(¬TB) ]

(the last step uses the conditional independence of PX and SB given TB)
## Example: Medical Diagnosis

P(PX | SM) = P(PX, SM) / P(SM)

= [ P(PX, SM | LC) P(LC) + P(PX, SM | ¬LC) P(¬LC) ] / P(SM)

= [ P(PX | LC) P(SM | LC) P(LC) + P(PX | ¬LC) P(SM | ¬LC) P(¬LC) ] / P(SM)

where, by Bayes' theorem,

P(SM | LC) = P(LC | SM) P(SM) / P(LC);  P(SM | ¬LC) = P(¬LC | SM) P(SM) / P(¬LC)

## Evaluating BNs

Computation in Bayesian networks is NP-hard. In the worst case, exact algorithms for computing the probabilities are exponential in the size of the network.

There are two ways around the complexity barrier:

Algorithms for special subclasses of networks, e.g., singly connected networks. (A singly connected network is one in which the underlying undirected graph has no more than one path between any two nodes.)

Approximate algorithms.

The computation for a singly connected network is linear in the size of the network.
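To make the exponential cost concrete, here is brute-force full-joint enumeration for the five-variable dog network from the earlier examples: it visits all 2^5 assignments, which is exactly the blow-up that the singly-connected and approximate algorithms avoid. This is a sketch; the function names and dictionary layout are mine.

```python
from itertools import product

P_DO = {(True, True): 0.99, (True, False): 0.90,
        (False, True): 0.97, (False, False): 0.30}  # P(do | fo, bp)

def bernoulli(p_true, val):
    """P(var = val) for a binary variable with P(var = True) = p_true."""
    return p_true if val else 1 - p_true

def joint(fo, bp, do, lo, hb):
    # Product of the CPTs, following the network's factorization.
    return (bernoulli(0.15, fo) * bernoulli(0.01, bp)
            * bernoulli(P_DO[(fo, bp)], do)
            * bernoulli(0.6 if fo else 0.05, lo)
            * bernoulli(0.7 if do else 0.01, hb))

def query(target, evidence):
    """P(target = True | evidence) by enumerating all 2^5 worlds."""
    names = ("fo", "bp", "do", "lo", "hb")
    num = den = 0.0
    for vals in product((True, False), repeat=len(names)):
        world = dict(zip(names, vals))
        if any(world[k] != v for k, v in evidence.items()):
            continue
        w = joint(*vals)
        den += w
        if world[target]:
            num += w
    return num / den

print(round(query("fo", {"do": True}), 3))  # ≈ 0.341, vs the rounded .34 earlier
```

The enumeration recovers the hand-computed answers exactly (up to the rounding used on the slides), but the loop grows as 2^n in the number of variables.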

## In-class Exercise

You have a new burglar alarm installed. It is reliable for detecting burglaries, but also responds to minor earthquakes. Two neighbors (John, Mary) promise to call you at work when they hear the alarm. John almost always calls when he hears the alarm, but confuses the alarm with the phone ringing (and calls then too). Mary likes loud music and sometimes misses the alarm!

1. Draw the BN.

2. Using the information on the next chart:

   a. Estimate the probability of an alarm.

   b. Given a burglary, estimate the probability of i.) a call from John; ii.) a call from Mary.

   c. Estimate the probability of a burglary, given i.) a call from John; ii.) a call from Mary.

## In-class Exercise

P(B) = 0.01

P(E) = 0.02

P(A | B, E) = 0.95

P(A | B, ¬E) = 0.94

P(A | ¬B, E) = 0.29

P(A | ¬B, ¬E) = 0.001

P(J | A) = 0.90

P(J | ¬A) = 0.05

P(M | A) = 0.70

P(M | ¬A) = 0.01
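One way to check your answers to the exercise is to script the same marginalizations against the chart above. This is a sketch; the variable names are mine, and it follows the same pattern as the dog-network examples:

```python
from itertools import product

p_b, p_e = 0.01, 0.02
p_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(A | B, E)
p_j = {True: 0.90, False: 0.05}                      # P(J | A)
p_m = {True: 0.70, False: 0.01}                      # P(M | A)

# a. P(A): marginalize over the independent causes B and E.
alarm = sum(p_a[(b, e)] * (p_b if b else 1 - p_b) * (p_e if e else 1 - p_e)
            for b, e in product((True, False), repeat=2))

# b. Causal: P(J | B) and P(M | B) go through P(A | B).
a_given_b = p_a[(True, True)] * p_e + p_a[(True, False)] * (1 - p_e)
j_given_b = p_j[True] * a_given_b + p_j[False] * (1 - a_given_b)
m_given_b = p_m[True] * a_given_b + p_m[False] * (1 - a_given_b)

# c. Diagnostic: P(B | J) by Bayes' theorem (P(B | M) is analogous).
p_j_total = p_j[True] * alarm + p_j[False] * (1 - alarm)
b_given_j = j_given_b * p_b / p_j_total

print(round(alarm, 4), round(j_given_b, 3), round(m_given_b, 3),
      round(b_given_j, 3))  # 0.0161 0.849 0.659 0.133
```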

## References

1. Eugene Charniak, Bayesian Networks without Tears, www.cs.ubc.ca/~murphyk/Bayes/Charniak_91.pdf.

2. BN Lecture, www.csse.monash.edu.au/~annn/443/L2-4.ps.

3. BN Lecture, www.cs.cmu.edu/~awm/381/lec/bayesinfer/bayesinf.ppt.

4. Judea Pearl, Causality: Models, Reasoning, and Inference, Cambridge University Press, 2000.