# NME130/Information theory

From MurrayWiki


## Revision as of 19:27, 27 May 2009

### Michelle

- Tried to figure out what people wanted to see
- Decided that the way to go is to pull out a small piece that can be done in its entirety, but gives a sense of the point of view

#### Outline

- Assumptions underlying information theory
  - Convenient versus critical
- Heart of the matter
  - Long sequences of random variables are "easy" to predict (weak law, AEP)
  - This piece currently takes 3.5 lectures * 1.5 hours = ~6 hours
  - Example: achievability (in sketch form) of the channel coding theorem
    - Can probably be done in 1-2 lectures of 1.5 hours each
- Entropy will have to be introduced, but probably not entropy rate
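As a concrete illustration of the AEP point in the outline (long i.i.d. sequences are "easy" to predict), the sketch below checks that the per-symbol log-probability of a long sequence concentrates around the entropy. The Bernoulli(0.3) source, the seed, and the helper names are illustrative choices, not anything specified in the notes:

```python
import math
import random

def entropy_bits(p):
    """Shannon entropy H(p) in bits of a Bernoulli(p) source."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def per_symbol_log_prob(seq, p):
    """Compute -(1/n) log2 p(x^n) for an i.i.d. Bernoulli(p) sequence.
    By the weak law / AEP, this converges to H(p) as n grows."""
    n = len(seq)
    log_p = sum(math.log2(p) if x == 1 else math.log2(1 - p) for x in seq)
    return -log_p / n

if __name__ == "__main__":
    random.seed(0)          # illustrative choice for reproducibility
    p = 0.3                 # illustrative source parameter
    h = entropy_bits(p)
    for n in (100, 10_000, 1_000_000):
        seq = [1 if random.random() < p else 0 for _ in range(n)]
        rate = per_symbol_log_prob(seq, p)
        print(f"n = {n:>9}: -1/n log2 p(x^n) = {rate:.4f}   H(p) = {h:.4f}")
```

For large n the printed rate sits within a few thousandths of a bit of H(p), which is the sense in which "typical" long sequences are predictable: almost all of the probability mass lives on roughly 2^(nH) sequences, each with probability close to 2^(-nH). This is also the concentration step that the achievability sketch for the channel coding theorem leans on.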