CDS 110b: Random Processes
This lecture presents an introduction to random processes.
- Quick review of random variables
- Random Processes
References and Further Reading
- Hoel, Port and Stone, Introduction to Probability Theory - this is a good reference for basic definitions of random variables
- Apostol II, Chapter 14 - another reference for basic definitions in probability and random variables
Frequently Asked Questions
Q: Can you explain the jump from pdfs to correlations in more detail?
The probability density function (pdf), $p(x; t)$, tells us how the value of a random process is distributed at a particular time:
$$P(x_l \leq X(t) \leq x_u) = \int_{x_l}^{x_u} p(x; t)\, dx.$$
You can interpret this by thinking of $X(t)$ as a separate random variable for each time $t$.
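This interpretation is easy to check numerically. The sketch below (an illustrative assumption, not from the lecture) simulates many sample paths of a Brownian-motion-like process and fixes a single time $t$, so that the values $X(t)$ across paths form an ordinary random variable whose statistics we can estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many sample paths of a Brownian-motion-like process:
#   X(t_{k+1}) = X(t_k) + sqrt(dt) * w_k,  with w_k ~ N(0, 1).
# (Brownian motion is an illustrative choice of random process.)
n_paths, n_steps, dt = 5000, 100, 0.01
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(increments, axis=1)   # X[i, k] = value of path i at time t_k

# Fix a particular time t and treat X(t) as an ordinary random variable:
k = 50                              # t = 0.5
samples = X[:, k]                   # one sample of X(0.5) per path

# For Brownian motion, X(t) ~ N(0, t), so the sample variance at t = 0.5
# should come out close to 0.5.
print(samples.mean(), samples.var())
```

Repeating this at different values of $k$ gives the distribution of $X(t)$ at each time separately, which is exactly what $p(x; t)$ encodes.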
The correlation for a random process tells us how the value of a random process at one time, $X(t_1)$, is related to the value at a different time, $X(t_2)$. This relationship is probabilistic, so it is also described in terms of a distribution. In particular, we use the joint probability density function, $p(x_1, x_2; t_1, t_2)$, to characterize this.
Given any random process, $p(x_1, x_2; t_1, t_2)$ describes (as a density) how the value of the random variable at time $t_1$ is related (or "correlated") with the value at time $t_2$. We can thus describe a random process in terms of its joint probability density function.
In practice, we don't usually describe random processes in terms of their pdfs and joint pdfs. It is usually easier to describe them in terms of their statistics (mean, variance, etc.). In particular, we almost never describe the correlation in terms of joint pdfs, but instead use the correlation function:
$$\rho(t_1, t_2) = E\{X(t_1) X(t_2)\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2\, p(x_1, x_2; t_1, t_2)\, dx_1\, dx_2.$$
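The expectation $E\{X(t_1)X(t_2)\}$ can be estimated by averaging over sample paths, without ever writing down the joint pdf. A minimal sketch, again using Brownian motion as an illustrative process (for which the correlation function is known in closed form to be $\min(t_1, t_2)$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo estimate of the correlation function
#   rho(t1, t2) = E{ X(t1) X(t2) }
# for a Brownian-motion-like process (illustrative choice, not from the notes).
n_paths, n_steps, dt = 20000, 100, 0.01
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)

k1, k2 = 30, 70                          # t1 = 0.3, t2 = 0.7
rho_hat = np.mean(X[:, k1] * X[:, k2])   # average the product over sample paths

# For Brownian motion, rho(t1, t2) = min(t1, t2) = 0.3.
print(rho_hat)
```

Note that the estimate averages across paths at fixed $(t_1, t_2)$; the joint pdf is implicit in the simulation and never needs to be computed.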
The utility of this particular function is seen primarily through its application: if we know the correlation for one random process and we "filter" that random process through a linear system, we can compute the correlation for the corresponding output process (we'll see this in the next lecture).
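As a preview of that idea, the sketch below filters discrete-time white noise through a simple first-order linear system and checks the output statistics against the known closed-form answer. The discrete-time system and its variance formula are standard results used here as a stand-in for the continuous-time development in the next lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Filter" discrete white noise w[k] (variance q) through the first-order
# linear system x[k+1] = a x[k] + w[k]. The stationary output variance is
# known in closed form: q / (1 - a^2).
a, q, n = 0.9, 1.0, 200_000
w = rng.normal(0.0, np.sqrt(q), size=n)

x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = a * x[k] + w[k]

# Discard an initial transient so the remaining samples are
# (approximately) stationary.
xs = x[1000:]
print(xs.var())   # should be near q / (1 - a^2) ~= 5.26
```

The point is that knowing the input correlation (white noise) and the linear system was enough to predict the output correlation; no joint pdfs were needed.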