In this paper from 2012 (full text here), Leemis and McQueston show a diagram of how probability distributions are related to each other. I liked it so much that I extracted the chart from the pdf, turned it into a poster, and printed a giant version to stick on the wall of my apartment. I thought I would also share it here:

The full-size vector graphic version (as pdf) can be downloaded here.

### Some explanation

Things can be random in many different ways. It’s tempting to think “if it’s not deterministic, then it’s random and we don’t know anything about it”, but that would be wrong. There is an entire bestiary of probability distributions, with different shapes and different properties, that tell you how likely the possible outcomes are relative to each other. What’s interesting is that each distribution describes the outcome of a particular class of stochastic processes, so by looking at how something is distributed, it’s possible to better understand the process that created it. One can even combine simple processes together or morph their parameters to build more complicated processes. *The map above tells you how the probability distribution changes when you do that.*

Let’s look at an example. You are typing on a keyboard. Every time you push a button, there is a certain probability *p* that you will hit the wrong one. This super simple process is called a Bernoulli process, and it corresponds to the Bernoulli distribution that you can find near the top-right corner of the map. Now you type a whole page, consisting of *n* characters. How many errors will you make? This is just a sum of *n* i.i.d. Bernoulli processes (i.i.d. means “independent and identically distributed”; we are assuming your typos are independent from each other), so we look at the map and follow the arrow that says \sum{X_i}, and we reach the binomial distribution. The number of errors per page follows a binomial distribution with mean *np* and variance *np(1-p)*. If you write a book with 1000 characters per page and make one typo per hundred characters, the variance of the number of typos from page to page will be 1000 × 0.01 × 0.99 = 9.9. Isn’t that fascinating?
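You can check the arithmetic yourself by simulating it. Here is a minimal sketch in Python (the numbers are the ones from the example above; the variable names are my own): each simulated page is a sum of *n* Bernoulli trials, and the sample mean and variance of the typo counts should land near *np* = 10 and *np(1-p)* = 9.9.

```python
import random
import statistics

random.seed(42)

n, p = 1000, 0.01   # characters per page, typo probability per character
pages = 5_000       # number of simulated pages

# Each page's typo count is a sum of n Bernoulli(p) trials,
# which is exactly a Binomial(n, p) random variable.
typos = [sum(random.random() < p for _ in range(n)) for _ in range(pages)]

print(statistics.mean(typos))       # close to n*p = 10
print(statistics.pvariance(typos))  # close to n*p*(1-p) = 9.9
```

With more simulated pages, the estimates drift ever closer to the theoretical values.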

Let’s complicate things a little bit. Instead of using a typewriter, you are writing with a pen. From time to time, your pen will slip and make an ugly mark. How many ugly marks will you get per page? Again, the map has you covered: this time, instead of having *n* discrete button presses, we have an infinite number of infinitesimal opportunities for the pen to screw up, so n\to\infty, and *p* must also become infinitesimally small so that *np* stays finite, otherwise you would just be making an infinite number of ugly marks, and I know you are better than that. Thus, according to the map, the number of screwups per page follows a Poisson distribution. A handy property of the Poisson distribution is that the mean happens to be equal to the variance. So if your pen screws up 10 times per page, you also know the variance will be 10.
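The limit is easy to see numerically. Here is a small sketch (my own illustration, not from the paper) that compares the binomial probability of exactly 10 screwups with the Poisson probability, keeping *np* fixed at 10 while *n* grows:

```python
import math

lam = 10.0  # expected screwups per page (np held fixed)

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Binomial(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam); note mean = variance = lam
    return math.exp(-lam) * lam**k / math.factorial(k)

# As n grows with np fixed, Binomial(n, lam/n) approaches Poisson(lam)
diffs = []
for n in (100, 1_000, 100_000):
    p = lam / n
    diffs.append(abs(binom_pmf(10, n, p) - poisson_pmf(10, lam)))
    print(n, diffs[-1])
```

The gap shrinks steadily as *n* grows, which is exactly the arrow on the map taking the binomial distribution to the Poisson.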

You can go on and explore the map on your own (riddle: how is the amount of ink deposited by your pen per page distributed?). So far, I would say I have encountered only half of the map’s distributions in real life, so there is still a lot of *terra incognita* for me.