Bernoulli Processes

Flip 𝑛 coins, each of which comes up heads with probability 𝑝: this is a Bernoulli process (so named because each coin is just a Bernoulli random variable).

Regarding this process, we can ask several interesting questions:

  1. What's the chance that we get exactly 𝑘 heads?
  2. What's the chance that the index of the first coin which comes up heads is 𝑖 (one-indexed)?
  3. If 𝑛 is infinite, what is the expected value of 𝑖 (first index w/ heads)?
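For reference, the three answers are the binomial pmf, the geometric pmf, and 1/𝑝. A quick numerical sanity check of all three (a sketch; the function names and parameter values are my own):

```python
import math

def binomial_pmf(n, k, p):
    # 1. P(exactly k heads in n flips)
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def geometric_pmf(i, p):
    # 2. P(first head is at flip i), one-indexed
    return (1 - p) ** (i - 1) * p

p, n = 0.3, 10
# the binomial pmf sums to 1 over k = 0..n
total = sum(binomial_pmf(n, k, p) for k in range(n + 1))
# 3. E[i] = sum over i of i * P(i); truncating the infinite sum is fine
expected_i = sum(i * geometric_pmf(i, p) for i in range(1, 5000))
print(total, expected_i)  # ≈ 1.0 and ≈ 1/p ≈ 3.333
```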

Currently, events are discrete, since they come in chunks of one coin flip at a time. How can we make them continuous instead?

Poisson Processes

Instead of a process parameterized by 𝑝, we now use 𝜆 to denote the expected # of successes (heads) within each one-second interval. In other words, the chance of a success in an interval of size d𝑡 is 𝜆 d𝑡.

First off, what is 𝑃(0,𝜏), the chance that there are no successes at all in a 𝜏-second interval? Note that

𝑃(0, 𝜏 + d𝑡) = 𝑃(0,𝜏)(1 − 𝜆 d𝑡) = 𝑃(0,𝜏) − 𝜆𝑃(0,𝜏) d𝑡

Rearranging gives d𝑃(0,𝜏)/d𝜏 = −𝜆𝑃(0,𝜏): this differential equation is precisely the definition of the exponential function! With the initial condition 𝑃(0,0) = 1, that is:

𝑃(0,𝜏) = 𝑒^(−𝜆𝜏)
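This limit can be sanity-checked numerically: chop a 𝜏-second interval into many d𝑡-sized slices, each of which must fail (probability 1 − 𝜆 d𝑡 each) for there to be no successes. A minimal sketch with arbitrary values of 𝜆 and 𝜏:

```python
import math

lam, tau = 2.0, 1.5
N = 1_000_000          # number of dt-sized slices
dt = tau / N
# zero successes means every one of the N slices fails
p_zero = (1 - lam * dt) ** N
print(p_zero, math.exp(-lam * tau))  # both ≈ 0.0498
```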

What about 𝑃(1,𝜏), the chance of exactly one success in a 𝜏-second interval? Integrating over all possible times for this single success, we get that:

𝑃(1,𝜏) = 𝑒^(−𝜆𝜏) ∫₀^𝜏 (𝜆 d𝑡)/(1 − 𝜆 d𝑡) = 𝑒^(−𝜆𝜏) ∫₀^𝜏 𝜆 d𝑡 = 𝑒^(−𝜆𝜏) 𝜆𝜏

Essentially, we start by assuming that all trials failed, then choose one failure to “toggle” into a success. Notice that the denominator drops out because, to first order in d𝑡, 1 − 𝜆 d𝑡 = 1.

Now, in general, what is 𝑃(𝑘,𝜏), the chance of 𝑘 successes in a 𝜏-second interval? Applying the same logic (toggling 𝑘 failures, and dividing by 𝑘! since the order of the toggles doesn't matter), we get that

𝑃(π‘˜,𝜏)=π‘’βˆ’πœ†πœ(πœ†πœ)π‘˜π‘˜!

Another derived random variable to consider is 𝑇, the time of the first success. Note that 1 − cdf(𝑇) = 𝑃[𝑇 ≥ 𝜏] = 𝑃(0,𝜏) = 𝑒^(−𝜆𝜏). From this, we can derive the PDF:

pdf(𝑇) = −(d/d𝜏) 𝑃[𝑇 ≥ 𝜏] = 𝜆𝑒^(−𝜆𝜏)
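We can check this against the discrete picture: simulate the process as a coin flip every d𝑡 seconds, record the time of the first head, and compare the empirical survival probability 𝑃[𝑇 ≥ 1] against 𝑒^(−𝜆). A rough Monte Carlo sketch (parameter values are arbitrary):

```python
import math
import random

random.seed(0)
lam, dt = 2.0, 1e-3

def first_success_time():
    # flip a coin every dt seconds until one comes up heads
    t = 0.0
    while random.random() > lam * dt:
        t += dt
    return t

samples = [first_success_time() for _ in range(20_000)]
frac = sum(t >= 1.0 for t in samples) / len(samples)
print(frac, math.exp(-lam))  # empirical ≈ e^(−λ) ≈ 0.135
```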

Note that 𝔼[𝑇] is still 1/𝜆, since 𝔼[𝑇] = d𝑡 + (1 − 𝜆 d𝑡) 𝔼[𝑇].

A more interesting problem: what's 𝔼[𝑇^𝑛]? We can use a similar argument to the one we used for the expected value (taking advantage of the fact that the Poisson process is memoryless, and expanding (𝑇 + d𝑡)^𝑛 = 𝑇^𝑛 + 𝑛𝑇^(𝑛−1) d𝑡 to first order) to get that

𝔼[𝑇^𝑛] = (1 − 𝜆 d𝑡)(𝔼[𝑇^𝑛] + 𝑛 d𝑡 𝔼[𝑇^(𝑛−1)])

Rearranging, we get:

𝜆 d𝑡 𝔼[𝑇^𝑛] = 𝑛 d𝑡 𝔼[𝑇^(𝑛−1)]

Here, we use the fact that (1 − 𝜆 d𝑡) d𝑡 = d𝑡, since (d𝑡)² = 0. Dividing by 𝜆 d𝑡 gives 𝔼[𝑇^𝑛] = (𝑛/𝜆) 𝔼[𝑇^(𝑛−1)], and induction (with 𝔼[𝑇⁰] = 1) shows that

𝔼[𝑇^𝑛] = 𝑛!/𝜆^𝑛

As a direct consequence, Var(𝑇) = 𝔼[𝑇²] − 𝔼[𝑇]² = 2/𝜆² − 1/𝜆² = 1/𝜆².
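Both the moment formula and the variance can be checked by Monte Carlo, using Python's built-in exponential sampler `random.expovariate` (sample sizes and 𝜆 are arbitrary):

```python
import math
import random

random.seed(1)
lam = 2.0
samples = [random.expovariate(lam) for _ in range(200_000)]

# E[T^n] should be n!/λ^n
for n in range(1, 4):
    emp = sum(t**n for t in samples) / len(samples)
    print(n, emp, math.factorial(n) / lam**n)

mean = sum(samples) / len(samples)
var = sum((t - mean) ** 2 for t in samples) / len(samples)
print(var, 1 / lam**2)  # Var(T) ≈ 1/λ²
```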

Now, let 𝑁 be the number of successes in a 𝜏-second interval. Then, we can show that 𝔼[𝑁] = Var(𝑁) = 𝜆𝜏. The fact that 𝔼[𝑁] = Var(𝑁) is actually very special: typically, the mean and variance of a random variable have different dimensions (the variance carries the square of the variable's units), so this equality would not even be dimensionally consistent. However, 𝑁 is a “counting” variable, which means it's dimensionless, and so the above expression makes perfect sense.
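Since the gaps between consecutive successes are i.i.d. copies of 𝑇 (by memorylessness), we can verify 𝔼[𝑁] = Var(𝑁) = 𝜆𝜏 by summing exponential inter-arrival times. A sketch with arbitrary parameters:

```python
import random

random.seed(2)
lam, tau = 2.0, 3.0

def count_successes():
    # count arrivals in [0, τ]: gaps are i.i.d. Exponential(λ)
    t, n = random.expovariate(lam), 0
    while t <= tau:
        n += 1
        t += random.expovariate(lam)
    return n

counts = [count_successes() for _ in range(100_000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)  # both ≈ λτ = 6.0
```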