Introduction to Discrete Structures 2
Handout on Chernoff Bounds
We have seen Markov’s and Chebyshev’s inequalities as examples of tools that let us estimate
how likely (or unlikely) it is that the value of a random variable X lies in the “tail” of the
distribution; in other words, they provide us with bounds on the probability that the value
of X is much more (or less) than the expectation of X. The advantage of these inequalities
is that they work for any random variable for which we know the expectation (and possibly
the variance). The trade-off is that they usually do not give very tight bounds.
For certain random variables (such as the Binomial random variable and the Poisson random
variable), the probability of being more than a certain amount above the expectation (or less
than a certain amount below it) drops exponentially as the distance to the expectation grows.
The following inequality takes this fact into consideration. It is an example of a class of
inequalities called Chernoff bounds. The inequality below is not the tightest one known,
but it is fairly easy to use.
Theorem 1 (Chernoff Bound). Let X be a Binomial random variable (or a Poisson random
variable) with expectation E[X]. Let 0 < δ ≤ 1. Then

Pr[X < (1 − δ)E[X]] ≤ e^(−δ² E[X]/2).
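To get a feel for how quickly this bound decays, the following sketch compares the exact lower-tail probability with the Chernoff bound for a few values of δ. The parameters (a Binomial with n = 100 and p = 1/2) are an illustrative choice, not part of the handout; the theorem guarantees the exact tail never exceeds the bound.

```python
from math import comb, exp

n, p = 100, 0.5          # illustrative Binomial(100, 1/2); not from the handout
mu = n * p               # E[X] = 50

def lower_tail(t):
    """Exact Pr[X < t] for X ~ Binomial(n, p), t a positive integer value."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(int(t)))

for delta in (0.1, 0.2, 0.3):
    exact = lower_tail((1 - delta) * mu)        # Pr[X < (1 - delta) * E[X]]
    bound = exp(-delta**2 * mu / 2)             # Chernoff bound e^(-delta^2 E[X]/2)
    print(f"delta={delta}: exact={exact:.4f}  bound={bound:.4f}")
```

Even for moderate δ the exact tail probability is far below the bound, which is consistent with the remark above that this is not the tightest Chernoff bound known.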
Now how can we use this inequality? Consider the following example.
Problem 1. Assume you are playing a game against a friend and your probability of winning
it is 2/3. You play the game 30 times. What is the probability that you win less than half
of the games?
You can view this problem as a sequence of 30 Bernoulli trials, view a success as you winning
a trial, and thus have a success probability p = 2/3. If S30 is the Binomial random variable
counting the number of successes, you are trying to compute Pr[S30 < 15]. Of course we
can compute this value exactly (even though it might be tedious). Alternatively, we can use
the Chernoff bound from above.
Since S30 is a Binomial random variable, we know that E[S30] = np = 30 · (2/3) = 20.
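Continuing the computation: with E[S30] = 20 we pick δ so that (1 − δ) · 20 = 15, i.e. δ = 1/4, and the theorem gives Pr[S30 < 15] ≤ e^(−(1/4)² · 20/2) = e^(−5/8) ≈ 0.535. A small sketch (my own illustration, not part of the handout) checks this against the exact binomial tail probability:

```python
from math import comb, exp

n, p = 30, 2 / 3
mu = n * p                                  # E[S30] = 20

# Exact value: Pr[S30 < 15] = Pr[S30 <= 14]
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(15))

# Chernoff bound: choose delta with 15 = (1 - delta) * mu, i.e. delta = 1/4
delta = 1 - 15 / mu
bound = exp(-delta**2 * mu / 2)             # e^(-5/8)

print(f"exact = {exact:.4f}, Chernoff bound = {bound:.4f}")
```

The exact tail probability is much smaller than the bound, which illustrates the trade-off mentioned at the start: the inequality is easy to apply, but it is not tight.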