The Gambler's Ruin Problem

Spencer Mateega

24 Aug 2022

Playing a fair-odds game against the house, a gambler will inevitably go bankrupt. This may be surprising given the game is "fair."

The Gambler’s Ruin, a famous random walk problem, has many forms. Relevant examples include betting on red or black in roulette and playing poker against an equally skilled player with seemingly infinite funds.

We will consider the problem's most basic case: a gambler, initially starting with some finite amount of money \(s\), plays a casino game against a house with total capital \(N\). Compared to the gambler, the casino's capital is enormous, so we will eventually let \(N\) tend to infinity.

Our gambler will continue to play the game by wagering $1 bets, stopping only when he is out of money ("ruined"). Each wager is independent and identically distributed, with probability of winning \(p\) and probability of losing \(q = 1 - p\). The game need not be fair, though we will soon show that when \(p = q = 0.5\), the gambler will inevitably be ruined.

The gambler’s current fortune can be modeled as a random walk.
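This walk is easy to simulate. Below is a minimal sketch in Python; the function name `simulate_fortune` and its parameters are illustrative, not from the original article or any library.

```python
import random

def simulate_fortune(start, p, max_steps, seed=None):
    """Simulate a gambler's fortune as a random walk of $1 wagers.

    Each wager is won with probability p (fortune +1) and lost with
    probability 1 - p (fortune -1). The walk stops at ruin (fortune 0)
    or after max_steps wagers, whichever comes first.
    """
    rng = random.Random(seed)
    fortune = start
    path = [fortune]
    for _ in range(max_steps):
        fortune += 1 if rng.random() < p else -1
        path.append(fortune)
        if fortune == 0:  # ruined: the walk is absorbed at 0
            break
    return path

# Example: a fair game starting with $10, capped at 1,000 wagers.
path = simulate_fortune(start=10, p=0.5, max_steps=1000, seed=42)
print(len(path) - 1, "wagers played; final fortune:", path[-1])
```

Running this repeatedly with \(p = 0.5\) and a large enough step cap, one finds that most paths end at 0, previewing the result derived below.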

Let \(R_n\) denote the funds of the gambler after the \(n\)th wager.

The game stops if \(R_n = 0\) or \(R_n = N\). Given \(N\) is infinite, the game will only stop if \(R_n = 0\), if it stops at all.

Let \(P_i\) denote the probability that the gambler wins (his fortune reaches \(N\) before \(0\)) given his current fortune is \(i\).

Clearly, \(P_0 = 0\) and \(P_N = 1\).

Thus, we must find \(P_i\) for integers \(0 < i < N\).

With fortune \(i\), the gambler can either win his bet with probability \(p\) or lose it with probability \(q\). If he wins, his probability of winning the game becomes \(P_{i+1}\); if he loses, it becomes \(P_{i-1}\). Hence,

\[P_i =p(P_{i+1}) + q(P_{i-1})\]

Since \(p+q=1\), this equation can be rewritten as \((p+q)P_i = p(P_{i+1}) + q(P_{i-1})\). Thus,

\[P_{i+1} - P_i = \frac{q}{p}(P_i - P_{i-1}) \]
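Before continuing the derivation, the recurrence with its boundary conditions \(P_0 = 0\), \(P_N = 1\) can be checked numerically. The sketch below exploits linearity: seed a trial value \(P_1 = 1\), propagate the recurrence, then rescale so \(P_N = 1\). The helper name `win_probabilities` is hypothetical.

```python
def win_probabilities(N, p):
    """Solve P_i = p*P_{i+1} + q*P_{i-1} with P_0 = 0, P_N = 1.

    By linearity, seed a trial P_1 = 1, propagate the rearranged
    recurrence P_{i+1} = (P_i - q*P_{i-1}) / p, then rescale the
    whole solution so that P_N = 1.
    """
    q = 1 - p
    P = [0.0, 1.0]                       # P_0 = 0, trial P_1 = 1
    for i in range(1, N):
        P.append((P[i] - q * P[i - 1]) / p)
    scale = P[N]
    return [x / scale for x in P]

# With fair odds (p = 0.5) and N = 10, P_i comes out to i/N.
P = win_probabilities(N=10, p=0.5)
print([round(x, 3) for x in P])
```

For \(p = 0.5\) this produces the linear profile \(P_i = i/N\), matching the fair-game case of the closed form derived below.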

Considering the case when the gambler has $1 left (\(i=1\)), we see

\[P_{2} - P_1 = \frac{q}{p}(P_1 - P_{0})\]

Since \(P_0 = 0\),

\[P_{2} = P_1 + \frac{q}{p}(P_1)\]

Considering \(i=2\),

\begin{equation} \label{eq1} \begin{split} P_{3} - P_2 & = \frac{q}{p}(P_2 - P_{1}) \\ & = (\frac{q}{p})^2(P_1) \end{split} \end{equation}

Iterating this pattern, we see

\[P_{i+1} - P_i = \left(\frac{q}{p}\right)^i P_1 \quad \text{for integers } 0 < i < N\]

Writing \(P_{i+1} - P_1\) as the telescoping sum \(\sum_{k=1}^{i} (P_{k+1} - P_k)\), we get \(P_{i+1} - P_1 = P_1 \sum_{k=1}^{i} (\frac{q}{p})^k\). Assuming \(p \neq q\) (so that \(\frac{q}{p} \neq 1\)),

\begin{equation} \label{eq2} \begin{split} P_{i+1} & = P_1 + P_1 \sum_{k=1}^{i} \left(\frac{q}{p}\right)^k \\ & = P_1 \sum_{k=0}^{i} \left(\frac{q}{p}\right)^k \\ & = P_1 \left(\frac{1 - (\frac{q}{p})^{i+1}}{1 - \frac{q}{p}}\right) \end{split} \end{equation}

Solving for \(P_1\) by setting \(i = N - 1\),

\begin{equation} \label{eq3} \begin{split} P_N & = P_1 (\frac{1 - (\frac{q}{p})^{N}}{1 - \frac{q}{p}}) \\ P_1 & = \frac{P_N}{(\frac{1 - (\frac{q}{p})^{N}}{1 - \frac{q}{p}})} \\ & = \frac{1}{(\frac{1 - (\frac{q}{p})^{N}}{1 - \frac{q}{p}})} \\ & = \frac{1 - \frac{q}{p}}{1 - (\frac{q}{p})^{N}} \end{split} \end{equation}

Thus, \(P_i = \frac{1 - (\frac{q}{p})^i}{1 - (\frac{q}{p})^{N}}\) when \(p \neq 0.5\); in the fair case \(p = 0.5\), the telescoping sum instead gives \(P_i = \frac{i}{N}\). Letting \(N\) grow without bound,

\[ \lim_{N\to\infty} P_i = \left\{ \begin{array}{ll} 0, & p\leq 0.5 \\ 1 - (\frac{q}{p})^i > 0, & p > 0.5 \\ \end{array} \right. \]

In other words, a gambler who plays until ruin will eventually be ruined with probability \(1\) if \(p \leq 0.5\); if \(p > 0.5\), he escapes ruin with probability \(1 - (\frac{q}{p})^i\), in which case his fortune grows without bound.
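The favorable case can be checked by Monte Carlo. The sketch below estimates the ruin probability \((\frac{q}{p})^i\) for \(p = 0.6\) and \(i = 2\); it truncates each walk at a step cap, which slightly underestimates ruin, but with upward drift the error is negligible. The function name and parameters are illustrative.

```python
import random

def ruin_probability(start, p, trials, max_steps, seed=0):
    """Monte Carlo estimate of the probability a gambler starting with
    `start` dollars is ever ruined, playing $1 wagers won with prob p.

    max_steps truncates each walk; with p > 0.5 the walk drifts upward,
    so a walk surviving max_steps wagers is very unlikely to ruin later.
    """
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        fortune = start
        for _ in range(max_steps):
            fortune += 1 if rng.random() < p else -1
            if fortune == 0:
                ruined += 1
                break
    return ruined / trials

# With p = 0.6 and i = 2, the derivation predicts ruin probability
# (q/p)^i = (0.4/0.6)^2 ≈ 0.444, i.e. survival 1 - (q/p)^2 ≈ 0.556.
est = ruin_probability(start=2, p=0.6, trials=5000, max_steps=1000)
print(round(est, 3))
```

The estimate should land close to \(0.444\), consistent with the closed form.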

A note on Markov chains

The Gambler's Ruin is often solved with a Markov chain approach. For the key case of \(p=0.5\), we can quickly confirm our conclusion using a property of Markov chains.

Our random walk is a Markov chain (MC) whose states are the possible fortunes, the integers \(0 \leq i \leq N\), with transition probabilities \(p_{i,j}\) for integers \(0 \leq i, j \leq N\). Since \(p_{0,0} = p_{N,N} = 1\), states \(0\) and \(N\) are absorbing. A finite Markov chain in which an absorbing state is reachable from every transient state will eventually be absorbed with probability \(1\). As \(N \to \infty\) with \(p = 0.5\), state \(N\) is never actually reached, so the chain must eventually be absorbed at state \(0\). This confirms our conclusion that a gambler with fair odds will always be ruined in the long run.
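The absorption probabilities of this chain can also be computed directly for a small \(N\) by pushing a probability distribution through the transition structure; a minimal sketch, with an illustrative function name:

```python
def absorption_at_zero(N, p, start, steps=5000):
    """Probability mass absorbed at state 0 after many steps of the
    gambler's-ruin Markov chain on states 0..N (0 and N absorbing)."""
    q = 1 - p
    dist = [0.0] * (N + 1)
    dist[start] = 1.0
    for _ in range(steps):
        new = [0.0] * (N + 1)
        new[0], new[N] = dist[0], dist[N]   # absorbing states keep their mass
        for i in range(1, N):
            new[i + 1] += p * dist[i]       # win a wager
            new[i - 1] += q * dist[i]       # lose a wager
        dist = new
    return dist[0]

# Fair game, N = 10, starting fortune 3: absorption at 0 should be
# 1 - P_3 = 1 - 3/10 = 0.7.
print(round(absorption_at_zero(N=10, p=0.5, start=3), 4))
```

For fair odds this reproduces \(1 - \frac{i}{N}\), the complement of the win probability derived above.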