\documentclass[12pt,twoside]{article}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Packages
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\usepackage{amssymb}
\usepackage{amsmath}
\usepackage{amscd}
\usepackage{bbm}
\usepackage[hmarginratio=1:1]{geometry}
\usepackage{titlesec}
\usepackage{mathrsfs}
\usepackage{enumerate}
\usepackage{graphicx}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Modified Title
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\newcommand{\makehwtitle}[4]{
\begin{center}
\Large
\textbf{#1 \,-- \:#2}
\\
#3 \\
#4
\normalsize
\end{center}
}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Problem Headings
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\newcounter{ProblemNumber}
\newcounter{SubProblemNumber}[ProblemNumber]
\renewcommand{\theProblemNumber}{\arabic{ProblemNumber}}
\renewcommand{\theSubProblemNumber}{\alph{SubProblemNumber}}
\newcommand{\problem}[1]{
\stepcounter{ProblemNumber}
\subsection*{#1}
\hrulefill
\vspace{12pt}
}
\newcommand{\nproblem}{
\problem{Problem \theProblemNumber}
}
\newcommand{\soln}[1]{\subsection*{#1}}
\newcommand{\solution}{\soln{Solution}}
\renewcommand{\part}{
\stepcounter{SubProblemNumber}
\soln{Part (\theSubProblemNumber)}
}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% New Commands
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\newcommand\numberthis{\addtocounter{equation}{1}\tag{\theequation}}
\newcommand{\deriv}[3][]{\ensuremath{\frac{\textrm{d}^{#1}#2}{\textrm{d}{#3}^{#1}}}}
\newcommand{\pderiv}[3][]{\ensuremath{\frac{\partial^{#1}#2}{\partial{#3}^{#1}}}}
\renewcommand{\d}{\ensuremath{\,\textrm{d}}}
\newcommand{\qed}{\ensuremath{\Box}}
\newcommand{\absval}[1]{\ensuremath{\left|#1\right|}}
\begin{document}
\begin{center}
\huge\textbf{Homework 4} \normalsize \\ Due Friday, July 21, 2017
\end{center}
\problem{Problem 1 (5 points)}
Suppose that you perform the following experiment: At 5pm, you place a single atom of some radioactive substance in a container on your desk. (Let's not worry about how we move or even find a single atom.) This atom is known to decay with a rate constant of $\lambda = 0.001$ per second. In the morning, you check on your atom and find that it did not decay overnight. You then watch it for one more hour and record whether or not it decays while you watch.
\begin{enumerate}[(a)]
\item If you get to the office at 8am, what is the probability that your atom decays in the hour you watch it?
\item Now suppose that you arrive an hour late, and so you watch your atom from 9am to 10am. What is the probability that your atom decays in the hour you watch it?
\item In contrast, suppose that you are watching your friend run a marathon and that this friend can finish a marathon, on average, in 4 hours. If you show up at the finish line 3 hours into the race and you know that your friend hasn't finished yet, you have a certain probability of seeing them finish in the next hour. If you instead show up at the finish line 4 hours into the race and you know that your friend hasn't finished yet, do you have a higher or lower probability of seeing them finish in the next hour? Why is this different from the radioactive decay example?
\end{enumerate}
\part
The key to this problem is to notice that, since exponential decay is memoryless, it doesn't matter how long the atom sat on our desk before we got to the office. All that matters is that it was still there when we got in. To make this more explicit, memorylessness tells us that $\textrm{Pr}\big[$Atom decays between 8am and 9am $\:\vert\:$ Atom was undecayed at 8am$\big] = \textrm{Pr}\big[$Atom decays before 1 hour $\:\vert\:$ Atom was undecayed at time zero$\big]$. From class, we know that the probability that an atom decayed before time $t$ given that it was undecayed at time zero is given by
\begin{equation*}
\textrm{Pr}\left[\textrm{Decay before time }t\:\vert\:\textrm{Undecayed at }0\right] = \textrm{Pr}\left[T \leq t\right] = 1 - e^{-\lambda t}.
\end{equation*}
The other thing to note is that $\lambda$ is in units per second, so we need to convert our time to seconds. Since $1$ hour is $3600$ seconds, the probability we want is
\begin{equation*}
1 - e^{-\lambda\cdot 3600} = 1 - e^{-0.001\cdot 3600} \approx 0.97.
\end{equation*}
This means that there is a roughly $97\%$ chance that the atom will decay by 9am.
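As a quick sanity check (not part of the assigned solution), this arithmetic can be reproduced in a few lines of Python:

```python
import math

# Decay rate (per second) and one hour expressed in seconds.
lam = 0.001
t = 3600.0

# Probability of decaying within the watched hour, given the atom
# was still undecayed when we started watching (memorylessness).
p = 1.0 - math.exp(-lam * t)
print(p)  # ~0.97
```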
\part
Because exponential decay is memoryless, this is exactly the same as in part (a). Since the atom was still undecayed when you got to the office, it doesn't matter that you were an hour late. Therefore, the probability that it decays by 10am is still roughly $97\%$.
\part
The specific numbers in this problem are not particularly important. The main idea is that knowing how long your friend has been running is useful information when predicting when they will finish. If you show up after three hours, there is a good chance that they won't finish in the next hour, but if you show up after four hours and they haven't already finished, you can be fairly sure they will be done soon. As an example, suppose that your friend's time is uniformly distributed between 3.5 and 4.5 hours. If you show up after three hours, you will have a $50\%$ chance of seeing them finish in the next hour. If you show up after four hours and they haven't already finished, then you are guaranteed to see them finish in the next hour (in the next half hour, in fact).
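The uniform toy model above is easy to check by simulation; the interval $[3.5, 4.5]$ hours is just the illustrative choice made in the text:

```python
import random

random.seed(0)
n = 100_000

# Friend's finish time: uniform on [3.5, 4.5] hours.
times = [random.uniform(3.5, 4.5) for _ in range(n)]

# Arrive at hour 3: no one has finished yet, so the chance of seeing
# the finish in the next hour is Pr[T <= 4] = 0.5.
p3 = sum(t <= 4.0 for t in times) / n

# Arrive at hour 4, friend still running: condition on T > 4.
still_running = [t for t in times if t > 4.0]
p4 = sum(t <= 5.0 for t in still_running) / len(still_running)

print(p3, p4)  # p3 is close to 0.5; p4 is exactly 1.0
```

Unlike the exponential case, conditioning on "still running at hour 4" changes the answer, which is exactly the point of this part.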
\problem{Problem 2 (10 points)}
Suppose that you have $N$ radioactive atoms on your desk. Assume that the decay time of each atom is exponentially distributed with rate constant $\lambda$. That is, if $T_i$ is the (random) time at which the $i$th atom decays, then
\begin{equation}
\textrm{Pr}\left[T_i > t\right] = e^{-\lambda t},
\end{equation}
where $t$ is the amount of time since you placed all the atoms on your desk. Furthermore, assume that all the decay times are independent. Define
\begin{equation}
T^{*} = \min\left\{ T_1, T_2, \dotsc, T_N\right\}
\end{equation}
to be the (random) time at which the first atom decays. Find the cumulative distribution function (cdf) and probability density function (pdf) of $T^{*}$. What is the expected value of $T^{*}$? What is the standard deviation of $T^{*}$? (Remember, the standard deviation of a random variable $X$ is defined as $\sigma(X) = \sqrt{\langle X^2 \rangle - \langle X \rangle^2}$, where $\langle \cdot \rangle$ denotes the expected value of a random variable.)
\solution
Notice that if $T^* > t$, then the earliest decay has not happened by time $t$, and therefore none of the atoms have decayed by time $t$. Similarly, if $T^* \leq t$, then the first decay happened before time $t$, and therefore at least one of the atoms has already decayed by time $t$. (It is possible that more than one of the atoms has decayed by time $t$ if $T^*
\leq t$.) If we let $M(t)$ be the number of undecayed atoms left at time $t$ (to match our notation from class we would want to call this $N(t)$, but we are already using the letter $N$ in this problem), then this means that
\begin{align*}
\textrm{Pr}\left[T^* > t\right] &= \textrm{Pr}\left[M(t) = N\right],
\end{align*}
because all $N$ atoms are still left, so we have (using the formula derived in class)
\begin{align*}
\textrm{Pr}\left[T^* > t\right] &= \frac{N!}{N!(N - N)!}\left(1 - e^{-\lambda t}\right)^{N - N}\left(e^{-\lambda t}\right)^N \\
&= e^{-N\lambda t}.
\end{align*}
The cumulative distribution function is (by definition)
\begin{align*}
F_{T^*}(t) &= \textrm{Pr}\left[T^* \leq t\right] \\
&= 1 - \textrm{Pr}\left[T^* > t\right] \\
&= 1 - e^{-N\lambda t}.
\end{align*}
The probability density function is therefore
\begin{equation*}
f_{T^*}(t) = \deriv{}{t}F_{T^{*}}(t) = N\lambda e^{-N\lambda t}.
\end{equation*}
The expected value of $T^{*}$ is therefore
\begin{align*}
\langle T^{*}\rangle &= \int_{0}^{\infty}t f_{T^*}(t)\d t \\
&= \int_{0}^{\infty}N\lambda te^{-N\lambda t}\d t \\
&= \left.-te^{-N\lambda t}\right|_{t=0}^{\infty} + \int_{0}^{\infty}e^{-N\lambda t}\d t \\
&= \left.\frac{-1}{N\lambda}e^{-N\lambda t}\right|_{t=0}^{\infty} \\
&= \frac{1}{N\lambda}.
\end{align*}
Similarly, we have
\begin{align*}
\langle (T^{*})^2 \rangle &= \int_{0}^{\infty}t^2f_{T^*}(t)\d t \\
&= \int_{0}^{\infty}N\lambda t^2e^{-N\lambda t}\d t \\
&= \left.-t^2e^{-N\lambda t}\right|_{t=0}^{\infty} + \int_{0}^{\infty}2te^{-N\lambda t} \d t \\
&= \left.-\frac{2t}{N\lambda}e^{-N\lambda t}\right|_{t=0}^{\infty} + \int_{0}^{\infty}\frac{2}{N\lambda}e^{-N\lambda t}\d t \\
&= \left.-\frac{2}{(N\lambda)^2}e^{-N\lambda t}\right|_{t=0}^{\infty} \\
&= \frac{2}{(N\lambda)^2}.
\end{align*}
The standard deviation is given by
\begin{align*}
\sigma(T^*) &= \sqrt{\langle (T^*)^2\rangle - \langle T^*\rangle^2} \\
&= \sqrt{\frac{2}{(N\lambda)^2} - \left(\frac{1}{N\lambda}\right)^2} \\
&= \sqrt{\frac{1}{(N\lambda)^2}} \\
&= \frac{1}{N\lambda}.
\end{align*}
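These results are easy to spot-check by Monte Carlo; the values $\lambda = 0.5$ and $N = 5$ below are arbitrary illustrative choices, not part of the problem:

```python
import random
import statistics

random.seed(1)
lam, N, trials = 0.5, 5, 50_000

# T* is the minimum of N independent Exponential(lam) decay times.
samples = [min(random.expovariate(lam) for _ in range(N))
           for _ in range(trials)]

expected = 1.0 / (N * lam)  # derived above: mean = sd = 1/(N*lambda)
print(statistics.mean(samples), statistics.pstdev(samples), expected)
```

Both the sample mean and the sample standard deviation should land near $1/(N\lambda) = 0.4$, matching the derivation.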
\problem{Problem 3 (10 points)}
Suppose that you start the day with $N$ radioactive atoms on your desk, and that the time to decay $T$ for each atom is independently and identically distributed, with
\begin{equation}
\textrm{Pr}\left[T > t\right] = e^{-\lambda t}.
\end{equation}
Let $T_n$ be the (random) amount of time that you have to wait until exactly $n$ of the atoms have decayed. Find the cdf and pdf of $T_n$ for each $n = 0, 1, \dotsc, N$. Find the expected value of $T_N$.
\textbf{Extra Credit}: Find the expected value of $T_n$ for arbitrary $n = 0, 1, \dotsc, N$.
\solution
Note that if $T_n > t$, then the $n$th decay has not yet happened at time $t$, so there are fewer than $n$ decayed atoms (and therefore more than $N - n$ undecayed atoms). Similarly, if $T_n \leq t$, then the $n$th decay has already happened by time $t$, so there are at least $n$ decayed atoms at time $t$, and therefore at most $N - n$ undecayed atoms. If we let $M(t)$ be the number of undecayed atoms at time $t$ (again, we called this $N(t)$ in class, but we are already using the letter $N$) then the cumulative distribution function for $T_n$ is therefore
\begin{align*}
F_{T_n}(t) &= \textrm{Pr}\left[T_n \leq t\right] \\
&= \textrm{Pr}\left[M(t) \leq N - n\right] \\
&= \textrm{Pr}\left[M(t) = 0 \textrm{ or } M(t) = 1 \textrm{ or } \dotsc \textrm{ or } M(t) = N - n \right] \\
&= \sum_{k=0}^{N - n}\textrm{Pr}\left[M(t) = k\right] \\
&= \sum_{k=0}^{N - n}\frac{N!}{k!(N - k)!}\left(1 - e^{-\lambda t}\right)^{N - k}\left(e^{-\lambda t}\right)^k.
\end{align*}
Using the fact that
\begin{equation*}
\sum_{k=0}^{N}\frac{N!}{k!(N - k)!}\left(1 - e^{-\lambda t}\right)^{N - k}\left(e^{-\lambda t}\right)^k = 1,
\end{equation*}
we can also write our cdf as
\begin{equation*}
F_{T_n}(t) = 1 - \sum_{k=N - n + 1}^{N}\frac{N!}{k!(N - k)!}\left(1 - e^{-\lambda t}\right)^{N - k}\left(e^{-\lambda t}\right)^k.
\end{equation*}
In particular, the cdf for $T_0$ is just $F_{T_0}(t) = 1$, because there are always at least 0 decayed atoms. (In other words, $T_0 = 0$ with certainty: we have already waited for zero decays the moment we put the atoms on the desk.) In addition, the cdf for $T_1$ is exactly the same as our answer from problem 2, because $T_1$ is the same as $T^*$.
The probability density function for $T_n$ is therefore
\begin{align*}
f_{T_n}(t) &= \deriv{}{t}F_{T_n}(t) \\
&= \deriv{}{t}\sum_{k=0}^{N - n}\frac{N!}{k!(N - k)!}\left(1 - e^{-\lambda t}\right)^{N - k}\left(e^{-\lambda t}\right)^k \\
&= \sum_{k=0}^{N - n}\frac{N!}{k!(N - k)!}\deriv{}{t}\left[\left(1 - e^{-\lambda t}\right)^{N - k}e^{-k\lambda t}\right] \\
&= \sum_{k=0}^{N - n}\frac{N!}{k!(N - k)!}\left[\lambda(N - k)\left(1 - e^{-\lambda t}\right)^{N - k - 1}e^{-(k+1)\lambda t} - k\lambda\left(1 - e^{-\lambda t}\right)^{N - k}e^{-k\lambda t}\right].
\end{align*}
In particular, the pdf for $T_N$ is
\begin{equation*}
f_{T_N}(t) = \lambda N\left(1 - e^{-\lambda t}\right)^{N - 1}e^{-\lambda t}.
\end{equation*}
The expected value of $T_N$ is therefore
\begin{equation*}
\langle T_N \rangle = \int_{0}^{\infty}\lambda Nt\left(1 - e^{-\lambda t}\right)^{N - 1}e^{-\lambda t}\d t.
\end{equation*}
We will use the following (which is true for all $n$):
\begin{align*}
\langle T_n \rangle &= \int_{0}^{\infty}tf_{T_n}(t)\d t \\
&= \int_{0}^{\infty}t\deriv{}{t}F_{T_n}(t)\d t \\
&= -\int_{0}^{\infty}t\deriv{}{t}\left[1 - F_{T_n}(t)\right]\d t \\
&= -\left[\left.t\left[1 - F_{T_n}(t)\right]\right|_{t=0}^{\infty} - \int_{0}^{\infty}1 - F_{T_n}(t)\d t\right] \\
&= \int_{0}^{\infty}1 - F_{T_n}(t)\d t,
\end{align*}
where the boundary term vanishes because $1 - F_{T_n}(t)$ decays exponentially as $t \to \infty$.
We therefore have
\begin{align*}
\langle T_N \rangle &= \int_{0}^{\infty}1 - F_{T_N}(t)\d t \\
&= \int_{0}^{\infty}1 - \left(1 - e^{-\lambda t}\right)^N\d t.
\end{align*}
If we let $u = 1 - e^{-\lambda t}$, then $\d u = \lambda(1 - u)\d t$, so
\begin{align*}
\langle T_N \rangle &= \frac{1}{\lambda}\int_{0}^{1}\frac{1 - u^N}{1 - u}\d u.
\end{align*}
The integrand is the partial sum of a geometric series, since $(1 - u^N)/(1 - u) = 1 + u + \dotsb + u^{N-1}$, so we have
\begin{align*}
\langle T_N \rangle &= \frac{1}{\lambda}\int_{0}^{1}\sum_{n=0}^{N-1}u^n\d u \\
&= \sum_{n=0}^{N-1}\int_{0}^{1}\frac{u^n}{\lambda}\d u \\
&= \sum_{n=0}^{N-1}\left.\frac{u^{n+1}}{(n+1)\lambda}\right|_{u=0}^{1} \\
&= \sum_{n=1}^{N}\frac{1}{n\lambda}.
\end{align*}
The integrals for arbitrary $\langle T_n \rangle$ are somewhat messier, but there is a straightforward argument for what the answer should be. If we have $N$ atoms, then (by problem 2) we expect to wait $1/(N\lambda)$ for the first decay. After that, there will be $N - 1$ atoms left, so we expect to wait an additional $1/((N-1)\lambda)$ for the next decay. We can continue this for as many decays as we want, and so we have
\begin{equation*}
\langle T_n \rangle = \sum_{k=N - n + 1}^{N}\frac{1}{k\lambda}.
\end{equation*}
This argument only works exactly because exponential decay is memoryless.
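Both the exact harmonic sum for $\langle T_N \rangle$ and the sequential-waiting argument can be checked numerically; $\lambda = 1$ and $N = 4$ are arbitrary test values, not part of the problem:

```python
import random
import statistics

random.seed(2)
lam, N, trials = 1.0, 4, 100_000

# T_N is the time until every atom has decayed: the max of N
# independent Exponential(lam) decay times.
samples = [max(random.expovariate(lam) for _ in range(N))
           for _ in range(trials)]

# The harmonic-sum formula derived above: sum_{n=1}^{N} 1/(n*lam).
harmonic = sum(1.0 / (n * lam) for n in range(1, N + 1))
print(statistics.mean(samples), harmonic)  # both near 2.083
```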
\problem{Problem 4 (15 points)}
Many rare events can be modeled with a similar probabilistic description. For example, let $N(t)$ be the number of large floods that will occur in Phoenix in the next $t$ years. From our perspective, this is essentially a random phenomenon, so we will define
\begin{equation}
P_n(t) = \textrm{Pr}\left[N(t) = n\right],
\end{equation}
for every $n = 0, 1, \dotsc$. That is, $P_n(t)$ is the probability that there will be exactly $n$ large floods in the next $t$ years.
We will assume that $P_0$ has the memoryless property discussed in class. That is, the chance of not having a flood next year is independent of the floods from this year. To be more precise,
\begin{equation} \label{eq:memoryless}
P_0(t + s) = P_0(t)P_0(s).
\end{equation}
Furthermore, we will assume that, over a short enough interval of time $\Delta t$, the probability of two floods both happening in the same interval is approximately zero. That is, we don't expect to have two 100-year floods in the same week. This means that
\begin{equation} \label{eq:pn1}
P_{n+1}(t + \Delta t) = P_{n+1}(t)P_0(\Delta t) + P_n(t)P_1(\Delta t).
\end{equation}
In addition, we will assume that the probability of one flood happening in a short time span is roughly proportional to the length of time, so that
\begin{equation} \label{eq:p1dt}
P_1(\Delta t) = \lambda\Delta t
\end{equation}
and
\begin{equation} \label{eq:p0dt}
P_0(\Delta t) = 1 - \lambda\Delta t + o(\Delta t).
\end{equation}
(Remember that $o(x)$ means that $\lim_{x\to 0} o(x)/x = 0$.)
\begin{enumerate}[(a)]
\item Show that, in the limit as $\Delta t\to 0$,
\begin{equation} \label{eq:p0}
\deriv{P_0(t)}{t} = -\lambda P_0(t)
\end{equation}
with $P_0(0) = 1$, and
\begin{equation} \label{eq:pn}
\deriv{P_{n+1}(t)}{t} = \lambda P_n(t) - \lambda P_{n+1}(t)
\end{equation}
with $P_{n+1}(0) = 0$.
\item Solve \eqref{eq:p0} for $P_0(t)$. Use this solution to solve \eqref{eq:pn} for $P_1(t)$ and use that solution to solve \eqref{eq:pn} for $P_2(t)$.
\item Find $P_n(t)$ for all $n$. (The easiest way to do this is to find a pattern in the first few $P_n$, then guess the formula. To show that your guess is correct, suppose that the formula you found holds for some integer $n = k$, then use \eqref{eq:pn} to show that it also holds for $n = k + 1$.)
\item If $\lambda = 0.01$, the floods are called ``100 year floods''. What is the probability that there is exactly one 100 year flood in the next one hundred years? What is the probability that there are no 100 year floods in the next one hundred years? What is the probability that there is at least one 100 year flood in the next one hundred years?
\end{enumerate}
\part
From equation \eqref{eq:memoryless}, we know that $P_0(t + \Delta t) = P_0(t)P_0(\Delta t)$. Using \eqref{eq:p0dt}, we therefore have
\begin{align*}
P_0(t + \Delta t) &= P_0(t)\left(1 - \lambda \Delta t + o(\Delta t)\right) \\
&= P_0(t) - \lambda \Delta t P_0(t) + o(\Delta t).
\end{align*}
Therefore,
\begin{equation*}
\frac{P_0(t + \Delta t) - P_0(t)}{\Delta t} = -\lambda P_0(t) + \frac{o(\Delta t)}{\Delta t}.
\end{equation*}
Taking the limit as $\Delta t$ goes to zero, we obtain
\begin{equation*}
\deriv{P_0}{t} = -\lambda P_0.
\end{equation*}
Since we know that no floods have happened in the first zero seconds, we must have $P_0(0) = 1$.
Similarly, using equations \eqref{eq:pn1}, \eqref{eq:p1dt} and \eqref{eq:p0dt}, we obtain
\begin{align*}
P_{n+1}(t + \Delta t) &= P_{n+1}(t)\left(1 - \lambda\Delta t + o(\Delta t)\right) + P_n(t)\lambda\Delta t \\
&= P_{n+1}(t) - \lambda \Delta t P_{n+1}(t) + \lambda \Delta t P_n(t) + o(\Delta t),
\end{align*}
for any $n \geq 0$. Therefore,
\begin{equation*}
\frac{P_{n+1}(t + \Delta t) - P_{n+1}(t)}{\Delta t} = \lambda P_n(t) - \lambda P_{n+1}(t) + \frac{o(\Delta t)}{\Delta t}.
\end{equation*}
Taking the limit as $\Delta t$ goes to zero, we get
\begin{equation*}
\deriv{P_{n+1}}{t} = \lambda P_n(t) - \lambda P_{n+1}(t).
\end{equation*}
Since we know that there have not been any floods in the first zero seconds, we must have $P_{n+1}(0) = 0$.
\part
Equation \eqref{eq:p0} is separable, and we have
\begin{align*}
\frac{1}{P_0}\deriv{P_0}{t} = -\lambda, &\textrm{ so } \\
\int\frac{1}{P_0}\deriv{P_0}{t}\d t = -\int\lambda\d t, &\textrm{ and } \\
\int\frac{1}{P_0}\d P_0 = -\int\lambda\d t, &\textrm{ and therefore } \\
\ln P_0 = -\lambda t + C, &\textrm{ so } \\
P_0(t) = Ae^{-\lambda t}, &\textrm{ where } A = e^{C}.
\end{align*}
Substituting the initial condition, we find that
\begin{equation*}
P_0(t) = e^{-\lambda t}.
\end{equation*}
Now that we have $P_0$, we can plug $n = 0$ into equation \eqref{eq:pn} to get
\begin{equation*}
\deriv{P_{1}}{t} + \lambda P_1 = \lambda e^{-\lambda t}.
\end{equation*}
This is a first order linear equation, so there are many ways to solve it. Probably the easiest is to use an integrating factor. We get
\begin{align*}
e^{\lambda t}\deriv{P_1}{t} + \lambda e^{\lambda t}P_1 = \lambda, &\textrm{ so } \\
\deriv{}{t}\left[e^{\lambda t}P_1(t)\right] = \lambda, &\textrm{ and therefore } \\
e^{\lambda t}P_1(t) = \lambda t + C, &\textrm{ which means } \\
P_1(t) = \lambda t e^{-\lambda t} + Ce^{-\lambda t}.
\end{align*}
Using the initial conditions, we find that $C = 0$, so
\begin{equation*}
P_1(t) = \lambda t e^{-\lambda t}.
\end{equation*}
Similarly, we can substitute $n = 1$ into \eqref{eq:pn}, obtaining
\begin{align*}
\deriv{P_2}{t} + \lambda P_2 = \lambda^2 te^{-\lambda t}.
\end{align*}
Again, we can solve this using an integrating factor, so we get
\begin{align*}
e^{\lambda t}\deriv{P_2}{t} + \lambda e^{\lambda t}P_2 = \lambda^2 t, &\textrm{ so } \\
\deriv{}{t}\left[e^{\lambda t}P_2(t)\right] = \lambda^2 t, &\textrm{ and therefore } \\
e^{\lambda t}P_2(t) = \frac{\lambda^2 t^2}{2} + C, &\textrm{ so } \\
P_2(t) = \frac{\lambda^2 t^2}{2}e^{-\lambda t} + Ce^{-\lambda t}.
\end{align*}
Using the initial conditions, we find that $C = 0$ again, so
\begin{equation*}
P_2(t) = \frac{\lambda^2 t^2}{2}e^{-\lambda t}.
\end{equation*}
\part
Solving for a few more $P_n(t)$, you should find that the general formula appears to be
\begin{equation*}
P_n(t) = \frac{\lambda^n t^n}{n!}e^{-\lambda t}.
\end{equation*}
This certainly works for $n = 0$, $n = 1$ and $n = 2$. Now suppose that the formula holds for some integer $n = k$. We therefore have
\begin{equation*}
\deriv{P_{k+1}}{t} + \lambda P_{k+1} = \lambda P_k(t) = \frac{\lambda^{k+1} t^k}{k!}e^{-\lambda t}.
\end{equation*}
Once again, this equation is first order and linear, so we have
\begin{align*}
e^{\lambda t}\deriv{P_{k+1}}{t} + \lambda e^{\lambda t}P_{k+1} = \frac{\lambda^{k+1}t^k}{k!}, &\textrm{ so } \\
\deriv{}{t}\left[e^{\lambda t}P_{k+1}\right] = \frac{\lambda^{k+1}t^k}{k!}, &\textrm{ and } \\
e^{\lambda t}P_{k+1} = \frac{\lambda^{k+1}t^{k+1}}{(k+1)!} + C, &\textrm{ and therefore } \\
P_{k+1}(t) = \frac{\lambda^{k+1}t^{k+1}}{(k+1)!}e^{-\lambda t} + Ce^{-\lambda t}. &
\end{align*}
Once again, substituting the initial conditions gives us $C = 0$, so
\begin{equation*}
P_{k+1}(t) = \frac{\lambda^{k+1}t^{k+1}}{(k+1)!}e^{-\lambda t}.
\end{equation*}
This completes the proof by induction, and so we are done.
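The Poisson formula just derived can also be checked against a direct simulation of the process, using the fact (from the earlier problems) that the waits between events are exponential; the rate $\lambda = 0.7$ and horizon $t = 2$ below are arbitrary illustrative values:

```python
import math
import random

random.seed(3)
lam, t, trials = 0.7, 2.0, 100_000

def count_events(lam, t):
    """Count events in [0, t] by accumulating exponential waiting times."""
    n, elapsed = 0, 0.0
    while True:
        elapsed += random.expovariate(lam)
        if elapsed > t:
            return n
        n += 1

counts = [count_events(lam, t) for _ in range(trials)]
for n in range(4):
    empirical = counts.count(n) / trials
    formula = (lam * t) ** n / math.factorial(n) * math.exp(-lam * t)
    print(n, empirical, formula)  # the two columns should agree closely
```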
\part
The probability that there is exactly one 100 year flood in the next one hundred years is given by
\begin{equation*}
P_1(100) = \frac{0.01\cdot 100}{1!}e^{-0.01\cdot 100} = e^{-1} \approx 0.37.
\end{equation*}
The probability that there are no floods in the next 100 years is given by
\begin{equation*}
P_0(100) = e^{-0.01\cdot 100} = e^{-1} \approx 0.37.
\end{equation*}
The probability that there is at least one 100 year flood in the next 100 years is given by
\begin{equation*}
\sum_{n=1}^{\infty}P_n(100) = 1 - P_0(100) = 1 - e^{-1} \approx 0.63.
\end{equation*}
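These three numbers follow directly from the Poisson formula with $\lambda t = 1$:

```python
import math

lam, t = 0.01, 100.0  # 100 year floods, over the next century

p_exactly_one = lam * t * math.exp(-lam * t)  # P_1(100)
p_none = math.exp(-lam * t)                   # P_0(100)
p_at_least_one = 1.0 - p_none                 # 1 - P_0(100)

print(p_exactly_one, p_none, p_at_least_one)  # ~0.37, ~0.37, ~0.63
```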
\end{document}