Cryptography 2 -- Probability Theory
Previous chapter: Historical Ciphers & Cryptanalysis
This is a learning note from a course at CISPA, UdS, taught by Nico Döttling.
This chapter recaps basic probability concepts and fixes the notation we will use for them; it is useful if your memory is starting to fade. Remember,
$$\textbf{Cryptography is impossible without randomness!}$$
Probability Spaces
Components | Definition | Remarks |
---|---|---|
$\Omega$ | Sample Space | A finite set |
$\omega\in\Omega$ | Elementary Events / Outcomes | |
$Pr:\Omega\rightarrow\mathbb{R}$ | Probability Function | $\forall\omega\in\Omega.\ 0\leq Pr[\omega]\leq1, \underset{\omega\in\Omega}{\sum}Pr[\omega] = 1$ |
$E\subseteq\Omega$ | Event | $Pr[E] = \underset{\omega\in E}{\sum}Pr[\omega], \text{ where } Pr[\varnothing] = 0$ |
Examples
- Fair coin: $\Omega = \{0,1\}, Pr[0]=Pr[1]=1/2$
- Biased coin: $\Omega = \{0,1\}, Pr[1]=p, Pr[0]=1-p$
- Fair die: $\Omega = \{1,2,3,4,5,6\}, Pr[1]=Pr[2]=\dots=Pr[6]=1/6$
- Uniform randomness: $\Omega = \{1,\dots,N\}, Pr[1]=Pr[2]=\dots=Pr[N]=1/N$
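Finite probability spaces like these are easy to model directly. As a minimal sketch (in Python; the names `space` and `pr` are illustrative helpers, not anything from the course), the fair die becomes a dict from outcomes to probabilities:

```python
from fractions import Fraction

# Fair die: Omega = {1,...,6}, each outcome has probability 1/6.
# Fraction keeps the arithmetic exact.
space = {face: Fraction(1, 6) for face in range(1, 7)}

# The two axioms from the table: each Pr[omega] lies in [0,1],
# and the probabilities sum to 1.
assert all(0 <= p <= 1 for p in space.values())
assert sum(space.values()) == 1

def pr(event):
    """Pr[E] = sum of Pr[omega] over all outcomes omega in E."""
    return sum(space[w] for w in event)

print(pr({2, 4, 6}))  # probability of rolling an even number -> 1/2
```

The same dict pattern covers the biased coin or uniform randomness over $\{1,\dots,N\}$ by changing the keys and weights.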
Set Operations on events
Operations | Definition |
---|---|
$Pr[E_1\cap E_2] := Pr[E_1\wedge E_2] := Pr[E_1 \textbf{ AND } E_2] := Pr[E_1, E_2]$ | Intersection, Logical ‘AND’ |
$Pr[E_1\cup E_2] := Pr[E_1\vee E_2] := Pr[E_1 \textbf{ OR } E_2]$ | Union, Logical ‘OR’ |
$Pr[\overline{E}] := Pr[\neg E] := Pr[\textbf{NOT }E] \text{ where }\overline{E}=\Omega\setminus E$ | Complement, Logical ‘Negation’ |
$A\subseteq B$, $A\Rightarrow B$ | The event A implies the event B |
Properties of Probability Spaces
For all events $A, B\subseteq\Omega$ :
- $Pr[A\cup B] = Pr[A]+Pr[B]-Pr[A\cap B]$
- It implies $Pr[A\cup B]\leq Pr[A]+Pr[B]$ (Union Bound)
- $Pr[\overline{A}] = 1 - Pr[A]$
- If $A\subseteq B$ then $Pr[A]\leq Pr[B]$
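All four properties can be checked mechanically on a small space. A sketch, reusing the fair-die model from above (again with illustrative helper names):

```python
from fractions import Fraction

# Fair-die probability space, as before.
space = {face: Fraction(1, 6) for face in range(1, 7)}

def pr(event):
    return sum(space[w] for w in event)

A = {1, 2, 3}   # "at most 3"
B = {2, 4, 6}   # "even"

# Inclusion-exclusion: Pr[A u B] = Pr[A] + Pr[B] - Pr[A n B]
assert pr(A | B) == pr(A) + pr(B) - pr(A & B)

# Union bound: Pr[A u B] <= Pr[A] + Pr[B]
assert pr(A | B) <= pr(A) + pr(B)

# Complement: Pr[not A] = 1 - Pr[A]
assert pr(set(space) - A) == 1 - pr(A)

# Monotonicity: {2} is a subset of B, so Pr[{2}] <= Pr[B]
assert pr({2}) <= pr(B)
```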
Conditional Probabilities
For two events $A$ and $B$ with $Pr[B]>0$, we define the conditional probability
$$Pr[A\mid B] = \frac{Pr[A\cap B]}{Pr[B]}$$
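The definition translates directly into code. A sketch on the fair-die space (helper names are illustrative):

```python
from fractions import Fraction

space = {face: Fraction(1, 6) for face in range(1, 7)}

def pr(event):
    return sum(space[w] for w in event)

def pr_given(a, b):
    """Pr[A | B] = Pr[A n B] / Pr[B], assuming Pr[B] > 0."""
    return pr(a & b) / pr(b)

even = {2, 4, 6}
at_most_2 = {1, 2}

# Pr[roll <= 2 | roll is even] = (1/6) / (1/2) = 1/3
print(pr_given(at_most_2, even))  # -> 1/3
```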
Independence
Events $A, B\subseteq \Omega$ are independent if
$$Pr[A\cap B] = Pr[A]\cdot Pr[B]$$
If $Pr[A], Pr[B] > 0$, this also implies
$$Pr[A\mid B] = Pr[A]\text{ and }Pr[B\mid A] = Pr[B]$$
Note that $A$ and $B$ being independent is **not** the same as $A$ and $B$ being disjoint! $A$ and $B$ are disjoint if $A\cap B = \varnothing$, which gives $Pr[A\cap B] = 0$.
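Two fair coin flips make the distinction concrete: "first flip is heads" and "second flip is heads" are independent but overlap, while "first flip is heads" and "first flip is tails" are disjoint but not independent. A sketch (helper names illustrative):

```python
from fractions import Fraction
from itertools import product

# Two independent fair coin flips: Omega = {0,1}^2, each outcome Pr 1/4.
space = {w: Fraction(1, 4) for w in product((0, 1), repeat=2)}

def pr(event):
    return sum(space[w] for w in event)

first_heads = {w for w in space if w[0] == 1}
second_heads = {w for w in space if w[1] == 1}

# Independent: Pr[A n B] = Pr[A] * Pr[B], yet A n B is NOT empty.
assert pr(first_heads & second_heads) == pr(first_heads) * pr(second_heads)
assert first_heads & second_heads != set()

# Disjoint: A n B is empty, so Pr[A n B] = 0, which is NOT Pr[A] * Pr[B].
first_tails = {w for w in space if w[0] == 0}
assert first_heads & first_tails == set()
assert pr(first_heads & first_tails) == 0
assert pr(first_heads) * pr(first_tails) == Fraction(1, 4)
```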
Bayes’ Rule
- A mathematical rule for inverting conditional probabilities, allowing one to find the probability of a cause given its effect.
- This is especially useful when analyzing larger processes composed of smaller ones.
$$\text{For all events }A,B\subseteq\Omega\text{ with }Pr[A],Pr[B]>0\text{, it holds that }Pr[A\mid B] = \frac{Pr[B\mid A]\cdot Pr[A]}{Pr[B]}$$
Proof
$$Pr[A\mid B] = \frac{Pr[A\cap B]}{Pr[B]}\ , \ Pr[B\mid A] = \frac{Pr[B\cap A]}{Pr[A]}$$
$$\because Pr[A\cap B] = Pr[B\cap A]$$
$$\therefore Pr[A\cap B] = Pr[B\mid A]\cdot Pr[A], Pr[A\mid B] = \frac{Pr[B\mid A]\cdot Pr[A]}{Pr[B]}$$
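The identity can also be confirmed numerically on the fair-die space; a sketch (helper names illustrative):

```python
from fractions import Fraction

space = {face: Fraction(1, 6) for face in range(1, 7)}

def pr(event):
    return sum(space[w] for w in event)

def cond(a, b):
    """Pr[A | B] = Pr[A n B] / Pr[B]."""
    return pr(a & b) / pr(b)

A = {1, 2, 3}   # "at most 3"
B = {2, 4, 6}   # "even"

# Bayes' rule: Pr[A | B] = Pr[B | A] * Pr[A] / Pr[B]
assert cond(A, B) == cond(B, A) * pr(A) / pr(B)
```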
Law of Total Probability
- Sometimes we want to infer the probability of an event from conditional probabilities of this event.
- This will also be useful to compute or bound probabilities via a case analysis.
$$\text{Let }B_1,\dots,B_n\text{ be events that partition the sample space }\Omega\text{, i.e. }B_1\cup\dots\cup B_n=\Omega$$
$$\text{ and }B_i\cap B_j=\varnothing\text{ for }i\neq j\text{, with }Pr[B_i]>0\text{. Then it holds for every event }A\text{ that }\overset{n}{\underset{i=1}{\sum}}Pr[A\mid B_i]\cdot Pr[B_i]=Pr[A]$$
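As a case-analysis example, partition the die outcomes by their value mod 3 and recover $Pr[A]$ from the conditional probabilities. A sketch (helper names illustrative):

```python
from fractions import Fraction

space = {face: Fraction(1, 6) for face in range(1, 7)}

def pr(event):
    return sum(space[w] for w in event)

def cond(a, b):
    return pr(a & b) / pr(b)

# Partition of Omega by die value mod 3: pairwise disjoint, union is Omega.
partition = [{1, 4}, {2, 5}, {3, 6}]
A = {1, 2, 3}  # "at most 3"

# Law of total probability: sum_i Pr[A | B_i] * Pr[B_i] = Pr[A]
total = sum(cond(A, B) * pr(B) for B in partition)
assert total == pr(A)
```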
Next post: Probability Theory
Further Reading: History of cryptography