Tell Me the Odds: A 15-Page Introduction to Bayes Theorem
While studying data analytics at UC Berkeley Extension, I came across Bayes' Theorem, but I didn't have the time to really appreciate it due to the fast pace set by the course curriculum. So now that I'm on "maternity leave", I continue learning by reading Tell Me the Odds by Scott Hartshorn (it's free on Kindle). This is officially my fourth book for the year (again, not counting the technical articles I read prior to this book). I'd typically read through such a book quite rapidly; however, taking care of my son now takes a big chunk of my time. So I only read the book when I find five to ten minutes between feeds, diaper changes, burps, and bottle cleaning.
The author simplified the explanation of Bayes' Theorem using, as an example, the probability of having picked a particular die from a bag of dice based on the outcome of a roll. That sounded interesting because I typically encounter statistics problems in which I'm supposed to calculate the probability of a particular outcome (e.g., the probability of obtaining a head in a coin toss, which is one out of two; or the probability of obtaining a five after rolling a six-sided die, which is one out of six). I've never been asked to calculate the probability of holding a particular die based on the outcome of a roll. He used tables and figures as visualisation aids that helped me understand the concept.
Such an explanation is highly appreciated because the equation of Bayes' Theorem is intimidating for non-Math or Stats majors (like me), and I'm learning it only because I see potential applications of it in my work.
P(A|B) = [P(A) * P(B|A)]/P(B)
where:
P(A|B) is the posterior probability (e.g., the probability that you picked a particular die from the bag, given that you rolled a five)
P(A) is the prior probability (e.g., the probability of picking that die from the bag in the first place)
P(B|A) is the likelihood of the result obtained (e.g., the probability of rolling a five with that particular die)
P(B) is the normalising constant (i.e., the total probability of rolling a five across all the dice in the bag)
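To convince myself the pieces fit together, I tried sketching the dice-in-a-bag example in Python. The bag's contents here are my own invention (one die each of 4, 6, 8, 12, and 20 sides, drawn with equal probability), not necessarily the book's exact setup:

```python
from fractions import Fraction

# Hypothetical bag: one die each of 4, 6, 8, 12, and 20 sides.
dice = [4, 6, 8, 12, 20]

# P(A): prior probability of drawing each die (uniform here)
prior = {d: Fraction(1, len(dice)) for d in dice}

# P(B|A): likelihood of rolling a five with each die
# (a four-sided die can never show a five)
likelihood = {d: Fraction(1, d) if d >= 5 else Fraction(0) for d in dice}

# P(B): normalising constant -- the total probability of rolling a five
p_b = sum(prior[d] * likelihood[d] for d in dice)

# P(A|B): posterior probability of each die, given that a five was rolled
posterior = {d: prior[d] * likelihood[d] / p_b for d in dice}

for d in dice:
    print(f"{d}-sided die: {posterior[d]}")
```

Running this shows the intuition nicely: the four-sided die gets a posterior of zero (it can't roll a five at all), and the smaller dice that *can* roll a five end up more probable than the larger ones, because a five is a likelier outcome on them. The posteriors also sum to one, which is what the normalising constant guarantees.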
This example, the dice in the bag, does not explain the book cover though. So I proceeded to read the second application: being an astronaut about to navigate through an asteroid field. A robot assistant has declared the odds of successfully navigating through the asteroid field (this is the P(A)), but it does not know other factors that can change the astronaut's probability of success (like the astronaut's skill level and experience, where the astronaut will begin the journey, etc.); this is where the likelihood and the normalising constant come in. Obviously, this is a more complex example because it keeps adding new factors into the story, leading to more calculations of probabilities and likelihoods.
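The asteroid-field story can also be sketched as a single Bayesian update. All the numbers below are invented for illustration (the book's own figures differ): the robot's declared odds play the role of the prior, and the pilot's skill is the new evidence that shifts it:

```python
from fractions import Fraction

# Invented figures for illustration only -- not the book's numbers.
# The robot's declared odds act as the prior, P(A):
prior_success = Fraction(1, 10)

# Likelihoods: how probable is a pilot this skilled among
# successful runs versus failed ones? (hypothetical values)
p_skilled_given_success = Fraction(9, 10)
p_skilled_given_failure = Fraction(3, 10)

# P(B): normalising constant -- the overall probability of
# observing a pilot this skilled, success or failure
p_skilled = (prior_success * p_skilled_given_success
             + (1 - prior_success) * p_skilled_given_failure)

# P(A|B): the updated probability of success, given the pilot's skill
posterior = prior_success * p_skilled_given_success / p_skilled
print(posterior)
```

With these made-up numbers, the prior of 1/10 rises to 1/4 once the pilot's skill is taken into account. Each new factor the story adds would repeat this step, with the latest posterior becoming the prior for the next update.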
After my first read of the book, I still don't have a good handle on the theorem and will continue to review the text just to understand it better. It's a good thing that the book is on Kindle (for free!) and I can reread it as many times as I want.
I wonder how I can apply Bayes Theorem in the work that I'm doing...