Bayes by hand

You may have come across the phrase “conjugate prior” in your reading. This is an important concept historically, and in my opinion, working through a demonstration of it is genuinely useful for solidifying your understanding of how Bayes’ Theorem works.

The historical piece is that back in the day, before computers or whatever, the only way to do Bayesian analysis was if you had conjugate priors. That is to say, you needed your prior distribution and your likelihood to match up magically to produce a workable posterior. This webpage (http://www.johndcook.com/conjugate_prior_diagram.html) gives a list of several such magical pairs, along with the distribution functions (PDFs or PMFs) involved. The reason these pairs of distributions are special (“conjugate”) is that when you multiply them together, the posterior you get lands back in the same family as the prior, so it also has a nice, ready-made formula. It is technically possible to multiply any two distributions together, of course, but a lot of the time it will result in a weird, unique snowflake of a distribution that isn’t definable in terms of a nice, neat PDF or PMF. If the posterior you get is a well-behaved distribution that we have a formula for, it’s very easy to work with: you can plug in any value and calculate its probability (or density) directly, for example. If your posterior is a unique snowflake, you have no such formula for getting probabilities out of it.
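To put the idea in symbols (just a sketch, with θ standing for the parameter and x for the data):

```latex
% Bayes' Theorem: the posterior is proportional to likelihood times prior
% (the denominator p(x) is just a normalizing constant)
\[
p(\theta \mid x) \;=\; \frac{p(x \mid \theta)\, p(\theta)}{p(x)}
\;\propto\; p(x \mid \theta)\, p(\theta)
\]
% "Conjugate" means: if the prior comes from some family of distributions,
% the posterior lands back in that same family (with updated parameters),
% so it has a known, ready-made formula instead of being a unique snowflake.
```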

Nowadays, we don’t do it by hand at all – JAGS brute-forces a posterior distribution for us, regardless of how ugly and/or unique it might be. So it’s no longer necessary to stick to conjugate priors to do Bayesian analysis. This is still a nice problem to work through, though, since it makes you think about 1) what the likelihood function is, and 2) how exactly the information from your data gets combined with a prior distribution to produce your posterior. To see the example we walked through in club today, check out my homework assignment from the super awesome summer class I took on Bayesian Modeling at ICPSR. This document is my answers to our first homework assignment:
gamma-Poisson demonstration
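In case you don’t have the homework document handy, here’s a rough sketch of how the gamma-Poisson update goes (my notation, which may not match the assignment exactly): start with a Gamma(a, b) prior on the Poisson rate λ, observe n counts, and multiplying prior and likelihood leaves you with another gamma kernel.

```latex
% Prior on the Poisson rate lambda: Gamma(a, b), rate parameterization
\[
p(\lambda) \;\propto\; \lambda^{a-1} e^{-b\lambda}
\]
% Likelihood for n independent Poisson counts x_1, ..., x_n
\[
p(x \mid \lambda) \;=\; \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}
\;\propto\; \lambda^{\sum_i x_i}\, e^{-n\lambda}
\]
% Multiply and collect powers of lambda: another gamma kernel,
% so the posterior is Gamma(a + sum of the counts, b + n)
\[
p(\lambda \mid x) \;\propto\; \lambda^{a + \sum_i x_i - 1}\, e^{-(b + n)\lambda}
\]
```

In other words, the prior’s shape parameter gets bumped up by the total count and the rate parameter by the sample size – which is exactly the prior-to-posterior relationship the Desmos graph below is showing.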

Desmos graph showing the relationship between the prior and posterior distributions in this example (thank you, John!): https://www.desmos.com/calculator/hzt65i2top

Homework! If you want additional practice (and why wouldn’t you, right?), show that the beta and binomial are another conjugate pair. See the new post on doing Bayes by hand for the answers.
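If you’d like to check your answer numerically before (or instead of) reading that post, here’s a quick sanity check – a sketch in Python with made-up numbers: brute-force the posterior on a grid (roughly what JAGS does for us, in spirit) and compare it to the conjugate Beta posterior you should be able to derive by hand.

```python
import numpy as np
from scipy.stats import beta, binom

# Made-up example: Beta(2, 3) prior on theta, then observe 7 successes in 10 trials
a, b = 2.0, 3.0
k, n = 7, 10

# Brute force: evaluate prior * likelihood on a grid and normalize numerically
theta = np.linspace(0.001, 0.999, 999)
unnormalized = beta.pdf(theta, a, b) * binom.pmf(k, n, theta)
dtheta = theta[1] - theta[0]
grid_posterior = unnormalized / (unnormalized.sum() * dtheta)

# Conjugate result to derive by hand: Beta(a + k, b + n - k)
analytic_posterior = beta.pdf(theta, a + k, b + n - k)

# The two curves should agree up to grid error
print(np.max(np.abs(grid_posterior - analytic_posterior)))  # should be tiny
```

If that printed number is tiny, the Beta posterior you derived by hand matches the brute-force version – which is essentially what JAGS is doing for us these days, just with sampling instead of a grid.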
