Category: Announcements and Resources, a.k.a. BayesWatch

Registration!

You can actually register for Bayes Club now! Sorry for the huge delay on that.

Remember that you don’t need to register for Bayes Club to attend. Anyone is welcome to just show up. There are two differences between taking Bayes Club for credit (registering) vs. just showing up:

1) It shows up on your transcript, i.e. you get official credit for it, obviously.

2) You are required to provide a little evidence that you’re actually participating in Bayes Club, or you run the risk of failing or getting an incomplete. The exact nature of this “evidence” is twofold: You have to actually show up for Bayes Club (we’ll take attendance), and you have to send a short write-up to Sanjay, the professor of record, by the end of the term discussing what you learned and/or did in Bayes Club that term.

So basically, you should register if you want it to show up on your transcript and you’re not worried about being able to meet the requirements. If either of those things isn’t true, then just plan to show up without registering. Don’t worry – we won’t run out of space (and if we do, we’ll just find a bigger room).

CRN: 17487

Normal vs. t-dist priors?

We model most continuous variables with normal distributions because 1) many of them genuinely are normally distributed, 2) the normal distribution is mathematically convenient, and 3) it’s a pretty ingrained habit (honestly, that’s probably the real reason in most cases). In Bayesian modeling in particular, there’s the additional attraction that normal distributions form conjugate priors where less elegant distributions (like t) won’t. Conjugacy is really important if you’re doing Bayesian modeling by hand*, but it makes no difference if you’re relying on a sampler (e.g. MCMC) to generate your posterior. I think some of the preference for normal priors in modern Bayesian modeling is exactly that kind of legacy preference for mathematically tidy distributions, left over from when we didn’t have the computing power to just brute-force everything.
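To make the conjugacy point concrete, here’s a minimal sketch (my own illustrative numbers, not from any particular model) of the classic normal-normal case: a normal prior on the mean of normal data with known sd gives a posterior you can write down in closed form, no sampler required.

```python
import numpy as np

def normal_posterior(mu0, tau0, sigma, data):
    """Posterior mean and sd for theta, given a Normal(mu0, tau0^2) prior
    on the mean of Normal(theta, sigma^2) data with sigma known.
    Conjugacy means the posterior is also normal, in closed form."""
    n = len(data)
    prec = 1 / tau0**2 + n / sigma**2                      # posterior precision
    mean = (mu0 / tau0**2 + np.sum(data) / sigma**2) / prec
    return mean, np.sqrt(1 / prec)

# Made-up example: a vague prior and four observations.
data = np.array([4.8, 5.1, 5.3, 4.9])
mean, sd = normal_posterior(mu0=0.0, tau0=10.0, sigma=1.0, data=data)
print(mean, sd)  # posterior is pulled almost entirely toward the data mean
```

With a t prior instead, there’s no such closed form, and you’d be reaching for a sampler.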

However, you may have noticed some Bayesians using t-distributions where you might otherwise have expected a normal – usually a Cauchy distribution, which is a t-distribution with 1 degree of freedom. So what’s the difference?

John D. Cook has a nice explanation of one important difference on his blog (which often has good Bayes content, by the way): http://www.johndcook.com/blog/2010/08/30/robust-prior-illustration/
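As a quick sanity check (this is just scipy, not code from the post), you can verify both claims numerically: that t with 1 degree of freedom really is the Cauchy, and that the practical difference from a normal lives in the tails.

```python
import numpy as np
from scipy import stats

# A t distribution with df=1 IS the standard Cauchy distribution:
x = np.linspace(-5, 5, 101)
assert np.allclose(stats.t.pdf(x, df=1), stats.cauchy.pdf(x))

# The difference that matters for priors is tail weight.
# Probability of landing more than 5 sds (scale units) from center:
print(stats.norm.sf(5) * 2)    # normal: vanishingly small (~6e-7)
print(stats.cauchy.sf(5) * 2)  # Cauchy: still substantial (~0.13)
```

Those heavy tails are what make a Cauchy prior “robust” in Cook’s sense: it doesn’t fight the data nearly as hard when the data land far from the prior’s center.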

Enjoy.

* But don’t do that. Seriously. If your computer breaks, just take a couple days off and write some poetry or something.

A Quick Reading for Tomorrow

Tomorrow, for our first meeting of Bayes Club for the term, we’ll be going through some JAGS code and talking more about MCMC algorithms. If you have a moment, take 5-10 minutes to look through a StackExchange post here. It’s a list of suggestions people had for explaining MCMC to a beginner (a category we all fall into), and it was really helpful to me. Some of the answers are clearer than others, so feel free to just skim the page looking for whatever makes the most immediate sense.
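If you want a preview of the core idea before tomorrow, here’s a toy random-walk Metropolis sampler (my own sketch, much simpler than what JAGS actually does) targeting a standard normal. The whole MCMC loop is: propose a move, accept it with a probability based on the ratio of target densities, otherwise stay put.

```python
import numpy as np

def metropolis(log_target, start, n_steps, step_sd, rng):
    """Random-walk Metropolis: a Markov chain whose long-run draws
    approximate samples from the (unnormalized) target density."""
    samples = np.empty(n_steps)
    x = start
    for i in range(n_steps):
        proposal = x + rng.normal(0, step_sd)      # propose a random step
        # Accept with probability min(1, target(proposal) / target(x)),
        # done on the log scale for numerical stability:
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x                             # rejected moves repeat x
    return samples

rng = np.random.default_rng(1)
# Target: standard normal, via its log density up to a constant.
draws = metropolis(lambda x: -0.5 * x**2, start=0.0, n_steps=20000,
                   step_sd=2.0, rng=rng)
print(draws.mean(), draws.std())  # should land near 0 and 1
```

Notice the target density only ever appears in a ratio, which is why MCMC never needs the normalizing constant – that’s the whole trick.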

See you tomorrow!

Summer Bayes!

Nothing says “summer vacation” like taking a bunch of advanced stats classes, ammiright?

Here are a couple summer classes on Bayesian analysis for social scientists. Please comment on this post if you know of other classes or training opportunities for this summer (or email me, and I’ll add it into the body of the post itself).


JAGS for MATLAB users

Kirsten found a JAGS package for MATLAB!

http://psiexp.ss.uci.edu/research/programs_data/jags/

Also, WinBUGS (another Gibbs sampler, like JAGS) runs happily with MATLAB, in case you’re curious. https://code.google.com/p/matbugs/

Both the Lee & Wagenmakers text (http://bayesmodels.com/) and the Kruschke text (http://www.indiana.edu/~kruschke/DoingBayesianDataAnalysis/) have WinBUGS code, so that might be a good choice for those of you with data structures ready in MATLAB formats.