**July 02, 2018 at 09:30 - July 06, 2018**
- BCAM

**Garritt L. PAGE, Brigham Young University and BCAM**

DATES: 2-6 July 2018 (5 sessions)

TIME: 09:30 - 11:30 (a total of 10 hours)

This course is a very gentle introduction to parametric Bayesian modeling. It will introduce students to the basics of Bayesian inference and to the computing techniques typically employed to fit some commonly used Bayesian models.

OBJECTIVES

- Introduce statistical inference from a Bayesian perspective

- Formulate commonly employed statistical models from a Bayesian perspective

- Fit models using Markov chain Monte Carlo techniques

PROGRAMME

1. One-parameter models

1.1. Introduce the Bayesian philosophy and de Finetti's theorem

1.2. Updating the state of knowledge from the prior distribution to the posterior distribution

1.3. Point estimation from a one-parameter conjugate model

1.4. Credible intervals from a one-parameter conjugate model

1.5. Posterior predictive distributions

2. Multiparameter conjugate models

2.1. Fully conjugate models

2.2. Gibbs sampling

3. Linear regression models

4. Hierarchical models

5. Latent variable models (mainly finite mixture models)
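As a small taste of topics 1.2-1.5 and 2.2, the following Python sketch works through a Beta-Binomial conjugate update and a toy Gibbs sampler for a normal model. It is purely illustrative and not part of the course materials; the data, priors, and numbers are invented.

```python
import numpy as np
from scipy import stats

# --- One-parameter conjugate model (topics 1.2-1.5) -------------------
# Beta(a, b) prior on a binomial success probability theta.
a, b = 1.0, 1.0            # uniform prior (illustrative choice)
n, y = 20, 14              # n trials, y successes (made-up data)

# Conjugate update: the posterior is Beta(a + y, b + n - y).
post = stats.beta(a + y, b + n - y)

theta_hat = post.mean()                  # point estimate (posterior mean)
lo, hi = post.ppf([0.025, 0.975])        # 95% equal-tailed credible interval

# Posterior predictive probability that the next trial is a success.
p_next = (a + y) / (a + b + n)

# --- Gibbs sampling (topic 2.2): semi-conjugate normal model ----------
# y_i ~ N(mu, s2) with mu ~ N(m0, t0) and 1/s2 ~ Gamma(a0, b0).
rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)     # simulated data, true mu = 2
m0, t0, a0, b0 = 0.0, 100.0, 1.0, 1.0    # weak priors (illustrative)

mu, s2 = 0.0, 1.0                        # starting values
mu_draws = []
for _ in range(2000):
    # Full conditional of mu is normal.
    prec = 1.0 / t0 + len(data) / s2
    cond_mean = (m0 / t0 + data.sum() / s2) / prec
    mu = rng.normal(cond_mean, np.sqrt(1.0 / prec))
    # Full conditional of the precision 1/s2 is gamma.
    a_n = a0 + len(data) / 2.0
    b_n = b0 + 0.5 * np.sum((data - mu) ** 2)
    s2 = 1.0 / rng.gamma(a_n, 1.0 / b_n)
    mu_draws.append(mu)

print(f"theta: {theta_hat:.3f} in ({lo:.3f}, {hi:.3f}); P(next success) = {p_next:.3f}")
print(f"posterior mean of mu (after burn-in): {np.mean(mu_draws[500:]):.3f}")
```

The Gibbs sampler alternates draws from each parameter's full conditional distribution, which for semi-conjugate priors have closed forms; the course covers this and more general MCMC techniques in detail.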

PREREQUISITES

There are no formal prerequisites, but some basic knowledge of statistics and probability will make the material more accessible. Exposure to basic programming syntax and logic will also be useful. The intended audience is anyone who needs a flexible statistical environment for their research. Master's and PhD students are encouraged to participate. The course will loosely follow Peter Hoff's very accessible book "A First Course in Bayesian Statistical Methods".

**Registration is free, but it is required before 27th June:** to register, go to https://bit.ly/2JcsdD5 and fill in the registration form. Student grants are available. Please let us know if you need support for travel and accommodation expenses when you fill in the form.