Exercise 4

Question 1.

Assume that the distribution of Y belongs to a one-parameter family of distributions f(y|θ), where a priori θ ~ π(θ). After obtaining the first observation y1, we update the distribution of θ by calculating its posterior distribution p(θ|y1) and use it as the prior distribution for θ before observing y2. Having observed y2, we update the distribution of θ again by deriving its posterior distribution. Show that one obtains the same posterior distribution of θ if, instead of calculating it sequentially, one uses the whole data (i.e. y1 and y2) together with the original prior π(θ).
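
A quick numerical sanity check of this claim can be run on a discretized grid of θ values. In the sketch below, the Poisson likelihood, the prior shape and the two observations are arbitrary choices made only for illustration; they are not part of the exercise.

```python
import numpy as np
from scipy.stats import poisson

# Discretize theta on a uniform grid and pick an arbitrary prior shape.
theta = np.linspace(0.01, 20, 2000)
prior = np.exp(-0.5 * theta)            # arbitrary illustrative prior (unnormalized)
prior /= prior.sum()                    # normalize as grid weights

y1, y2 = 3, 7                           # two hypothetical observations

def bayes_update(prior_w, y):
    """One Bayes step on the grid: posterior weights proportional to likelihood * prior."""
    post = poisson.pmf(y, theta) * prior_w
    return post / post.sum()

# Sequential: update on y1, then use that posterior as the prior for y2.
post_seq = bayes_update(bayes_update(prior, y1), y2)

# Batch: joint likelihood of (y1, y2) combined with the original prior.
post_batch = prior * poisson.pmf(y1, theta) * poisson.pmf(y2, theta)
post_batch /= post_batch.sum()

print(np.max(np.abs(post_seq - post_batch)))   # ~0 up to floating-point error
```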

Question 2.

Let $Y_1,\ldots,Y_n \sim f_\theta(y)$, where $f_\theta$ belongs to the exponential family, i.e. $\mathrm{supp}\, f_\theta$ does not depend on $\theta$ and $f_\theta(y)=\exp\{c(\theta)T(y)+d(\theta)+S(y)\}$ (a worked example of this decomposition is given below the list).
  1. Show that the conjugate prior family for $\theta$ is $\pi_{a,b}(\theta) \propto \exp\{a c(\theta)+bd(\theta)\}$.
  2. Find a conjugate prior for the parameter $\lambda$ of the Poisson distribution $Pois(\lambda)$.
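
As a worked illustration of the exponential-family decomposition above (using a distribution that is not the one asked about in part 2): the exponential density $f_\theta(y)=\theta e^{-\theta y}=\exp\{-\theta y+\log\theta\}$, $y>0$, has $c(\theta)=-\theta$, $T(y)=y$, $d(\theta)=\log\theta$ and $S(y)=0$.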

Question 3.

Let $Y_1,\ldots,Y_n \sim Pois(\lambda)$.
  1. Assume a priori that $\lambda \sim \exp(\theta)$. Is it a conjugate prior for Poisson data?
  2. Estimate $\lambda$ and $p=P(Y=0)$ w.r.t. the quadratic loss (a numerical cross-check is sketched after this list).
  3. Assume now that $\theta$ is also unknown and find its empirical Bayes estimator. What are the resulting estimators for $\lambda$ and $p$?
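
For part 2, a grid approximation of the posterior gives a quick numerical cross-check of the analytic Bayes estimators. In the sketch below, the data vector and the prior rate θ are hypothetical placeholders chosen by us, not values given in the exercise.

```python
import numpy as np
from scipy.stats import poisson, expon

# Hypothetical data and prior rate theta -- placeholders, not given in the exercise.
y = np.array([0, 2, 1, 3, 0, 1])
theta = 1.0                                  # prior: lambda ~ Exp(theta)

lam = np.linspace(1e-4, 20, 4000)            # uniform grid over lambda
prior = expon.pdf(lam, scale=1.0 / theta)
lik = np.prod(poisson.pmf(y[:, None], lam), axis=0)

post = prior * lik
post /= post.sum()                           # normalized grid weights

# Posterior means are the Bayes estimators under quadratic loss.
lam_hat = np.sum(lam * post)
p_hat = np.sum(np.exp(-lam) * post)          # p = P(Y = 0) = exp(-lambda)
print(lam_hat, p_hat)
```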

Question 4.

Suppose that $Y_1,\ldots,Y_n \sim \exp(\theta)$, where $EY=1/\theta$.
  1. Find a noninformative prior for $\theta$ according to Jeffreys' rule and the corresponding posterior distribution.
  2. Estimate $\theta$ w.r.t. the quadratic error and compare the resulting Bayesian estimator with the MLE.
  3. Repeat the previous part for estimating $\mu=EY$ and $p=P(Y \geq a)$.
  4. In addition to the sample of Y's, we have another independent sample $X_1,\ldots,X_n \sim \exp(\phi)$, where $EX=1/\phi$, and again we use the noninformative prior for $\phi$. We are interested in the ratio $\theta/\phi$. Find its posterior distribution and posterior mean, and compare the latter with the MLE of this ratio.
(hint: recall some of the basic distributions you know, such as $\chi^2$, $F$, etc.)

Question 5.

Three friends, Optimist, Realist and Pessimist, are going to participate in a certain gambling game at a casino, where they do not know the probability of winning p. Motivated by an exciting course in Bayesian statistics, they decided to apply Bayesian analysis. Optimist chose a prior on p from the conjugate family of distributions. In addition, he believes that the chances are 50%-50% (i.e. E(p)=1/2) with Var(p)=1/36. Realist chose a uniform prior U[0,1].
  1. Show that both priors belong to the family of Beta distributions and find their parameters for the priors of Optimist and Realist.
    (hint: $E\left(Beta(\alpha,\beta)\right)=\frac{\alpha}{\alpha+\beta},\; Var\left(Beta(\alpha,\beta)\right)=\frac{\alpha \beta}{(\alpha+\beta)^2 (\alpha+\beta+1)}$)
  2. Pessimist does not have any prior beliefs and decides to choose the noninformative prior according to Jeffreys' rule. What is his prior? Does it also belong to the Beta family?
  3. Being poor Statistics students, Optimist, Realist and Pessimist do not have enough money to gamble separately, so they decided to play together. They played the game 25 times and won 12 times. What is the posterior distribution for each of them?
  4. Find the corresponding posterior means and calculate 95% Bayesian credible intervals for p.
    (hint: use the fact that if $p \sim Beta(\alpha,\beta)$ and $\rho=p/(1-p)$ is the odds, then $(\beta/\alpha)\rho \sim F_{2\alpha,2\beta}$; those who do not believe it can (should!) easily verify it, e.g. via the Monte Carlo sketch after this list!)
  5. The three friends told their classmate Skeptic about the exciting Bayesian analysis each of them has done. However, Skeptic is, naturally, very skeptical about the Bayesian approach and does not believe in any priors; his credo is "In G-d we trust... All others bring data". So he decided to perform a "classical" (non-Bayesian) analysis of the same data. What are his estimate and 95% confidence interval for p? Compare the results and comment on them briefly.
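
The hint to part 4 invites you to verify the Beta-odds/F relationship yourself; below is a minimal Monte Carlo sketch of such a check. The shape parameters a, b and the simulation size are arbitrary choices of ours.

```python
import numpy as np
from scipy.stats import f

# Monte Carlo check of the hint: if p ~ Beta(a, b) and rho = p / (1 - p) is the odds,
# then (b / a) * rho ~ F(2a, 2b).  The shape parameters a, b below are arbitrary.
rng = np.random.default_rng(0)
a, b = 3.0, 5.0

p = rng.beta(a, b, size=200_000)
scaled_odds = (b / a) * p / (1 - p)

qs = [0.05, 0.25, 0.5, 0.75, 0.95]
print(np.quantile(scaled_odds, qs))     # empirical quantiles of (b/a) * odds
print(f.ppf(qs, 2 * a, 2 * b))          # theoretical F(2a, 2b) quantiles
```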

Question 6.

The waiting time for a bus at a given corner at a certain time of day is known to have a uniform distribution U[0,θ]. From other similar routes it is known that θ has a Pareto distribution Pa(7,4), where the density of the Pareto distribution Pa(α,β) is $\pi_{\alpha,\beta}(\theta)=\frac{\alpha}{\beta}\left(\frac{\beta}{\theta}\right)^{\alpha+1}$ for $\theta \geq \beta$, and 0 otherwise. Waiting times of 10, 3, 2, 5 and 14 minutes have been observed at the given corner over the last 5 days.
  1. Show that the Pareto distribution provides a conjugate prior for uniform data and find the posterior distribution of θ.
  2. Estimate θ w.r.t. the quadratic error (recall that E(Pa(α,β))=αβ/(α-1) for α>1). A grid-based numerical cross-check is sketched after this list.
  3. Find a 95% HPD (highest posterior density) interval for θ.
  4. Test the hypotheses H0: 0 ≤ θ ≤ 15 vs. H1: θ > 15 by choosing the most likely (a posteriori) hypothesis.
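
For those who want to cross-check their analytic answers to parts 2 and 4 numerically, here is a minimal grid-approximation sketch using the prior Pa(7,4) and the five waiting times stated in the question; it is only a numerical check and does not replace the derivations asked for above.

```python
import numpy as np

# Observed waiting times and the Pa(7, 4) prior stated in the question.
times = np.array([10, 3, 2, 5, 14])
alpha, beta_ = 7.0, 4.0

theta = np.linspace(0.01, 200, 200_000)      # uniform grid over theta

# Pareto prior density: (alpha / beta) * (beta / theta)^(alpha + 1) for theta >= beta.
prior = np.where(theta >= beta_,
                 (alpha / beta_) * (beta_ / theta) ** (alpha + 1), 0.0)

# U[0, theta] likelihood: theta^(-n) if theta >= max(y_i), else 0.
lik = np.where(theta >= times.max(), theta ** (-len(times)), 0.0)

post = prior * lik
post /= post.sum()                           # normalized grid weights

print("posterior mean:", np.sum(theta * post))        # quadratic-loss estimate
print("P(theta <= 15 | data):", post[theta <= 15].sum())
```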