Recall that $$ p(\theta \mid x) = \frac{p(x \mid \theta)\,p(\theta)}{p(x)},$$ where $p(\theta)$ is your prior, $p(x \mid \theta)$ is the likelihood (i.e. the evidence that you use to update the prior), and $p(\theta \mid x)$ is the posterior probability. However, I have found that sometimes people don't find this result self-evident, so I give a slightly long-winded proof. $$P(F \mid HH) = \frac{P(HH \mid F)\,P(F)}{P(HH)} = \frac{P(H \mid F, H)\,P(F \mid H)}{P(H \mid H)}$$ by the chain rule of conditional probabilities. Collect your data, and then the likelihood curve shows the relative support that your data lend to various simple hypotheses. Likelihoods are a key component of Bayesian inference because they are the bridge that gets us from prior to posterior.
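To make the prior-to-posterior bridge concrete, here is a minimal sketch of a discrete Bayes update over a finite set of hypotheses (the helper name `bayes_update` and the dictionary layout are my own choices for illustration, not from any particular library):

```python
def bayes_update(prior, likelihood):
    """Posterior over hypotheses: posterior(h) ∝ likelihood(h) * prior(h).

    prior      : dict mapping hypothesis -> prior probability p(theta)
    likelihood : dict mapping hypothesis -> p(x | theta) for the observed x
    """
    unnormalised = {h: prior[h] * likelihood[h] for h in prior}
    evidence = sum(unnormalised.values())  # p(x), the normalising constant
    return {h: p / evidence for h, p in unnormalised.items()}

# Fair coin vs. a two-headed coin, one head observed:
prior = {"fair": 0.5, "two-headed": 0.5}
likelihood_heads = {"fair": 0.5, "two-headed": 1.0}
posterior = bayes_update(prior, likelihood_heads)  # "fair" -> 1/3
```

Dividing by the evidence $p(x)$ is exactly the denominator in the formula above; it guarantees the posterior sums to one.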

Alternatively one could understand the term as using the posterior of the first step as the prior input for further calculation. You should make it clear in your question that you're only considering the possibilities that either the coin is perfectly fair, or else it always comes up heads. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. If you had normal data you could use a normal prior and obtain a normal posterior. Conjugate priors are not required for doing Bayesian updating, but they make the calculations a lot easier, so they are nice to use if you can.
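To make the normal-prior/normal-posterior remark concrete, here is a sketch of the standard conjugate update for a single normal observation with known variance (function and parameter names are my own; this assumes the textbook normal-normal model, which may differ from whatever model you have in mind):

```python
def normal_update(mu0, tau0_sq, x, sigma_sq):
    """Normal prior N(mu0, tau0_sq) + one observation x ~ N(theta, sigma_sq)
    with sigma_sq known gives a normal posterior; precisions (1/variance) add
    and the posterior mean is a precision-weighted average."""
    post_var = 1.0 / (1.0 / tau0_sq + 1.0 / sigma_sq)
    post_mean = post_var * (mu0 / tau0_sq + x / sigma_sq)
    return post_mean, post_var

# Sequential updating: the posterior after x1 serves as the prior for x2.
m1, v1 = normal_update(0.0, 1.0, 2.0, 1.0)   # -> mean 1.0, variance 0.5
m2, v2 = normal_update(m1, v1, 2.0, 1.0)     # posterior of step 1 as prior
```

Because the prior is conjugate, the posterior stays in the normal family at every step, and updating the two observations one at a time gives the same result as updating on both at once.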

My question is: how far is method b a valid approach? What is the probability that the coin is fair, i.e. $P(F \mid H)$ for the first toss: $$Pr(F \mid H) = \frac{P(H \mid F)\,P(F)}{P(H)} \quad\quad (1)$$ Assuming a starting prior belief $P(F) = 0.5$, we want to find $P(F \mid H)$ for the first toss. Below are the calculations for the intermediate steps: $$P(H \mid F) = \theta^{1}(1-\theta)^{0} = 0.5^{1}(0.5)^{0} = 0.5$$ $$P(H) = P(H \mid F) \cdot P(F) + P(H \mid Biased) \cdot P(Biased) = (0.5 \cdot 0.5) + (1 \cdot 0.5) = 0.75$$ (Note: $P(H \mid Biased) = 1$ because, assuming an extreme example with heads on both sides of the coin, the probability of getting heads with a biased coin is 1, which makes the calculation easy.) Hence, plugging into (1), we get: $$Pr(F \mid H) = \frac{0.5 \cdot 0.5}{0.75} = \frac{1}{3} \approx 0.33$$ Now, we toss the coin again and get another H.
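The same arithmetic can be checked in a few lines, reusing the posterior $1/3$ from the first toss as the prior for the second toss (the helper name `posterior_fair` is hypothetical, chosen for this sketch):

```python
def posterior_fair(prior_fair, p_h_fair=0.5, p_h_biased=1.0):
    """P(F | H): probability the coin is fair after observing one head,
    given the current prior P(F) and the two sampling probabilities."""
    numerator = p_h_fair * prior_fair               # P(H|F) * P(F)
    evidence = numerator + p_h_biased * (1.0 - prior_fair)  # P(H)
    return numerator / evidence

after_first = posterior_fair(0.5)           # 0.25 / 0.75 = 1/3
after_second = posterior_fair(after_first)  # posterior as prior -> 1/5
```

The second call returns $1/5$, which matches the batch calculation $P(F \mid HH) = \frac{0.5^2 \cdot 0.5}{0.5^2 \cdot 0.5 + 1 \cdot 0.5} = \frac{0.125}{0.625} = 0.2$, illustrating that step-by-step updating and updating on all the data at once agree.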