What you are describing is the beta-binomial model. You have a sample of $n$ individuals, of whom $k$ are liars, and you want to learn about the proportion $p$ of liars in the population. When applying Bayes' theorem to probabilities of events, you update the probabilities directly. When dealing with continuous (or discrete) probability distributions, you work with the functional forms of the distributions.

First, you need to define the likelihood function of your data; the distribution describing the number of "successes" in $n$ trials is the binomial distribution. So far so good: you can describe your data in terms of a binomial distribution, and you want to learn about the probability of "success" $p$. Next, you need to assume some probability distribution for $p$, your prior. Then you apply Bayes' theorem
$$
\pi(p | y) = \frac{f(y|p) \; \pi(p)}{ \int f(y|p) \; \pi(p) \; dp }
$$
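To make the formula concrete, here is a minimal numerical sketch that evaluates the posterior on a grid and approximates the integral in the denominator with a simple Riemann sum (the sample size $n=20$ and count $k=4$ are made-up numbers, purely for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical data: n = 20 people surveyed, k = 4 admitted lying
n, k = 20, 4

# Evaluate prior and likelihood on a grid of p values in [0, 1]
p_grid = np.linspace(0, 1, 1001)
dp = p_grid[1] - p_grid[0]
prior = np.ones_like(p_grid)                 # flat prior: pi(p) = 1
likelihood = stats.binom.pmf(k, n, p_grid)   # f(y | p)
unnorm = likelihood * prior                  # numerator of Bayes' theorem

# Normalize: approximate the integral in the denominator numerically
posterior = unnorm / (unnorm.sum() * dp)

# Posterior mean of p as a point estimate
post_mean = (p_grid * posterior).sum() * dp
```

This grid approximation works for any prior you can evaluate pointwise; the conjugate beta prior discussed below simply makes the integration step unnecessary.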
where $f(y|p)$ is your likelihood function and $\pi(p)$ is your prior. In this case, $f(y|p)$ is a binomial distribution parametrized by the known parameter $n$ and the unknown $p$. There are many possible choices for the prior distribution $\pi(p)$ (it can be any continuous distribution supported on $[0,1]$), but the beta distribution is a particularly convenient choice, since it is a conjugate prior for the binomial likelihood, so the posterior has a closed form. If your model is
$$
\begin{align}
p &\sim \mathcal{Beta}(\alpha, \beta) \\
y &\sim \mathcal{Bin}(n, p)
\end{align}
$$
then the posterior distribution for $p$ is simply
$$
p \sim \mathcal{Beta}(\alpha + k, \beta + n-k)
$$
If you want to use an "uninformative" prior, you can use e.g. $\alpha=\beta=1$, which yields a uniform prior distribution for $p$.
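Thanks to conjugacy, the whole computation reduces to adding counts to the prior parameters. A short sketch (again with made-up numbers $n=20$, $k=4$ and a uniform $\mathcal{Beta}(1,1)$ prior):

```python
from scipy import stats

# Hypothetical data: n = 20 respondents, k = 4 liars; uniform Beta(1, 1) prior
alpha, beta_, n, k = 1, 1, 20, 4

# Conjugate update: posterior is Beta(alpha + k, beta + n - k); no integration needed
posterior = stats.beta(alpha + k, beta_ + n - k)

post_mean = posterior.mean()        # point estimate of the proportion p
lo, hi = posterior.interval(0.95)   # 95% credible interval for p
```

The posterior object gives you everything at once: point estimates, credible intervals, and samples via `posterior.rvs()`.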
If this is still unclear, check the links provided in the text above or one of the many handbooks on Bayesian inference.