A Bayesian approach seems natural here. Since the prior on $p$ is $\mathrm{Unif}(0,1)$, the posterior is $\mathrm{Beta}(X+1, Y-X+1)$, as jbowman already answered.
So I'll add the details here. Let $x_i$, $i=1,\ldots,Y$, be the outcomes of the coin tosses, with 'head' coded as 1, so that $\sum_{i=1}^Y x_i = X$ is the number of heads. Let $\theta = \Pr(X_i = 1)$; then each $X_i \sim \mathrm{Bernoulli}(\theta)$, with 'head' as the 'success'.
The given prior for $\theta$ is $f(\theta) = 1 \cdot I(\theta \in (0,1))$, i.e. $\mathrm{Unif}(0,1)$. Since $\sum_{i=1}^Y X_i \sim \mathrm{Binomial}(Y,\theta)$, the likelihood is $\Pr\left(\sum_{i=1}^Y X_i = X \mid \theta\right) = K\theta^X(1-\theta)^{Y-X}$, where $K = \binom{Y}{X}$.
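As a quick numerical illustration of the likelihood (the values $Y = 10$, $X = 7$, $\theta = 0.6$ are arbitrary, not from the question):

```python
from math import comb

# Hypothetical example values, not from the question: Y = 10 tosses, X = 7 heads.
Y, X = 10, 7
theta = 0.6  # a candidate value of theta

# Binomial likelihood K * theta^X * (1 - theta)^(Y - X), with K = C(Y, X).
K = comb(Y, X)
likelihood = K * theta**X * (1 - theta)**(Y - X)
print(likelihood)  # ≈ 0.215

# Sanity check: summing the pmf over all possible head counts gives 1.
total = sum(comb(Y, x) * theta**x * (1 - theta)**(Y - x) for x in range(Y + 1))
print(total)  # ≈ 1.0
```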
Now, the posterior of $\theta$ is $f(\theta\mid D) = \frac{f(D\mid\theta)f(\theta)}{\int_{0}^{1}f(D\mid\theta)f(\theta)\,d\theta}$, where the 'data' $D$ is the observed value of $\sum_{i=1}^Y X_i$. Therefore, $f(\theta\mid D) = \frac{K\theta^X(1-\theta)^{Y-X}}{K\int_{0}^{1}\theta^{X}(1-\theta)^{Y-X}\,d\theta} = \frac{\theta^X(1-\theta)^{Y-X}}{B(X+1, Y-X+1)}$, which is the density of the Beta distribution with parameters $X+1$ and $Y-X+1$. Here $B(\cdot,\cdot)$ denotes the Beta function (not the distribution), since $\int_{0}^{1}\theta^{X}(1-\theta)^{Y-X}\,d\theta = B(X+1, Y-X+1)$.
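You can check this posterior numerically. A minimal sketch (again with the arbitrary example $Y = 10$, $X = 7$), computing the Beta function via log-gamma and verifying that the density integrates to 1 and has the $\mathrm{Beta}(X+1, Y-X+1)$ mean $(X+1)/(Y+2)$:

```python
from math import exp, lgamma

def beta_fn(a, b):
    # Beta function B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b), via log-gamma for stability.
    return exp(lgamma(a) + lgamma(b) - lgamma(a + b))

def posterior_pdf(theta, X, Y):
    # Beta(X+1, Y-X+1) density: theta^X (1 - theta)^(Y - X) / B(X+1, Y-X+1).
    return theta**X * (1 - theta)**(Y - X) / beta_fn(X + 1, Y - X + 1)

# Hypothetical data, not from the question: Y = 10 tosses, X = 7 heads.
X, Y = 7, 10

# Midpoint Riemann sum over (0, 1): the density should integrate to 1.
n = 100_000
total = sum(posterior_pdf((i + 0.5) / n, X, Y) for i in range(n)) / n
print(total)  # ≈ 1.0

# Posterior mean should match the Beta(X+1, Y-X+1) mean, (X+1)/(Y+2).
mean = sum(((i + 0.5) / n) * posterior_pdf((i + 0.5) / n, X, Y) for i in range(n)) / n
print(mean, (X + 1) / (Y + 2))  # both ≈ 0.667
```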