If I understand the question correctly, you have two values ("counts") $x$ and $y$ with accompanying estimates of their standard deviations, $\sigma_x$ and $\sigma_y,$ respectively. Let's suppose $x$ and $y$ are independent realizations of random variables $X$ and $Y,$ both of which have the same mean $\mu$. Let us further suppose that $\sigma_x$ and $\sigma_y$ are highly accurate estimates of the true standard deviations of $X$ and $Y$. You wish to find a linear combination
$$\xi x + \eta y$$
that estimates $\mu.$ Among the many ways to do this, one that is commonplace in physics (going back about 210 years) is to find a linear combination that minimizes the mean squared deviation
$$\mathbb{E}_{X,Y}[\left(\xi X + \eta Y - \mu\right)^2] = \mathbb{E}_{X,Y}[\left(\xi(X-\mu) + \eta(Y-\mu) + \mu(\xi+\eta-1)\right)^2] .$$
Upon expanding the square and using the definitions $\mathbb{E}_X[X]=\mathbb{E}_Y[Y]=\mu,$ $\mathbb{E}_X[(X-\mu)^2]=\sigma_X^2,$ and $\mathbb{E}_Y[(Y-\mu)^2]=\sigma_Y^2,$ together with the independence of $X$ and $Y$ (which makes every cross term vanish, as detailed below), we obtain
$$\xi^2 \sigma_X^2 + \eta^2 \sigma_Y^2 + (\xi+\eta-1)^2\mu^2.$$
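For completeness, the square expands to
$$\xi^2\,\mathbb{E}_X[(X-\mu)^2] + \eta^2\,\mathbb{E}_Y[(Y-\mu)^2] + (\xi+\eta-1)^2\mu^2 + 2\xi\eta\,\mathbb{E}_{X,Y}[(X-\mu)(Y-\mu)] + 2\xi\mu(\xi+\eta-1)\,\mathbb{E}_X[X-\mu] + 2\eta\mu(\xi+\eta-1)\,\mathbb{E}_Y[Y-\mu],$$
and the last three expectations are zero: the first by the independence of $X$ and $Y,$ the other two because $\mathbb{E}_X[X-\mu]=\mathbb{E}_Y[Y-\mu]=0.$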
Because we do not know $\mu,$ the only way to proceed is to make this expected loss independent of $\mu$ by forcing its last term to vanish, whence
$$\xi + \eta = 1.$$
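One way to carry out the constrained minimization is to substitute $\eta = 1-\xi$ and set the derivative of the resulting one-variable quadratic to zero:
$$\frac{d}{d\xi}\left[\xi^2\sigma_X^2 + (1-\xi)^2\sigma_Y^2\right] = 2\xi\sigma_X^2 - 2(1-\xi)\sigma_Y^2 = 0 \quad\Longrightarrow\quad \xi = \frac{\sigma_Y^2}{\sigma_X^2+\sigma_Y^2}, \qquad \eta = \frac{\sigma_X^2}{\sigma_X^2+\sigma_Y^2}.$$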
Subject to that constraint, the unique minimizing solution $(\xi,\eta)$ therefore makes $\xi$ proportional to $1/\sigma_X^2$ and $\eta$ proportional to $1/\sigma_Y^2$, yielding
$$\frac{\sigma_Y^2 X + \sigma_X^2 Y}{\sigma_X^2 + \sigma_Y^2}$$
for the estimator. Its variance is
$$\frac{\sigma_X^2 \sigma_Y^2}{\sigma_X^2 + \sigma_Y^2}.$$
This never exceeds one-half the larger of $\sigma_X^2$ and $\sigma_Y^2$: writing the variance as $1/(1/\sigma_X^2 + 1/\sigma_Y^2)$ and observing that $1/\sigma_X^2 + 1/\sigma_Y^2 \ge 2/\max(\sigma_X^2,\sigma_Y^2)$ gives the bound.
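As a quick numerical check, here is a minimal Python sketch (the function name and the example counts $100 \pm 10$ and $121 \pm 11$ are made up for illustration):

```python
from math import sqrt

def combine(x, sigma_x, y, sigma_y):
    """Inverse-variance weighted combination of two independent counts.

    Assumes x and y independently estimate the same mean, with (accurately
    known) standard deviations sigma_x and sigma_y.  Returns the pooled
    estimate and its standard deviation.
    """
    wx, wy = 1.0 / sigma_x**2, 1.0 / sigma_y**2   # weights proportional to 1/sigma^2
    estimate = (wx * x + wy * y) / (wx + wy)      # = (sigma_y^2 x + sigma_x^2 y) / (sigma_x^2 + sigma_y^2)
    variance = 1.0 / (wx + wy)                    # = sigma_x^2 sigma_y^2 / (sigma_x^2 + sigma_y^2)
    return estimate, sqrt(variance)

est, se = combine(100, 10, 121, 11)
print(f"{est:.1f} +/- {se:.1f}")  # 109.5 +/- 7.4, smaller than either 10 or 11
```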