TL;DR We can develop a uniformly bounded rejection sampler that generates a variate from the desired density at an expected worst-case cost of $\approx 4.75$ independent uniform variates. Although each draw is fast, the set-up is non-trivial, so this approach may be slow when the parameters change between draws (e.g., in Gibbs sampling).
This is a tricky distribution. As mentioned in the comments, this is nearly the Generalized Gamma distribution (with $p=2$ and $d = a+1$), except that $b$ is not a true location parameter, because it appears only in the second term. I have been searching for a while now and have been unable to find a reference to this distribution anywhere.
A Uniformly Bounded Rejection Sampler
In this paper by Luc Devroye, a uniformly bounded rejection sampler is constructed for the Generalized Inverse Gaussian distribution, and we can follow a similar approach.
Let me redefine the density (up to a constant) as
$$f(x) = x^{\alpha -1}\exp\left(-\gamma(x-\mu)^2\right), \quad x > 0.$$
The first step is to prove that the density is log-concave. This can be done by showing either of the following equivalent conditions:
- $f'(x)/f(x)$ is monotone decreasing for $x > 0$;
- $(\log f(x))'' < 0$ for all $x > 0$.
Both hold whenever $\alpha > 1$, since $(\log f(x))'' = -\frac{\alpha-1}{x^2} - 2\gamma$. Next, setting $(\log f(x))' = 0$ and taking the positive root of the resulting quadratic, we find that the mode occurs at
$$m = \frac{\mu}{2} + \frac{1}{2\gamma}\sqrt{\gamma\left(2\alpha + \gamma\mu^2 -2\right)}.$$ Define
\begin{align*}
\phi(x) &= f(m)^{-1}f(x+m) \\
\psi(x) &= \log \phi(x) = (\alpha-1)\log(x+m) - \gamma(x+m-\mu)^2 - \log f(m)
\end{align*}
so that $\phi(0) = 1$ and $\psi(0) = 0$. We will also need the derivative of $\psi(x)$
$$\psi'(x) = \frac{\alpha-1}{x+m} - 2\gamma(x+m-\mu).$$
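For concreteness, here is a minimal Python sketch of these quantities (the parameter values, and the names `alpha`, `gamma_`, `mu`, and `log_f`, are my own illustrative choices):

```python
import numpy as np

# Illustrative parameter values; we need alpha > 1 and gamma > 0
alpha, gamma_, mu = 3.0, 1.0, 0.5

# Mode of f, as derived above
m = mu / 2 + np.sqrt(gamma_ * (2 * alpha + gamma_ * mu**2 - 2)) / (2 * gamma_)

def log_f(x):
    # log f(x), up to the (irrelevant) normalizing constant
    return (alpha - 1) * np.log(x) - gamma_ * (x - mu) ** 2

def psi(x):
    # psi(x) = log phi(x) = log f(x + m) - log f(m), so psi(0) = 0
    return log_f(x + m) - log_f(m)

def dpsi(x):
    # psi'(x) = (alpha - 1)/(x + m) - 2 gamma (x + m - mu)
    return (alpha - 1) / (x + m) - 2 * gamma_ * (x + m - mu)
```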
Finally, we need to find $s, t > 0$ such that $\psi(-s) = \psi(t) = -1.$ Newton-Raphson should converge fairly quickly, by iterating
$$t_0 > 0, \ t_{n+1} = t_n - \frac{\psi(t_n) + 1}{\psi'(t_n)} \quad\text{and}\quad 0 < s_0 < m, \ s_{n+1} = s_n + \frac{\psi(-s_n) + 1}{\psi'(-s_n)}.$$
(The constraint $s_n < m$ keeps $-s_n$ inside the support of $\psi$.)
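A sketch of this iteration, reusing `psi`, `dpsi`, and `m` from the snippet above (the starting values, tolerance, and clamp are heuristic choices of mine):

```python
def solve_t(t0=1.0, tol=1e-12, max_iter=100):
    # Solve psi(t) = -1 for t > 0 by Newton-Raphson
    t = t0
    for _ in range(max_iter):
        step = (psi(t) + 1) / dpsi(t)
        t -= step
        if abs(step) < tol:
            break
    return t

def solve_s(s0=None, tol=1e-12, max_iter=100):
    # Solve psi(-s) = -1 for 0 < s < m; -s + m must stay positive
    s = 0.9 * m if s0 is None else s0
    for _ in range(max_iter):
        step = (psi(-s) + 1) / dpsi(-s)
        s = min(s + step, 0.999 * m)  # clamp to stay inside the support
        if abs(step) < tol:
            break
    return s

t, s = solve_t(), solve_s()
```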
The Algorithm
INPUTS: s, t, psi, psi'
Compute p = 1/psi'(-s)
Compute r = -1/psi'(t)
Compute t' = t + r*psi(t)
Compute s' = s + p*psi(-s)
Compute q = t' + s'
REPEAT
Generate U, V, W ~ U(0, 1)
if U < q/(q + r + p) then X = -s' + q*V
elseif U < (q + r)/(q + r + p) then X = t' - r*log(V)
else X = -s' + p*log(V)
if X > t' then chi = exp(psi(t) + psi'(t)*(X - t))
elseif X > -s' then chi = 1
else chi = exp(psi(-s) + psi'(-s)*(X + s))
UNTIL log(W) <= psi(X) - log(chi)
RETURN X + m
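For completeness, here is a self-contained Python (NumPy) sketch that puts the set-up and the sampler together. It follows the pseudocode above; the function name, the Newton-Raphson starting values, and the explicit support check are my own choices, and this is meant as an illustration rather than production code.

```python
import numpy as np

def sample_f(alpha, gamma_, mu, size=1, rng=None):
    """Draw from f(x) ∝ x^(alpha - 1) exp(-gamma (x - mu)^2), x > 0, alpha > 1."""
    rng = np.random.default_rng() if rng is None else rng

    # Mode of f and the shifted, normalized log-density psi (psi(0) = 0)
    m = mu / 2 + np.sqrt(gamma_ * (2 * alpha + gamma_ * mu**2 - 2)) / (2 * gamma_)
    log_f = lambda x: (alpha - 1) * np.log(x) - gamma_ * (x - mu) ** 2
    psi = lambda x: log_f(x + m) - log_f(m)
    dpsi = lambda x: (alpha - 1) / (x + m) - 2 * gamma_ * (x + m - mu)

    # Newton-Raphson for psi(t) = psi(-s) = -1
    t = 1.0
    for _ in range(100):
        t -= (psi(t) + 1) / dpsi(t)
    s = 0.9 * m
    for _ in range(100):
        s = min(s + (psi(-s) + 1) / dpsi(-s), 0.999 * m)  # keep -s + m > 0

    # Set-up, exactly as in the pseudocode
    p = 1 / dpsi(-s)
    r = -1 / dpsi(t)
    tp = t + r * psi(t)   # t'
    sp = s + p * psi(-s)  # s'
    q = tp + sp

    draws = []
    while len(draws) < size:
        U, V, W = rng.random(3)
        if U < q / (q + r + p):          # uniform middle piece
            X = -sp + q * V
        elif U < (q + r) / (q + r + p):  # right exponential tail
            X = tp - r * np.log(V)
        else:                            # left exponential tail
            X = -sp + p * np.log(V)
        if X + m <= 0:
            continue  # outside the support of f: reject outright
        if X > tp:
            log_chi = psi(t) + dpsi(t) * (X - t)
        elif X > -sp:
            log_chi = 0.0
        else:
            log_chi = psi(-s) + dpsi(-s) * (X + s)
        if np.log(W) <= psi(X) - log_chi:
            draws.append(X + m)
    return np.array(draws)

x = sample_f(alpha=3.0, gamma_=1.0, mu=0.5, size=10_000)
```

A quick sanity check is to compare a histogram of the output against $f$ normalized by numerical quadrature.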
Discussion
This approach has some advantages as well as some important disadvantages. The main advantage is that the algorithm is uniformly bounded: the expected number of iterations is bounded by a constant that does not depend on $\alpha$, $\gamma$, or $\mu$.
Theorem. Using the algorithm above, the expected number of iterations required to generate a sample is at most $1.581977\ldots$.
Since three independent uniform variates are required at each iteration, a draw from $f$ can be generated at a worst-case expected cost of $3 \times 1.581977\ldots \approx 4.75$ uniform variates.
Unfortunately, the set-up is non-trivial. In particular, Newton-Raphson is required to find $s$ and $t$. This approach could be improved by explicitly finding $s, t > 0$ such that $\psi(-s) = \psi(t) = -\rho$ for some $\rho > 0$. I am working on this right now, but have yet to find anything. It is also worth noting that this approach fails when $\alpha < 1$, since $f$ is then no longer log-concave; whether that matters depends on the application.
In summary, if you are looking to draw a large number of samples from $f$ for fixed parameters, then this method is robust and efficient. If you need a single draw with parameters that change at every call (e.g., in Gibbs sampling), then the required set-up of this algorithm is a substantial disadvantage.