Suppose that $r$ policies were issued in 2005. Let $t_1, \cdots, t_r$ denote the times at which accident claims are reported for policies issued in 2005. Additionally, let $z_1, \cdots, z_r$ denote the number of accident claims associated with these policies upon being reported. Assume $t_1, \cdots, t_r$ are independent, $z_1, \cdots, z_r$ are independent, and $t_i$ and $z_i$ are independent for all $i$. In other words, the reporting time and the number of accident claims are independent of one another for a given policy, and the policies are independent of one another. We shall also assume that $t_i^{\ast}=\mbox{I}\left(t_i = 2005\right) \sim \mbox{Bernoulli}(d)$.
Let $Y$ denote the total number of accident claims associated with all policies issued in 2005. Clearly, from the above set-up, $Y = \sum_{i=1}^r z_i$. Since we assume that $Y \sim \mbox{NB}(r,p)$, it must be the case that $z_i \sim \mbox{Geo}(p)$; that is, the probability mass function of $z_i$ is
\begin{eqnarray*}
f_{z_i}(z) = p(1-p)^z \quad \mbox{for} \quad z \in \{0, 1, \cdots \}.
\end{eqnarray*}
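As a quick sanity check of this set-up (not part of the derivation), the sketch below simulates sums of $r$ i.i.d. geometric counts and compares the empirical frequencies to the negative binomial mass function; the values of $p$ and $r$ are illustrative and match those used in the simulation at the end of this answer.

# Monte Carlo check: a sum of r i.i.d. Geo(p) counts should be NB(r, p)
set.seed(123)
p = .3
r = 6
Y.sim = replicate(10^5, sum(rgeom(r, p)))   # rgeom uses the same pmf p(1-p)^z as above
Y.prop = table(factor(Y.sim, levels = 0:5))/length(Y.sim)
rbind(simulated = c(Y.prop), exact = dnbinom(0:5, size = r, prob = p))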
Furthermore, let $X$ denote the number of accident claims associated with all policies issued in 2005 that are reported in 2005. Therefore, $X = \sum_{i=1}^r t_i^{\ast}z_i$. Let $u_i = t_i^{\ast}z_i$. Clearly, $u_1, \cdots, u_r$ are independent and we can write $X = \sum_{i=1}^r u_i$.
Next, we can express $Y$ as $Y=\sum_{i=1}^r t_i^{\ast}z_i + \sum_{i=1}^r \left(1-t_i^{\ast}\right)z_i$. Let $v_i = \left(1-t_i^{\ast}\right)z_i$ and $W=\sum_{i=1}^r v_i$. From the previous independence assumptions, $v_1, \cdots, v_r$ are independent. Therefore, we can write $Y = \sum_{i=1}^r \left(u_i + v_i\right) = X+W$. Although $u_i$ and $v_j$ are independent for $i \ne j$, $u_i$ and $v_i$ are dependent; thus, $X$ and $W$ are dependent. Hence, we will have to use a bivariate discrete transformation to find the joint probability mass function of $(u_i,v_i)$ from that of $(t_i^{\ast},z_i)$. This is necessary since, by the definition of conditional probability, we can write
\begin{eqnarray*}
P[Y=y|X=x] &=& \frac{P[X=x,Y=y]}{P[X=x]} \\
&=& \frac{P[X=x,W=y-x]}{P[X=x]}.
\end{eqnarray*}
Hence, we need only find the joint probability mass function of $(X,W)$ as well as the probability mass function of $X$. To this end, we begin by finding the joint distribution of $(u_i,v_i)$. Note that the transformation from $(t_i^{\ast},z_i)$ to $(u_i,v_i)$ is almost one-to-one since we can solve for $t_i^{\ast}$ and $z_i$ in terms of $u_i$ and $v_i$ as $t_i^{\ast} = u_i/(u_i+v_i)$ and $z_i=u_i+v_i$. Clearly, this transformation breaks down when $(u_i,v_i)=(0,0)$ since $t_i^{\ast}$ will be indeterminate. It is straightforward to show that if $(u_i,v_i)=(0,0)$, then $(t_i^{\ast},z_i)=(0,0)$ or $(t_i^{\ast},z_i)=(1,0)$. When $(u_i,v_i)\ne(0,0)$, it is important to note that either $u_i=0$ or $v_i=0$ since $t_i^{\ast}$ is a Bernoulli random variable. Using the previous transformations, with $u,v \in \mathbb{N}$, if $(u_i,v_i)=(0,v)$, then $(t_i^{\ast},z_i)=(0,v)$, and if $(u_i,v_i)=(u,0)$, then $(t_i^{\ast},z_i)=(1,u)$. Hence, the joint probability mass function of $(u_i,v_i)$ is
\begin{eqnarray*}
P[u_i=u,v_i=v] = \begin{cases}
p & \mbox{if} \,\, (u,v) = (0,0) \\
p(1-d)(1-p)^v & \mbox{if} \,\, u=0,v \in \mathbb{N} \\
pd(1-p)^u & \mbox{if} \,\, v=0,u \in \mathbb{N} \\
0 & \mbox{otherwise}
\end{cases}.
\end{eqnarray*}
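As a sketch, a few cells of this joint mass function can be checked empirically by simulating $(t_i^{\ast},z_i)$ pairs; the parameter values below are illustrative.

# Empirical check of a few cells of P[u_i = u, v_i = v]
set.seed(321)
d = .4
p = .3
t.star = rbinom(10^6, 1, d)            # t_i*
z = rgeom(10^6, p)                     # z_i ~ Geo(p)
u = t.star*z
v = (1 - t.star)*z
c(mean(u == 0 & v == 0), p)                      # (u, v) = (0, 0)
c(mean(u == 0 & v == 2), p*(1-d)*(1-p)^2)        # u = 0, v = 2
c(mean(u == 3 & v == 0), p*d*(1-p)^3)            # v = 0, u = 3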
Since $X$ and $W$ are each sums of $r$ i.i.d. terms, the probability generating functions of $X$ and of $(X,W)$ may be expressed as the probability generating functions of $u_i$ and of $(u_i,v_i)$, respectively, raised to the $r$th power. For completeness, we have that the probability generating function of $(X,W)$ is
\begin{eqnarray*}
G_{X,W}\left(z_1,z_2\right) &=& \mbox{E} \left[z_1^X z_2^W\right] \\
&=& \mbox{E} \left[z_1^{\sum_{i=1}^r u_i} z_2^{\sum_{i=1}^r v_i} \right] \\
&=& \prod_{i=1}^r\mbox{E} \left[z_1^{u_i} z_2^{v_i} \right] \\
&=& \left(\mbox{E} \left[z_1^{u_i} z_2^{v_i} \right]\right)^r \\
&=& \left(G_{u_i,v_i}\left(z_1,z_2\right)\right)^r,
\end{eqnarray*}
where $G_{u_i,v_i}\left(z_1,z_2\right)$ denotes the probability generating function of $(u_i,v_i)$. Clearly, the probability generating function of $X$ is $G_X(z_1) = G_{X,W}\left(z_1,1\right)$.
Now, the probability generating function of $(u_i,v_i)$ is
\begin{eqnarray*}
G_{u_i,v_i}\left(z_1,z_2\right) &=& \sum_{u=0}^\infty\sum_{v=0}^\infty P[u_i=u,v_i=v] z_1^u z_2^v \\
&=& p + p(1-d) \sum_{v=1}^\infty \left[z_2(1-p)\right]^v + pd \sum_{u=1}^\infty \left[z_1(1-p)\right]^u \\
&=& p(1-d)+p(1-d) \frac{z_2(1-p)}{1-z_2(1-p)} + pd + pd \frac{z_1(1-p)}{1-z_1(1-p)} \\
&=& \frac{p(1-d)}{1-z_2(1-p)} + \frac{pd}{1-z_1(1-p)}.
\end{eqnarray*}
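A minimal numerical check of this closed form compares it to a Monte Carlo estimate of $\mbox{E}\left[z_1^{u_i} z_2^{v_i}\right]$ at an arbitrary point; the parameter values and the evaluation point below are illustrative.

# Monte Carlo check of the pgf of (u_i, v_i) at one point
set.seed(222)
d = .4
p = .3
z1 = .5
z2 = .8
t.star = rbinom(10^6, 1, d)
z = rgeom(10^6, p)
u = t.star*z
v = (1 - t.star)*z
mean(z1^u * z2^v)                                  # simulated E[z1^u z2^v]
p*(1-d)/(1 - z2*(1-p)) + p*d/(1 - z1*(1-p))        # closed form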
By independence, the probability generating function of $(X,W)$ is
\begin{eqnarray*}
G_{X,W}\left(z_1,z_2\right) &=& \left[\frac{p(1-d)}{1-z_2(1-p)} + \frac{pd}{1-z_1(1-p)}\right]^r.
\end{eqnarray*}
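The same kind of check works for $(X,W)$ directly: simulate $r$ policies per replication and compare the Monte Carlo estimate of $\mbox{E}\left[z_1^X z_2^W\right]$ with the $r$th power above (again with illustrative parameter values and evaluation point).

# Monte Carlo check of the pgf of (X, W) at one point
set.seed(333)
d = .4
p = .3
r = 6
z1 = .5
z2 = .8
sim = replicate(10^5, {
  t.star = rbinom(r, 1, d)
  z = rgeom(r, p)
  z1^sum(t.star*z) * z2^sum((1 - t.star)*z)
})
mean(sim)                                              # simulated E[z1^X z2^W]
(p*(1-d)/(1 - z2*(1-p)) + p*d/(1 - z1*(1-p)))^r        # closed form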
Using the above equation, we can compute probabilities of $(X,W)=(x,w)$ by using the relation
\begin{eqnarray*}
P[X=x,W=w] = \frac{ \frac{\partial^{x+w}}{\partial z_1^x \partial z_2^w} G_{X,W}(z_1,z_2)|_{(z_1,z_2)=(0,0)}}{x!w!}.
\end{eqnarray*}
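As a sketch of how this relation can be evaluated in practice, base R's D() can differentiate the pgf symbolically. The helper pmf.XW below and the parameter values are illustrative, and the approach is only practical for small $x$ and $w$ since the expressions grow with each derivative.

# Recover P[X = x, W = w] from the joint pgf by repeated symbolic differentiation
d = .4
p = .3
r = 6
G = quote((p*(1-d)/(1 - z2*(1-p)) + p*d/(1 - z1*(1-p)))^r)
pmf.XW = function(x, w){
  expr = G
  for (i in seq_len(x)) expr = D(expr, "z1")    # x derivatives in z1
  for (j in seq_len(w)) expr = D(expr, "z2")    # w derivatives in z2
  eval(expr, list(z1 = 0, z2 = 0, p = p, d = d, r = r))/(factorial(x)*factorial(w))
}
pmf.XW(0, 0)    # equals p^r
p^r
pmf.XW(1, 2)    # P[X = 1, W = 2]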
Therefore, since the factors of $x!$ in the numerator and denominator cancel, the answer to the question may be written as
\begin{eqnarray*}
P[Y=y|X=x] &=& \frac{1}{(y-x)!}\frac{ \frac{\partial^{y}}{\partial z_1^x \partial z_2^{y-x}} G_{X,W}(z_1,z_2)|_{(z_1,z_2)=(0,0)}}{\frac{\partial^{x}}{\partial z_1^x } G_{X,W}(z_1,1)|_{z_1=0}}.
\end{eqnarray*}
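A hypothetical sketch of this final formula in R, using the same symbolic-differentiation idea as above (the helper cond.Y.given.X and the example arguments are illustrative):

# Evaluate P[Y = y | X = x] from the joint pgf (requires y >= x)
d = .4
p = .3
r = 6
G = quote((p*(1-d)/(1 - z2*(1-p)) + p*d/(1 - z1*(1-p)))^r)
cond.Y.given.X = function(y, x){
  num = G
  den = G
  for (i in seq_len(x)) num = D(num, "z1")          # x derivatives in z1
  for (j in seq_len(y - x)) num = D(num, "z2")      # y - x derivatives in z2
  for (i in seq_len(x)) den = D(den, "z1")          # x derivatives in z1
  num.val = eval(num, list(z1 = 0, z2 = 0, p = p, d = d, r = r))
  den.val = eval(den, list(z1 = 0, z2 = 1, p = p, d = d, r = r))
  num.val/(factorial(y - x)*den.val)
}
cond.Y.given.X(0, 0)               # matches the closed form given below
(p/((1-d) + p*d))^r
cond.Y.given.X(3, 1)               # P[Y = 3 | X = 1]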
For instance, we may find $P[Y=0|X=0]$ by evaluating $G_{X,W}(0,0) = p^r$ and $G_{X,W}(0,1) = \left[(1-d)+pd\right]^r$. Specifically,
\begin{eqnarray*}
P[Y=0|X=0] = \left[\frac{p}{(1-d)+pd}\right]^r.
\end{eqnarray*}
This value agrees with the one found by simulation; see the following R code:
d=.4              # probability a claim is reported in 2005
p=.3              # geometric success probability
r=6               # number of policies issued in 2005
B=10^6            # number of Monte Carlo replications
cond.check = 0    # count of replications with Y = 0 among those with X = 0
l = 0             # count of replications with X = 0
set.seed(555)
for (i in 1:B){
  ti = rbinom(r,1,d)      # reporting indicators t_i*
  zi = rnbinom(r,1,p)     # claim counts z_i ~ Geo(p) (negative binomial with size 1)
  X = sum(ti*zi)          # claims reported in 2005
  Y = sum(zi)             # total claims
  W = Y-X                 # claims reported after 2005
  if (X==0){
    cond.check = cond.check + (Y==0)
    l = l+1
  }
}
(cond.check = cond.check/l) #0.005273457
(p/(1-d+p*d))^r #0.005232781