Let $T\in\mathbb{R}_+$ and $R\in\{0,1\}$.
Then if $$P[T]=\sum_R P\big[T,R\big]=\sum_R P\big[T\mid R\big]P\big[R\big],$$
does this imply that $T$ and $R$ are dependent?
Here is an answer to what I take to be the underlying general probability question.
Given two random variables $T\in\mathbb{R}_+$ and $R\in\{0,1\}$, the marginal distribution of $T$ is given by $$P[T]=\sum_R P\big[T,R\big]=\sum_R P\big[T\mid R\big]P\big[R\big].$$ This is true whether $T$ and $R$ are dependent or independent, so your last equation says nothing definite about independence.
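As a quick numerical sanity check, here is a minimal sketch of that marginalization (the mixture of exponentials is a hypothetical choice, picked so that $T$ and $R$ are plainly dependent): the simulated marginal of $T$ matches the mixture density $P[T\mid R=0]P[R=0]+P[T\mid R=1]P[R=1]$ even though the variables are dependent.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 500_000, 0.3                       # p = P[R = 1]

# Hypothetical conditionals: T | R=0 ~ Exp(mean 1), T | R=1 ~ Exp(mean 1/2),
# so T and R are dependent by construction.
r = rng.random(n) < p
t = np.where(r, rng.exponential(0.5, n), rng.exponential(1.0, n))

# Compare the empirical marginal of T with the mixture density
hist, edges = np.histogram(t, bins=60, range=(0, 4), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
mix = (1 - p) * stats.expon(scale=1.0).pdf(mid) + p * stats.expon(scale=0.5).pdf(mid)
print(np.abs(hist - mix).max())           # small: the identity holds despite dependence
```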
If $T$ and $R$ are independent, then their joint distribution is the product of their marginal distributions, $$P[T,R]=P[T]P[R],$$ and since $P[R=0]+P[R=1]=1$ by definition, the first equation simply reduces to the tautology $P[T]=P[T]$.
On the other hand, two variables are uncorrelated if their covariance is zero: $$\mathrm{cov}[T,R]=\langle TR\rangle-\langle T\rangle\langle R\rangle=0.$$ If the variables are independent then this obviously holds.
However, in general, uncorrelated does not imply independent. To see this, write $p=P[R=1]$ and compute the expectations as \begin{align} \langle T\rangle &= \langle T\mid R=0\rangle(1-p) + \langle T\mid R=1\rangle p \\ \langle R\rangle &= (0)(1-p) + (1)p = p \\ \langle TR\rangle &= \langle T\cdot 0\mid R=0\rangle(1-p) + \langle T\cdot 1\mid R=1\rangle p = \langle T\mid R=1\rangle p \end{align} so that $\mathrm{cov}[T,R]=p\big(\langle T\mid R=1\rangle-\langle T\rangle\big)$. Hence (assuming $0<p<1$) all that is required for the variables to be uncorrelated is $$\langle T\mid R=0\rangle=\langle T\mid R=1\rangle=\langle T\rangle,$$ i.e. the conditional expectation of $T$ is the same no matter the value of $R$.
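Here is a minimal numerical sketch of exactly this situation (the particular conditionals, $\mathrm{Exp}(1)$ and $\mathrm{Uniform}(0,2)$, are hypothetical choices that happen to share the mean $1$): the covariance vanishes, yet the conditional distributions clearly differ, so the variables are dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1_000_000, 0.5

r = (rng.random(n) < p).astype(float)
# Same conditional mean <T|R=0> = <T|R=1> = 1, but different distributions:
t = np.where(r == 1,
             rng.uniform(0.0, 2.0, n),    # T | R=1 ~ Uniform(0, 2), mean 1
             rng.exponential(1.0, n))     # T | R=0 ~ Exp(1),        mean 1

print(np.cov(t, r)[0, 1])                # ~ 0: uncorrelated
print(t[r == 0].var(), t[r == 1].var())  # ~1 vs ~1/3: dependent nonetheless
```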
Obviously this integral condition is much less restrictive than the requirement $$P[T\mid R=0]=P[T\mid R=1]=P[T]$$ that the conditional probability distributions are entirely independent of $R$ in a pointwise sense.
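To make the contrast concrete, the same hypothetical pair of conditionals from above passes the integral (equal-means) condition while failing the pointwise (equal-distributions) one; a two-sample Kolmogorov-Smirnov statistic is one way to see the latter:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000
t0 = rng.exponential(1.0, n)      # T | R=0 ~ Exp(1)
t1 = rng.uniform(0.0, 2.0, n)     # T | R=1 ~ Uniform(0, 2)

print(t0.mean(), t1.mean())              # integral condition holds: both ~ 1
print(stats.ks_2samp(t0, t1).statistic)  # ~ 0.15: conditional CDFs clearly differ
```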