While the "Darth Vader rule" (a silly name) applies to any non-negative random variable, I am going to simplify the analysis by looking only at continuous random variables. Extension to discrete and mixed random variables should also be possible, but I will not pursue that here. In a related answer here we show a partial extension of the expectation rule. Specifically, it is shown that for an arbitrary continuous random variable $X$ and any constant $a \in \mathbb{R}$ you have the general rule:$^\dagger$
$$\mathbb{E}[\max(X-a,0)] = \int \limits_{a}^\infty (1-F_X(x)) \ dx.$$
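To see this rule in action, here is a minimal Python sketch (using SciPy, with a standard normal $X$ and $a=-1$ as arbitrary illustrative choices) that compares a Monte Carlo estimate of the left-hand side against numerical integration of the survival function:

```python
import numpy as np
from scipy import stats, integrate

a = -1.0
rng = np.random.default_rng(0)
samples = rng.standard_normal(1_000_000)  # X ~ N(0, 1), an arbitrary choice

# Left-hand side: Monte Carlo estimate of E[max(X - a, 0)]
lhs = np.maximum(samples - a, 0).mean()

# Right-hand side: numerical integral of the survival function 1 - F_X(x) over [a, inf)
rhs, _ = integrate.quad(lambda x: stats.norm.sf(x), a, np.inf)

print(lhs, rhs)  # both approximately 1.083 for a = -1
```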
We can go further by writing the expectation of an arbitrary continuous random variable $X$ as follows (the first step holds by monotone convergence, assuming the expectation is well-defined, and we take $a < 0$ throughout):
$$\begin{align}
\mathbb{E}[X]
&= \lim_{a\rightarrow -\infty} \mathbb{E}[\max(X,a)] \\[12pt]
&= \lim_{a\rightarrow -\infty} \Big( a+\mathbb{E}[\max(X-a,0)] \Big) \\[12pt]
&= \lim_{a\rightarrow -\infty} \Big( - \int \limits_{a}^0 \ dx + \mathbb{E}[\max(X-a,0)] \Big) \\[12pt]
&= \lim_{a\rightarrow -\infty} \Bigg( - \int \limits_{a}^\infty \mathbb{I}(x < 0) \ dx + \int \limits_{a}^\infty (1-F_X(x)) \ dx \Bigg) \\[6pt]
&= \lim_{a\rightarrow -\infty} \Bigg( \int \limits_{a}^\infty (1-\mathbb{I}(x < 0)-F_X(x)) \ dx \Bigg) \\[6pt]
&= \lim_{a\rightarrow -\infty} \int \limits_{a}^\infty (\mathbb{I}(x \geqslant 0)-F_X(x)) \ dx \\[6pt]
&= \int \limits_{-\infty}^\infty (\mathbb{I}(x \geqslant 0)-F_X(x)) \ dx. \\[6pt]
\end{align}$$
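The limiting argument above can also be checked numerically. The following sketch (with $X \sim \text{N}(\mu, 1)$ and $\mu = 0.7$ as an arbitrary choice) shows $a + \int_a^\infty (1-F_X(x)) \ dx$ approaching $\mathbb{E}[X] = \mu$ as $a$ becomes more negative:

```python
import numpy as np
from scipy import stats, integrate

mu = 0.7  # E[X] for X ~ N(mu, 1), an arbitrary choice

for a in (-1.0, -3.0, -6.0, -10.0):
    # a + E[max(X - a, 0)], computed via the integral of the survival function over [a, inf)
    tail, _ = integrate.quad(lambda x: stats.norm.sf(x, loc=mu), a, np.inf)
    print(a, a + tail)  # second column converges to mu = 0.7
```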
In cases where the individual integrals are convergent, the final integral expression above can be written in the simpler form:
$$\mathbb{E}[X] = \int \limits_0^\infty (1-F_X(x)) \ dx - \int \limits_0^\infty F_X(-x) \ dx.$$
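As a rough numerical check of this form, here is a sketch using a normal random variable that can be negative (arbitrarily taking $X \sim \text{N}(-0.5, 2^2)$), for which the two integrals should recombine to give $\mathbb{E}[X] = -0.5$:

```python
import numpy as np
from scipy import stats, integrate

X = stats.norm(loc=-0.5, scale=2.0)  # E[X] = -0.5, an arbitrary choice

pos, _ = integrate.quad(X.sf, 0, np.inf)                 # integral of 1 - F_X(x) over [0, inf)
neg, _ = integrate.quad(lambda x: X.cdf(-x), 0, np.inf)  # integral of F_X(-x) over [0, inf)

print(pos - neg)  # approximately -0.5 = E[X]
```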
This integral rule extends the Darth Vader rule beyond continuous non-negative random variables. (The extension for discrete random variables is similar, but you have to be a bit more careful with the boundaries of the integrals.) In the special case where $X$ is continuous and non-negative we have $F_X(-x) = 0$ for all $x > 0$, so the second term in this equation vanishes, giving the standard expectation rule. I have not seen this integral expression in any textbooks or papers, so it does not seem to be one that is used much (if at all?) in statistical practice. Nevertheless, it does provide one possible extension of the standard integral rule to deal with random variables that can be negative.
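As a quick sanity check, for a continuous random variable that is symmetric about zero with a finite mean (e.g., the standard normal) we have $1-F_X(x) = F_X(-x)$, so the two integrals cancel and the rule correctly gives:
$$\mathbb{E}[X] = \int \limits_0^\infty F_X(-x) \ dx - \int \limits_0^\infty F_X(-x) \ dx = 0.$$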
$^\dagger$ In the special case where $X$ is non-negative and $a=0$ this reduces down to the standard expectation rule for non-negative random variables shown in the question.