Let's take $\sigma=1$ and ignore the division by $k;$ these simplifications will require us to multiply the answer by $|\sigma|/k$ (which I leave up to you). Thus we seek the expectation of $\left|Z(n,k)\right| $ where
$$Z(n,k) = \sum_{i\in\Phi_1} s_i - \sum_{i\in\Phi_2}s_i.$$
Because $-s_i$ and $s_i$ have the same distribution, the expression inside the absolute value has the same distribution as
$$\sum_{i\in\Phi_1\oplus\Phi_2}s_i$$
(writing $\Phi_1\oplus\Phi_2$ for the symmetric difference $\left(\Phi_1\cup \Phi_2\right) \setminus \left(\Phi_1\cap\Phi_2\right)$), because the values in the intersection $\Phi_1\cap\Phi_2$ cancel out in the definition of $Z(n,k).$
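A quick numerical illustration of this equality in distribution (a sketch only; the object names are my own) is to simulate $Z$ both ways and compare a few empirical quantiles:
set.seed(17)
n <- 5; k <- 3
sim <- replicate(1e5, {
  x <- rnorm(n)
  i1 <- sample.int(n, k); i2 <- sample.int(n, k)
  c(Original = sum(x[i1]) - sum(x[i2]),                           # first formula
    SymDiff  = sum(x[setdiff(union(i1, i2), intersect(i1, i2))])) # symmetric difference
})
apply(sim, 1, quantile, probs = c(0.1, 0.25, 0.5, 0.75, 0.9))     # the two columns should agree closely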
Conditional on $(\Phi_1,\Phi_2),$ since $Z$ is the sum of independent Normal variables, its distribution is Normal with mean $0$ and variance $2(k-j)$ where $j$ is the cardinality of $\Phi_1\cap\Phi_2.$ (Notice that the component for $j=k$ is singular: it is an atom at $0.$)
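As a sanity check of this conditional variance, one can fix a particular pair of subsets with overlap $j$ and compare the simulated variance of $Z$ with $2(k-j)$ (a sketch; the subsets and names below are arbitrary choices):
set.seed(17)
n <- 5; k <- 3
Phi1 <- c(1, 2, 3)                      # two fixed subsets with overlap j = 1
Phi2 <- c(3, 4, 5)
j <- length(intersect(Phi1, Phi2))
Z.cond <- replicate(1e5, {
  s <- rnorm(n)
  sum(s[Phi1]) - sum(s[Phi2])
})
c(Simulated = var(Z.cond), Theoretical = 2 * (k - j))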
Consequently, the distribution of $Z$ is a mixture of these Normal distributions. The weights in the mixture are the chances of $j$ given by the hypergeometric distribution
$$\Pr(|\Phi_1\cap\Phi_2|=j) = \frac{\binom{k}{j}\binom{n-k}{k-j}}{\binom{n}{k}} =: p_{n,k}(j).$$
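These weights coincide with R's built-in hypergeometric density dhyper, which offers a convenient check (not needed for the derivation):
n <- 5; k <- 3; j <- 0:k
p <- exp(lchoose(k, j) + lchoose(n - k, k - j) - lchoose(n, k))
all.equal(p, dhyper(j, k, n - k, k))    # TRUE: the same hypergeometric weights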
The distribution of $|Z(n,k)|$ thus is a mixture of variables $Z_j(k),$ $j=0, 1, \ldots, k,$ that are $\sqrt{2(k-j)}$ times (independent copies of) $\chi(1)$ variables. Its expectation therefore is
$$E\left[\left|Z(n,k)\right|\right] = \sum_{j=0}^k p_{n,k}(j) \sqrt{2(k-j)} \sqrt{2/\pi} = \frac{2}{\sqrt{\pi}} \sum_{j=0}^k \sqrt{k-j}\, p_{n,k}(j).$$
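Before looking at the full distribution, here is a quick check of this expectation against a direct Monte Carlo estimate for $n=5,$ $k=3$ (a sketch; the complete simulation code appears in the Appendix):
set.seed(17)
n <- 5; k <- 3
Z <- replicate(1e5, {
  x <- rnorm(n)
  sum(x[sample.int(n, k)]) - sum(x[sample.int(n, k)])
})
j <- 0:k
p <- exp(lchoose(k, j) + lchoose(n - k, k - j) - lchoose(n, k))
c(Simulated = mean(abs(Z)), Formula = 2 / sqrt(pi) * sum(sqrt(k - j) * p))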
As a test, we may simulate many values of $Z(n,k)$ directly from either of the first two formulas and compare their distribution to the mixture. Here, for instance, is the cumulative distribution of $5000$ simulated values on which the mixture CDF is overplotted in red:
[Figure: empirical CDF of 5000 simulated values of Z(5,3), with the mixture CDF overplotted in red.]
The agreement is excellent.
Finally, with the formula for the expected absolute value available, we may plot $E\left[\left|Z(n,k)\right|\right]$ for $k=0, 1, \ldots, n.$ Here, for instance, is a plot for $n=25:$
[Figure: E[|Z(25,k)|] plotted against k = 0, 1, ..., 25.]
Remarks
This analysis readily extends to the case where $\Phi_1$ and $\Phi_2$ are of different sizes $k_1$ and $k_2:$ replace $2(k-j) = \left|\Phi_1\oplus\Phi_2\right|$ by $(k_1-j)+(k_2-j)$ at the outset and use
$$p_{n;k_1,k_2}(j)=\Pr\left(\left|\Phi_1\cap\Phi_2\right| = j\right) = \frac{\binom{k_1}{j}\binom{n-k_1}{k_2-j}}{\binom{n}{k_2}}$$
for the mixture weights, taking the sum over all $j$ for which the binomial coefficients are nonzero.
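As a sketch of how this generalization might be coded (the function name eZ2 is mine; the conditional standard deviation $\sqrt{(k_1-j)+(k_2-j)}$ simply replaces $\sqrt{2(k-j)}$):
eZ2 <- function(n, k1, k2) {
  j <- max(0, k1 + k2 - n):min(k1, k2)  # the j with nonzero binomial coefficients
  p <- exp(lchoose(k1, j) + lchoose(n - k1, k2 - j) - lchoose(n, k2))
  sqrt(2 / pi) * sum(sqrt((k1 - j) + (k2 - j)) * p)
}
eZ2(5, 3, 3)                            # reproduces the equal-size result for n = 5, k = 3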
The atom (discrete component) in the distribution of $Z$ occurs only when $k_1=k_2=k.$ Its weight is the chance of complete cancellation where $\Phi_1=\Phi_2,$ given by $$p_{n,k}(k) = 1/\binom{n}{k}.$$ In the figure (showing the CDF), this is the height of the vertical jump at $Z=0,$ there equal to $1/\binom{5}{3}=1/10.$
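That weight is easy to confirm by simulating the chance that the two subsets coincide, using the same $n=5$ and $k=3$ as the figure:
set.seed(17)
n <- 5; k <- 3
same <- replicate(1e5, setequal(sample.int(n, k), sample.int(n, k)))
c(Simulated = mean(same), Theoretical = 1 / choose(n, k))   # both near 1/10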
We could even go so far as to choose fixed coefficient vectors $\alpha_i$ and $\beta_i,$ let the $s_i$ have an arbitrary distribution (with possibly nonzero mean), and consider
$$Z(n,k;\alpha,\beta) = \sum_{i\in\Phi_1}\alpha_i s_i + \sum_{i\in\Phi_2}\beta_i s_i.$$
The question concerns the case $\alpha_i=1/k$ and $\beta_i=-1/k$ for all $i.$ With general coefficients the preliminary simplification of factoring out a common factor is no longer available, but the analysis doesn't essentially change: the strategy of conditioning on $(\Phi_1,\Phi_2)$ and breaking the union of the samples into $\Phi_1\setminus\Phi_2,$ $\Phi_2\setminus\Phi_1,$ and $\Phi_1\cap\Phi_2$ still works. I leave the algebraic complications to the interested reader.
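For readers who would rather check by simulation than by algebra, here is a minimal Monte Carlo sketch of this general setting; the coefficient vectors and the (exponential) distribution of the $s_i$ are arbitrary choices made purely for illustration:
set.seed(17)
n <- 8; k <- 3
alpha <- runif(n)                       # hypothetical fixed coefficient vectors
beta  <- -runif(n)
Z <- replicate(1e5, {
  s <- rexp(n)                          # an arbitrary distribution with nonzero mean
  i1 <- sample.int(n, k)
  i2 <- sample.int(n, k)
  sum(alpha[i1] * s[i1]) + sum(beta[i2] * s[i2])
})
mean(abs(Z))                            # Monte Carlo estimate of E|Z(n, k; alpha, beta)|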
Appendix
Here is R code for the simulation in the first figure:
n <- 5
k <- 3
#
# Random draws of Z
#
set.seed(17)
Z <- replicate(5e3, {
  x <- rnorm(n)
  i1 <- sample.int(n, k)
  i2 <- sample.int(n, k)
  sum(x[i1]) - sum(x[i2])                             # Original formula
  # sum(x[setdiff(union(i1,i2), intersect(i1,i2))])   # Second formula
})
#
# CDF of Z: the hypergeometric mixture of Normal(0, 2(k-j)) components
#
pf <- function(x, n, k) {
  lp <- function(j) lchoose(k,j) + lchoose(n-k,k-j) - lchoose(n,k)  # log weights
  z <- sapply(0:k, function(j) exp(lp(j) + pnorm(x, 0, sqrt(2*(k-j)), log=TRUE)))
  rowSums(matrix(z, ncol=k+1))
}
#
# Plots.  The mixture CDF is drawn in two pieces so that no line is drawn
# through the jump at Z = 0.
#
plot(ecdf(Z), main=paste0("Simulated values of Z(",n,",",k,")"),
     cex.main=1, xlab="Z", ylab="Probability")
curve(pf(x, n, k), xlim=c(min(Z), -1e-15), add=TRUE, col="Red", lwd=2, n=1001)
curve(pf(x, n, k), xlim=c(1e-15, max(Z)), add=TRUE, col="Red", lwd=2, n=1001)
Here is R code for the second figure, showing the direct calculation of the expectation:
eZ <- Vectorize(function(n, k) {
  p <- function(j) exp(lchoose(k,j) + lchoose(n-k,k-j) - lchoose(n,k))
  j <- 0:k
  2 / sqrt(pi) * sum(sqrt(k-j) * p(j))
}, "k")

n <- 25
plot(0:n, eZ(n, 0:n), type="h", ylab="Value",
     main=expression(E*group("[", list(italic(Z)(25,k)), "]")), cex.main=1,
     bty="n", xlab=expression(italic(k)))