I have found some information on this problem. This question is about the proof of a theorem due to Feller, found in Volume 2 of his "An Introduction to Probability Theory and Its Applications" (pp. 278-279). Here is a restatement.
$\mathbf{Theorem.}$ Let $X_1,\dots,X_n$ be independent random variables whose distribution functions $F_i$ satisfy $1-F_i(x)\sim x^{-\alpha}L_i(x)$ as $x\to\infty$, where each $L_i$ is slowly varying at infinity. Then the convolution $G_n:=F_1\star\dots\star F_n$ has a regularly varying tail satisfying $$1-G_n(x)\sim x^{-\alpha}(L_1(x)+\dots+L_n(x)) \, .$$
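Before the proof, a concrete instance may help (the constants $c_i$ below are my own illustrative choice, not part of Feller's statement). Take Pareto-type tails $1-F_i(x)=c_i x^{-\alpha}$ for all large $x$, with $c_i>0$; then each $L_i(x)=c_i$ is eventually constant, hence slowly varying, and the theorem yields
$$
1-G_n(x)\sim (c_1+\dots+c_n)\,x^{-\alpha}\, ,
$$
so the coefficients of the common power law simply add up.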
Feller proves the case of two random variables and merely states that the general result follows by induction. By the way, his proof of the $n=2$ case is a gem.
So we already know from Feller that the theorem holds for two random variables. For the induction step, suppose that the theorem holds for $n-1$ random variables, which means that $$1-G_{n-1}(x)\sim x^{-\alpha}(L_1(x)+\dots+L_{n-1}(x)) \, .$$ Since a finite sum of slowly varying functions is itself slowly varying (see the check just below), the distribution function $G_{n-1}$ of $X_1+\dots+X_{n-1}$ satisfies the tail hypothesis of the theorem, that is, $1-G_{n-1}(x)\sim x^{-\alpha}M(x)$, where $M:=L_1+\dots+L_{n-1}$ is slowly varying. Moreover, $X_1+\dots+X_{n-1}$ is independent of $X_n$, since the $X_i$ are independent.
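Here is the promised one-line check that $M$ is slowly varying (Feller does not spell this fact out; the argument is mine). For fixed $t>0$,
$$
\frac{M(tx)}{M(x)}=\sum_{i=1}^{n-1}\frac{L_i(x)}{M(x)}\cdot\frac{L_i(tx)}{L_i(x)}\;\longrightarrow\;1\qquad(x\to\infty)\, ,
$$
because each ratio $L_i(tx)/L_i(x)\to 1$ by slow variation, while the weights $L_i(x)/M(x)$ are nonnegative and sum to $1$, so the whole expression is a convex combination of quantities all tending to $1$.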
By the associativity of the convolution, we know that
$$
G_n = F_1\star\dots\star F_{n-1}\star F_n = (F_1\star\dots\star F_{n-1})\star F_n = G_{n-1}\star F_n\, ,
$$
and we are back to the case of two random variables satisfying the hypotheses of the theorem, which Feller has already proved. Therefore,
$$
1 - G_n(x) \sim x^{-\alpha}(M(x)+L_n(x)) = x^{-\alpha}(L_1(x)+\dots+L_{n-1}(x)+L_n(x)) \, .
$$
Hence the tail of $G_n$ has the required form, the theorem holds for $n$ random variables, and the induction step is complete.
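As a quick sanity check of the conclusion (a standard consequence, not spelled out by Feller): in the iid case $F_1=\dots=F_n=F$, so that $L_1=\dots=L_n=L$, the theorem reduces to
$$
P(X_1+\dots+X_n>x)=1-G_n(x)\sim n\,x^{-\alpha}L(x)\sim n\,P(X_1>x)\, ,
$$
the familiar "single big jump" behaviour of sums with regularly varying tails.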