I know that if $\pmb{X}_1$ and $\pmb{X}_2$ are independent copies of an $n \times 1$ random vector $\pmb{X}$, then $\pmb{X}$ is said to be sum stable in $\mathbb{R}^n$ if, for every pair of positive scalars $a$ and $b$, there exist a positive scalar $c$ and a real vector $\pmb{d}$ such that $a\pmb{X}_1 + b\pmb{X}_2 \stackrel{D}{=} c\pmb{X} + \pmb{d}$.
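As a concrete sanity check of this definition (my own illustration, with arbitrarily chosen $\mu$, $\Sigma$, $a$, $b$): the multivariate normal is sum stable. If $\pmb{X} \sim N(\mu, \Sigma)$, then $a\pmb{X}_1 + b\pmb{X}_2 \sim N\big((a+b)\mu,\ (a^2+b^2)\Sigma\big)$, which matches $c\pmb{X} + \pmb{d}$ with $c = \sqrt{a^2+b^2}$ and $\pmb{d} = (a+b-c)\mu$. Since both sides are Gaussian, agreement of mean and covariance gives equality in distribution:

```python
import numpy as np

# Arbitrary example parameters (not from the question).
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
a, b = 0.7, 1.3

# Mean and covariance of a*X1 + b*X2 (X1, X2 independent copies of X):
lhs_mean = (a + b) * mu
lhs_cov = (a**2 + b**2) * Sigma

# Mean and covariance of c*X + d with the claimed c and d:
c = np.sqrt(a**2 + b**2)
d = (a + b - c) * mu
rhs_mean = c * mu + d
rhs_cov = c**2 * Sigma

# Both sides are Gaussian, so matching first two moments
# implies equality in distribution.
assert np.allclose(lhs_mean, rhs_mean)
assert np.allclose(lhs_cov, rhs_cov)
```

This only verifies the Gaussian case of the scalar definition, of course; it says nothing yet about the matrix version asked about below.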
My question: Does the definition generalize to $A\pmb{X}_1 + B\pmb{X}_2 \stackrel{D}{=} C\pmb{X} + \pmb{d}$, where $A$, $B$, and $C$ are $n \times n$ matrices? If so, what conditions must $A$, $B$, and $C$ satisfy? A short proof or proof sketch would be much appreciated.