I have a random symmetric matrix $ A \in \mathbb{R}^{M \times M}$ and a random vector $b \in \mathbb{R}^M$. I also have access to expressions for the mean and variance of each element of $A$ and $b$ (so $M^2 + M$ elements in total), as well as the covariance between any pair of elements in either $A$ or $b$ (so $(M^2+M)^2$ possible pairs). Many elements of $A$ and $b$ are highly correlated with one another, as they are functions of the same underlying random variables.
My question is: How can I use the above to find approximate statistics (particularly the variance) of the solution to the linear system $A x = b$, assuming that $A$ is non-singular? Obviously the solution is explicitly given by $x = A^{-1} b$, but it is unclear what kind of distribution over $x$ is induced by a particular distribution of $A, b$ (and unfortunately I only have access to the first and second moments of $A, b$).
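For reference, here is a Monte Carlo sketch of what I mean by "induced distribution of $x$": it *assumes* $[\operatorname{vec}(A); b]$ is jointly Gaussian with the known mean and covariance (an assumption, since only the first two moments are actually available), samples from it, and solves each realization. The function name and Gaussian assumption are mine, not from any reference:

```python
import numpy as np

def mc_solution_stats(A_mean, b_mean, Sigma, n_samples=10_000, rng=None):
    """Monte Carlo estimate of the mean and covariance of x = A^{-1} b.

    Assumes (hypothetically) that the stacked vector [vec(A); b] is jointly
    Gaussian with mean [vec(A_mean); b_mean] and covariance Sigma, where
    vec() stacks columns (column-major / Fortran order).
    """
    rng = np.random.default_rng() if rng is None else rng
    M = b_mean.size
    mu = np.concatenate([A_mean.flatten(order="F"), b_mean])
    samples = rng.multivariate_normal(mu, Sigma, size=n_samples)
    xs = np.empty((n_samples, M))
    for i, s in enumerate(samples):
        A = s[: M * M].reshape(M, M, order="F")
        xs[i] = np.linalg.solve(A, s[M * M :])
    return xs.mean(axis=0), np.cov(xs, rowvar=False)
```

This is expensive for large $M$ and rests on a distributional assumption the moments don't justify, which is why I'm hoping for a moment-based (e.g. Taylor-expansion) approximation instead.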
I found a two-page PDF (https://www.stat.cmu.edu/~hseltman/files/ratio.pdf) describing the statistics of a quotient of two random variables. There, Taylor expansions were used to derive approximate statistics (mean and variance) of the quotient, given the mean and variance of the numerator and denominator. Is it possible to generalize this to matrix inversion / solutions of linear systems, perhaps using multivariate Taylor expansions?
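To make the multivariate idea concrete, here is a sketch of the first-order version (the delta method): linearizing $x = A^{-1}b$ around the means gives $\delta x \approx \bar{A}^{-1}(\delta b - \delta A\, \bar{x})$, so the Jacobian with respect to $[\operatorname{vec}(\delta A); \delta b]$ is $J = [-(\bar{x}^\top \otimes \bar{A}^{-1}),\; \bar{A}^{-1}]$ and $\operatorname{Cov}(x) \approx J \Sigma J^\top$. The function name and the layout of $\Sigma$ (column-major $\operatorname{vec}$, with any symmetry of $A$ already encoded as perfect correlation between $A_{ij}$ and $A_{ji}$) are my own conventions:

```python
import numpy as np

def delta_method_stats(A_mean, b_mean, Sigma):
    """First-order (delta-method) mean and covariance of x = A^{-1} b.

    Sigma is the (M^2 + M) x (M^2 + M) covariance of [vec(A); b], with
    vec() stacking columns (Fortran order) so that the Kronecker identity
    (x^T kron B) vec(dA) = B @ dA @ x holds.
    """
    M = b_mean.size
    x_mean = np.linalg.solve(A_mean, b_mean)   # first-order mean estimate
    A_inv = np.linalg.inv(A_mean)
    # dx ~= -A^{-1} dA x + A^{-1} db, stacked as a Jacobian wrt [vec(dA); db]
    J_A = -np.kron(x_mean[None, :], A_inv)     # shape (M, M^2)
    J = np.hstack([J_A, A_inv])                # shape (M, M^2 + M)
    x_cov = J @ Sigma @ J.T
    return x_mean, x_cov
```

This only uses the available first and second moments, and `np.diag(x_cov)` gives the element-wise variance approximation. (The quotient note's second-order mean correction would have an analogue here too, involving terms like $\bar{A}^{-1}\,\mathbb{E}[\delta A\, \bar{A}^{-1} \delta A]\,\bar{x}$, which I have omitted.)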