Short and sweet: I'd like to model $n$ random variables representing price changes of individual assets. Each of these should be distributed as a log normal variable with a median of 1. Is there a way to generate these variables with a given correlation matrix?
Longer explanation: I'd like to model a price change from time $t_0$ to $t_1$, assuming the price grows with probability 50%. For a single asset, we could model the growth as lognormal:
$$G \sim \text{lognormal}(\mu, \sigma)$$
Setting $\mu = 0$, the median will be $\exp(\mu) = 1$, meaning the price increases ($G > 1$) with probability 50%. We can also obtain any variance we like by adjusting $\sigma$.
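As a quick sanity check of the single-asset setup, here is a minimal sketch (the value of $\sigma$ is an arbitrary illustrative choice) confirming that $\mu = 0$ gives a median near 1, a 50% chance of growth, and a variance controlled by $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)

# lognormal(mu=0, sigma): the exponential of a normal with mean 0.
sigma = 0.25
samples = np.exp(rng.normal(loc=0.0, scale=sigma, size=100_000))

print(np.median(samples))      # close to exp(0) = 1
print(np.mean(samples > 1.0))  # fraction of "up" moves, close to 0.5
print(np.var(samples))         # theory: (exp(sigma^2) - 1) * exp(sigma^2)
```

The theoretical variance follows from the lognormal moments with $\mu = 0$: $\operatorname{Var}(G) = (e^{\sigma^2} - 1)\,e^{\sigma^2}$.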
Now let's assume we have $n$ such assets that can grow and that they are correlated according to a correlation matrix $A$. The traditional way to generate correlated random variables is to generate independent variables and then transform them using a Cholesky decomposition.
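For reference, the standard Cholesky construction on normal variables looks like this (a minimal sketch; the $2 \times 2$ target matrix is an illustrative stand-in for $A$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative target correlation matrix (must be positive definite).
A = np.array([[1.0, 0.6],
              [0.6, 1.0]])

L = np.linalg.cholesky(A)              # A = L @ L.T
Z = rng.standard_normal((100_000, 2))  # independent standard normals
X = Z @ L.T                            # rows now have correlation matrix A

print(np.corrcoef(X, rowvar=False))
```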
In this case that wouldn't work: applying the Cholesky transform directly to the lognormal draws would disrupt our goal of having the median equal to 1. Moreover, a linear combination of lognormal variables is not itself lognormal.
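To illustrate the problem, a quick check (again using an illustrative $2 \times 2$ correlation matrix) shows that mixing the lognormal draws themselves pushes the median of the mixed coordinate well away from 1:

```python
import numpy as np

rng = np.random.default_rng(2)

A = np.array([[1.0, 0.6],
              [0.6, 1.0]])
L = np.linalg.cholesky(A)

# Start from independent lognormal(0, 1) draws, each with median 1 ...
G = np.exp(rng.standard_normal((100_000, 2)))

# ... and apply the Cholesky mixing directly to them.
mixed = G @ L.T

# The first coordinate is untouched (median ~ 1), but the second is a
# positive-weight combination of two lognormals, so its median exceeds 1.
print(np.median(mixed, axis=0))
```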
Questions:
- Is there an algorithm to generate such numbers?
- Is this a reasonable model, and a reasonable set of constraints, for stock price movement from $t_0$ to $t_1$, or is a different set of assumptions commonly used?