For example, suppose I have the product of two distributions, a * b:
import numpy as np

d = np.random.normal(1, 2, 150000) * np.random.normal(1, 3, 150000)
mean_d = np.mean(d); print(mean_d, '\t- mean d')
std_d = np.std(d); print(std_d, '\t- std d')
var_d = np.var(d); print(var_d, '\t- var d')

d = np.random.normal(0, 2, 150000) * np.random.normal(0, 3, 150000)
mean_d = np.mean(d); print(mean_d, '\t- mean d')
std_d = np.std(d); print(std_d, '\t- std d')
var_d = np.var(d); print(var_d, '\t- var d')

d = np.random.normal(2, 4, 150000) * np.random.normal(3, 6, 150000)
mean_d = np.mean(d); print(mean_d, '\t- mean d')
std_d = np.std(d); print(std_d, '\t- std d')
var_d = np.var(d); print(var_d, '\t- var d')
Result:
1.00402910382297 - mean d
7.006851762786954 - std d
49.09597162567064 - var d
-0.010013854223866369 - mean d
5.98944114495835 - std d
35.873405228919985 - var d
6.015273009526654 - mean d
29.412059046178683 - std d
865.0692173359013 - var d
The mean of the product can be calculated by multiplying the means of each distribution: mean_d = mean_a * mean_b. What is the formula for calculating the variance or standard deviation of the product?
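For independent X and Y (which holds here, since the two samples are drawn separately), the standard identity for the variance of a product is:

```latex
\operatorname{Var}(XY) = \mu_Y^2\,\sigma_X^2 + \mu_X^2\,\sigma_Y^2 + \sigma_X^2\,\sigma_Y^2,
\qquad
\sigma_{XY} = \sqrt{\operatorname{Var}(XY)}
```

The standard deviation is the square root of that variance, which is exactly what the function below computes from the sample estimates.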
Calculating using this formula:
def std_prod(x, y):
    # std of x*y for independent x and y, using sample estimates of the means and stds:
    # Var(XY) = mean(y)^2*Var(x) + mean(x)^2*Var(y) + Var(x)*Var(y)
    return np.sqrt(np.mean(y)**2*np.std(x)**2 + np.mean(x)**2*np.std(y)**2 + np.std(y)**2*np.std(x)**2)
gives slightly different results:
7.004889924524913 - std d
6.983597750949972 - std_prod d
5.9967239637288525 - std d
5.9877281028408005 - std_prod d
29.315982668663818 - std d
29.305068878687155 - std_prod d
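The small mismatch is sampling error: both np.std(d) and std_prod estimate the true value from a finite sample, so neither is exact. A minimal sketch of this, assuming the hypothetical helper name std_prod_exact: evaluating the same identity with the true parameters instead of sample estimates gives the exact answer, and the empirical std approaches it as the sample grows.

```python
import numpy as np

# Exact std of X*Y for independent X ~ N(mu_x, sd_x), Y ~ N(mu_y, sd_y),
# using the identity Var(XY) = mu_y^2*sd_x^2 + mu_x^2*sd_y^2 + sd_x^2*sd_y^2.
def std_prod_exact(mu_x, sd_x, mu_y, sd_y):
    return np.sqrt(mu_y**2 * sd_x**2 + mu_x**2 * sd_y**2 + sd_x**2 * sd_y**2)

rng = np.random.default_rng(0)
n = 1_000_000  # a larger sample shrinks the sampling error
d = rng.normal(2, 4, n) * rng.normal(3, 6, n)

print(np.std(d))                   # empirical std, close to the exact value
print(std_prod_exact(2, 4, 3, 6))  # exact: sqrt(864) ≈ 29.3939
```

With the third example's parameters (mu = 2, 3 and sd = 4, 6), the exact std is sqrt(9*16 + 4*36 + 16*36) = sqrt(864), consistent with the two estimates of roughly 29.3 above.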