I am trying to reimplement a variational autoencoder. The loss function has two terms: a reconstruction loss and a KL-divergence term. The KL divergence is defined as $$ D_{KL}(P\,\|\,Q) = -\sum_{x\in X} P(x)\log\bigg(\frac{Q(x)}{P(x)}\bigg)$$
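To make sure I understand the definition itself, here is a minimal numerical sketch of it (the discrete distributions `P` and `Q` below are made-up examples, not from the VAE):

```python
import math

# Two made-up discrete distributions over the same 3-point support.
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

# D_KL(P||Q) = -sum_x P(x) * log(Q(x) / P(x))
kl = -sum(p * math.log(q / p) for p, q in zip(P, Q))
print(kl)  # a small non-negative number
```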
However, in the code here it says:

```python
kl_loss = - 0.5 * K.mean(1 + z_log_sigma - K.square(z_mean) - K.exp(z_log_sigma), axis=-1)
```
This formula for the KL divergence looks completely different from the expression in that line of code. Can somebody familiar with variational autoencoders explain how the two are related?