I have just read the paper https://arxiv.org/pdf/1701.00160.pdf, which is a tutorial on GANs, and I would like a couple of clarifications:
1. Must the dimension of the Generator's output layer match the dimension of the Discriminator's input layer so that the composition $D(G(z))$ in equation (1) is possible? (See the sketch after this list.) $$-\frac{1}{2} \mathbb{E}_{x\sim p_{data}(x)}[\log D(x)] -\frac{1}{2}\mathbb{E}_{z\sim p_z(z)}[\log (1 - D(G(z)))] \tag{1}$$
2. The authors say that equation (1) is simply a cross-entropy cost function, but I struggle to see how it compares with the usual cross-entropy cost in equation (2): $$C = -\frac{1}{n} \sum_x [y \ln a + (1-y)\ln(1-a)] \tag{2}$$ Specifically, what does the factor of 1/2 in equation (1) represent? Is equation (1) missing the truth values, i.e. $y$, that appear in equation (2), or is $y$ equal to 1/2 in equation (1)? (My attempted reading is below.) Any clarification will be highly appreciated.
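For question 1, this is the kind of composition I have in mind, written as a minimal PyTorch sketch; the shapes (a 100-dimensional $z$ and a flattened 28x28 image) are just my own assumptions for illustration, not something taken from the paper:

```python
import torch
import torch.nn as nn

latent_dim = 100     # dimension of the noise vector z (my assumption)
data_dim = 28 * 28   # dimension of a real sample x, e.g. a flattened 28x28 image (my assumption)

# Generator: maps z to a point in the same space as the real data
G = nn.Sequential(
    nn.Linear(latent_dim, 256),
    nn.ReLU(),
    nn.Linear(256, data_dim),
    nn.Tanh(),
)

# Discriminator: maps a data-space vector to a probability of being real
D = nn.Sequential(
    nn.Linear(data_dim, 256),
    nn.ReLU(),
    nn.Linear(256, 1),
    nn.Sigmoid(),
)

z = torch.randn(16, latent_dim)  # a batch of 16 noise vectors
fake = G(z)                      # shape (16, data_dim): same shape as a batch of real x
prob_real = D(fake)              # shape (16, 1): D(G(z)) only works because the shapes match
print(prob_real.shape)
```

If G's last layer produced anything other than `data_dim` units, the call `D(fake)` would fail, which is why I suspect the answer is yes, but I would like to confirm.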
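For question 2, here is my best guess at how the two might be related (this is only my own attempt at a substitution, which I would like confirmed or corrected). If I take equation (2) with $a = D(x)$, $y = 1$ for a real sample and $a = D(G(z))$, $y = 0$ for a generated sample, the two cases contribute $$-[1\cdot \ln D(x) + 0 \cdot \ln(1-D(x))] = -\ln D(x), \qquad -[0 \cdot \ln D(G(z)) + 1 \cdot \ln(1 - D(G(z)))] = -\ln(1 - D(G(z))),$$ and averaging the two cases, with sample averages replaced by expectations, seems to give equation (1), with the $\tfrac{1}{2}$ coming from assuming half of the examples are real and half are generated. Is this the right reading?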