I am investigating whether a CNN-based sentence classifier can also be used for sentence generation. Say we are classifying news article titles (classes such as sports, business, etc.). The question is whether the same model can be used to generate or reconstruct the original input sentences (the word sequences), using de-convolution (sometimes called transposed convolution).
The classic CNN-based model for sentence classification (convolutions over the word-embedding matrix followed by max-over-time pooling) looks roughly like this:
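For concreteness, here is a minimal PyTorch sketch of that kind of classifier (in the spirit of Kim's 2014 model); the layer sizes, class count, and filter widths are just illustrative assumptions, not part of the original question:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceCNN(nn.Module):
    """Illustrative CNN sentence classifier: embed -> parallel 1-D convs -> max-over-time pool -> linear."""
    def __init__(self, vocab_size, embed_dim=128, num_classes=4,
                 filter_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in filter_sizes]
        )
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, token_ids):                    # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)    # (batch, embed_dim, seq_len)
        # max-over-time pooling keeps only the strongest response per filter,
        # discarding where in the sentence it occurred
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))     # class logits
```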
The above model is equipped with a (max) pooling layer. Since max pooling is non-invertible and the positions of the word vectors are lost, what are the alternative CNN-based models that can be easily de-convolved?
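To make the obstacle concrete, here is a small PyTorch sketch of a conv + pool "encoder" and its approximate inverse (sizes are arbitrary assumptions). Standard windowed max pooling can only be (partially) undone if the argmax indices are kept around, and a transposed convolution merely reverses the shape change of the convolution with its own learned weights, so this is a reconstruction of positions and shapes, not of the exact word vectors. Max-over-time pooling, by contrast, keeps no indices at all, which is exactly the problem:

```python
import torch
import torch.nn as nn

embed_dim, num_filters, kernel = 64, 32, 3

conv   = nn.Conv1d(embed_dim, num_filters, kernel, padding=1)
pool   = nn.MaxPool1d(2, return_indices=True)   # keep indices so unpooling is possible
unpool = nn.MaxUnpool1d(2)
deconv = nn.ConvTranspose1d(num_filters, embed_dim, kernel, padding=1)

x = torch.randn(8, embed_dim, 20)     # (batch, embed_dim, seq_len) of word vectors
h, idx = pool(conv(x))                # h: (8, 32, 10), idx holds the lost positions
x_rec = deconv(unpool(h, idx))        # back to (8, 64, 20), an approximation of x
print(x.shape, x_rec.shape)
```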
Is there a CNN model for NLP without the pooling layer? In that case, how does one deal with variable-length input? Is dilated convolution the way to go to overcome the aforementioned problem?
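On the variable-length point, a pooling-free stack of dilated 1-D convolutions does at least keep the output as long as the input, since "same"-style padding preserves sequence length while the growing dilation enlarges the receptive field. A minimal sketch (hidden sizes and dilation schedule are assumptions for illustration):

```python
import torch
import torch.nn as nn

embed_dim, hidden = 64, 64
layers = []
for dilation in (1, 2, 4):
    layers += [
        nn.Conv1d(embed_dim if dilation == 1 else hidden, hidden,
                  kernel_size=3, dilation=dilation, padding=dilation),
        nn.ReLU(),
    ]
encoder = nn.Sequential(*layers)

for seq_len in (7, 15, 40):                    # works for any sentence length
    x = torch.randn(1, embed_dim, seq_len)     # (batch, embed_dim, seq_len)
    print(encoder(x).shape)                    # (1, hidden, seq_len) in every case
```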
Any pointers or references are highly appreciated.