This is a brief summary of the paper Convolutional Neural Networks for Sentence Classification (Kim, EMNLP 2014), written to help me study and organize what I read.

They propose a CNN architecture built on top of two sets of pre-trained word2vec vectors: a static one and a non-static (fine-tuned) one.

Their architecture is the following:

(Model architecture figure, Kim, EMNLP 2014)

Of the two sets of pre-trained word vectors, one copy of the word2vec embeddings is kept static throughout training, while the other is fine-tuned via backpropagation.
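
Below is a minimal sketch of this multichannel setup, assuming PyTorch. It roughly follows the hyperparameters reported in the paper (filter windows of 3/4/5 with 100 feature maps each, dropout 0.5), but the code itself is my own illustration, not the authors' implementation.

```python
# Sketch of a multichannel CNN for sentence classification (assumes PyTorch).
# `pretrained_vectors` is a (vocab_size, embed_dim) tensor of word2vec weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultichannelCNN(nn.Module):
    def __init__(self, pretrained_vectors, num_classes,
                 filter_sizes=(3, 4, 5), num_filters=100, dropout=0.5):
        super().__init__()
        vocab_size, embed_dim = pretrained_vectors.shape
        # Static channel: word2vec vectors kept fixed during training.
        self.static_emb = nn.Embedding.from_pretrained(pretrained_vectors, freeze=True)
        # Non-static channel: same initialization, fine-tuned via backpropagation.
        self.nonstatic_emb = nn.Embedding.from_pretrained(pretrained_vectors, freeze=False)
        # One convolution per filter size, applied across both channels.
        self.convs = nn.ModuleList(
            [nn.Conv2d(2, num_filters, (fs, embed_dim)) for fs in filter_sizes]
        )
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, token_ids):  # token_ids: (batch, seq_len)
        # Stack the two embedding channels: (batch, 2, seq_len, embed_dim).
        x = torch.stack(
            [self.static_emb(token_ids), self.nonstatic_emb(token_ids)], dim=1
        )
        # Convolution + ReLU, then max-over-time pooling per filter size.
        pooled = [
            F.relu(conv(x)).squeeze(3).max(dim=2).values for conv in self.convs
        ]
        features = torch.cat(pooled, dim=1)  # (batch, num_filters * len(filter_sizes))
        return self.fc(self.dropout(features))
```

The key point of the multichannel variant is that both channels start from the same word2vec weights, but gradients only update the non-static copy, so the model can adapt word meanings to the task while still keeping the original vectors as a reference.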

The following shows example top-4 nearest neighbor words, ranked by cosine similarity, for vectors in the static channel (left) and for the fine-tuned vectors in the non-static channel (right) of the multichannel model.

(Nearest-neighbor table, Kim, EMNLP 2014)
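
A small sketch (not from the paper) of how such a top-k nearest-neighbor comparison can be computed with plain NumPy; `vectors` is a hypothetical dict mapping words to embedding arrays, e.g. loaded from either channel's embedding table.

```python
# Top-k nearest neighbors of a query word by cosine similarity (assumes NumPy).
import numpy as np

def top_k_neighbors(query, vectors, k=4):
    words = [w for w in vectors if w != query]
    matrix = np.stack([vectors[w] for w in words])  # (N, dim)
    q = vectors[query]
    # Cosine similarity = dot product divided by the product of L2 norms.
    sims = (matrix @ q) / (np.linalg.norm(matrix, axis=1) * np.linalg.norm(q) + 1e-8)
    order = np.argsort(-sims)[:k]
    return [(words[i], float(sims[i])) for i in order]

# Running this on the static vs. the fine-tuned embedding table for the same
# query word would reproduce a side-by-side comparison like the table above.
```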

Reference

Kim, Y. (2014). Convolutional Neural Networks for Sentence Classification. In Proceedings of EMNLP 2014.