Submit your reviews here
Your student ID:
Your email:
Paper title:
Please select
(Week 1 Wed) A Neural Probabilistic Language Model
(Week 2 Mon) Distributed Representations of Words and Phrases and their Compositionality
(Week 2 Wed) GloVe: Global Vectors for Word Representation
(Week 3 Wed) Text Classification from Labeled and Unlabeled Documents using EM
(Week 4 Mon) A Maximum Entropy Approach to Natural Language Processing
(Week 4 Wed) Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms
(Week 6 Mon) Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
(Week 6 Wed) Non-Projective Dependency Parsing using Spanning-Tree Algorithms
(Week 7 Wed) A Fast and Accurate Dependency Parser using Neural Networks
(Week 8 Mon) Neural Machine Translation by Jointly Learning to Align and Translate
(Week 8 Wed) Attention Is All You Need
(Week 9 Mon) BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Your review: