The University of Sheffield
Natural Language Processing Group

NLP Reading Group

The target audience is all members of the NLP group and any other interested participants.

The meeting takes place weekly for one hour, usually on Tuesdays from 11am to 12pm.

Meetings are informal and no preparation is required, except that the moderator should read the current paper and everyone else should have at least skimmed it.

Next Meeting

Tuesday 13 December 2016

Optimization and Sampling for NLP from a Unified Viewpoint
Marc Dymetman, Guillaume Bouchard, Simon Carter

Past Meetings

Tuesday 6 December 2016

Matrix Completion has No Spurious Local Minimum
Rong Ge, Jason D. Lee, Tengyu Ma

Tuesday 29 November 2016

Compositional Semantic Parsing on Semi-Structured Tables
Panupong Pasupat and Percy Liang

Tuesday 22 November 2016

Minimum Risk Training for Neural Machine Translation
Shiqi Shen, Yong Cheng, Zhongjun He, Wei He, Hua Wu, Maosong Sun, Yang Liu

Tuesday 15 November 2016

Generation from Abstract Meaning Representation using Tree Transducers
Jeffrey Flanigan, Chris Dyer, Noah A. Smith and Jaime Carbonell

Tuesday 1 November 2016

Visual Representations for Topic Understanding and Their Effects on Manually Generated Labels. Transactions of the Association for Computational Linguistics, 2016.
Alison Smith, Tak Yeon Lee, Forough Poursabzi-Sangdeh, Leah Findlater, Jordan Boyd-Graber, and Niklas Elmqvist

Tuesday 25 October 2016

Learning to Search Better than your Teacher
Chang et al. ICML 2015

  • Talk

Tuesday 11 October 2016

A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task
Danqi Chen, Jason Bolton, Christopher D. Manning

Tuesday 4 October 2016

Ultradense Word Embeddings by Orthogonal Transformation
Sascha Rothe, Sebastian Ebert, Hinrich Schütze

Tuesday 7 June 2016

Not All Character N-grams Are Created Equal: A Study in Authorship Attribution
Upendra Sapkota, Steven Bethard, Manuel Montes-y-Gómez, and Thamar Solorio (2015)

Tuesday 31 May 2016

Relation Extraction with Matrix Factorization and Universal Schemas
Riedel, S., Yao, L., McCallum, A., and Marlin, B. M. (2013)

Tuesday 10 May 2016

Training Deterministic Parsers with Non-Deterministic Oracles. TACL.
Goldberg, Y. and Nivre, J. (2013)

  • slides

Tuesday 3 May 2016

A New Corpus and Imitation Learning Framework for Context-Dependent Semantic Parsing
Vlachos, A. and Clark, S.

Tuesday 22 April 2016

Sequence Level Training with Recurrent Neural Networks
Marc'Aurelio Ranzato, Sumit Chopra, Michael Auli, Wojciech Zaremba

Tuesday 22 March 2016

"Distributed Representations of Sentences and Documents"
Quoc Le and Tomas Mikolov

Tuesday 8 March 2016

AutoExtend: Extending Word Embeddings to Embeddings for Synsets and Lexemes
Sascha Rothe and Hinrich Schütze. ACL 2015 (best student paper)

Tuesday 23 February 2016

From Word Embeddings To Document Distances
Kusner et al.

Tuesday 16 February 2016

"Target-Dependent Twitter Sentiment Classification with Rich Automatic Features"

Tuesday 9 February 2016

"Evaluation methods for unsupervised word embeddings"

Tuesday 25 January 2016

Multi-Perspective Sentence Similarity Modeling with Convolutional Neural Networks
Hua He, Kevin Gimpel, and Jimmy Lin. EMNLP 2015

Tuesday 19 January 2016

Multilingual Image Description with Neural Sequence Models

Tuesday 12 January 2016

"Improving Distributional Similarity with Lessons Learned from Word Embeddings"

Tuesday 8 December 2015

Using Discourse Structure Improves Machine Translation Evaluation
F Guzmán, S Joty, L Màrquez, P Nakov

And here are the authors' slides

Tuesday 1 December 2015

Practical Bayesian Optimization of Machine Learning Algorithms. Advances in Neural Information Processing Systems, 2012.
Snoek, J., Larochelle, H., and Adams, R. P.

Related presentations/lecture slides:

  • http://becs.aalto.fi/en/research/bayes/courses/4613/Vik_Kamath_Presentation.pdf
  • http://drona.csa.iisc.ernet.in/~indous/Lectures-2014/slides/jasper.pdf
  • Related Video

My reading group presentation slides

Tuesday 24 November 2015

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks. ACL 2015.
Kai Sheng Tai, Richard Socher, Christopher D. Manning

  • http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-2-implementing-a-language-model-rnn-with-python-numpy-and-theano/
  • http://www.wildml.com/2015/10/recurrent-neural-networks-tutorial-part-3-backpropagation-through-time-and-vanishing-gradients/
  • http://colah.github.io/posts/2015-08-Understanding-LSTMs/

Additional resource about LSTM: "Anyone Can Learn To Code an LSTM-RNN in Python"
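
For anyone who wants to see the mechanics before working through the resources above, here is a minimal sketch of a single LSTM step in plain numpy. The gate ordering, weight names, and sizes are illustrative choices, not taken from any of the linked posts.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, b):
        """One LSTM step. W stacks the four gate weight matrices,
        shape (4*H, D+H); b has shape (4*H,)."""
        H = h_prev.shape[0]
        z = W @ np.concatenate([x, h_prev]) + b   # all gates at once
        i = sigmoid(z[0:H])        # input gate
        f = sigmoid(z[H:2*H])      # forget gate
        o = sigmoid(z[2*H:3*H])    # output gate
        g = np.tanh(z[3*H:4*H])    # candidate cell state
        c = f * c_prev + i * g     # new cell state
        h = o * np.tanh(c)         # new hidden state
        return h, c

    # toy usage: D=3 input features, H=2 hidden units, 5 time steps
    D, H = 3, 2
    rng = np.random.default_rng(0)
    W, b = rng.normal(scale=0.1, size=(4*H, D+H)), np.zeros(4*H)
    h, c = np.zeros(H), np.zeros(H)
    for x in rng.normal(size=(5, D)):
        h, c = lstm_step(x, h, c, W, b)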

Tuesday 17 November 2015

RNNs/LSTMs and ConvNets

More details on autoencoders for unsupervised pre-training (a toy sketch follows these links):

  • http://deeplearning.stanford.edu/wiki/index.php/Autoencoders_and_Sparsity
  • http://www.jmlr.org/papers/volume11/erhan10a/erhan10a.pdf
  • http://www.slideshare.net/billlangjun/simple-introduction-to-autoencoder
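
To make the idea concrete, below is a toy numpy sketch of a single-hidden-layer autoencoder with tied weights trained by gradient descent on random data; the architecture, names, and hyperparameters are assumptions for illustration, not from the linked materials.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    X = rng.random((100, 8))                 # toy data in [0, 1]
    D, H, lr = 8, 3, 0.5
    W = rng.normal(scale=0.1, size=(D, H))   # tied: encoder W, decoder W.T
    b_enc, b_dec = np.zeros(H), np.zeros(D)

    for epoch in range(200):
        Z = sigmoid(X @ W + b_enc)           # encode to an H-dim code
        X_hat = sigmoid(Z @ W.T + b_dec)     # decode back to D dims
        err = X_hat - X                      # reconstruction error
        # backpropagate through decoder and encoder sigmoids
        d_dec = err * X_hat * (1 - X_hat)               # (N, D)
        d_enc = (d_dec @ W) * Z * (1 - Z)               # (N, H)
        W -= lr * (X.T @ d_enc + d_dec.T @ Z) / len(X)  # tied-weight gradient
        b_enc -= lr * d_enc.mean(axis=0)
        b_dec -= lr * d_dec.mean(axis=0)

    Z = sigmoid(X @ W + b_enc)
    print("final reconstruction MSE:",
          np.mean((sigmoid(Z @ W.T + b_dec) - X) ** 2))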

Tuesday 10 November 2015

Multi-Metric Optimization Using Ensemble Tuning. NAACL 2013. Video
Baskaran Sankaran, Anoop Sarkar and Kevin Duh

Tuesday 3 November 2015

NN tutorials by Quoc Le

Josiah's slides

Other resources (a small worked example follows this list):

  • Andrej Karpathy's notes
  • Different objective functions, multiclass problems
  • Gradient descent
  • Backpropagation
  • Discussion about different activation functions
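
To tie the gradient-descent and backpropagation links above together, here is a toy numpy example that trains a one-hidden-layer network on XOR by backpropagation; the network size, learning rate, and step count are arbitrary illustrative choices.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # toy problem: learn XOR with a 2-4-1 network
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(1)
    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
    lr = 0.5

    for step in range(5000):
        # forward pass
        h = np.tanh(X @ W1 + b1)             # hidden layer
        p = sigmoid(h @ W2 + b2)             # output probability
        # backward pass (cross-entropy loss + sigmoid output)
        d_out = p - y
        d_hid = (d_out @ W2.T) * (1 - h**2)  # tanh derivative
        # gradient-descent updates
        W2 -= lr * h.T @ d_out / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_hid / len(X)
        b1 -= lr * d_hid.mean(axis=0)

    print(np.round(p.ravel(), 2))  # should approach [0, 1, 1, 0]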

Tuesday 27 October 2015

Three blog posts introducing RNNs for language modelling, in equations and code (a toy forward step is sketched below)

It might help to read this NLP primer first.

Additional material:
a thorough explanation of backpropagation
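
As a taste of what those posts walk through, here is a hedged sketch of the forward step of a vanilla RNN language model over a toy vocabulary; the parameter names follow common tutorial convention and the sizes are made up for illustration.

    import numpy as np

    V, H = 5, 8   # toy vocabulary size and hidden size
    rng = np.random.default_rng(0)
    W_xh = rng.normal(scale=0.1, size=(H, V))  # input-to-hidden
    W_hh = rng.normal(scale=0.1, size=(H, H))  # hidden-to-hidden
    W_hy = rng.normal(scale=0.1, size=(V, H))  # hidden-to-output

    def rnn_lm_step(token_id, h_prev):
        """Consume one token id; return the new hidden state and a
        probability distribution over the next token."""
        x = np.zeros(V)
        x[token_id] = 1.0                          # one-hot input
        h = np.tanh(W_xh @ x + W_hh @ h_prev)      # recurrent update
        logits = W_hy @ h
        probs = np.exp(logits - logits.max())      # stable softmax
        return h, probs / probs.sum()

    h = np.zeros(H)
    for tok in [0, 3, 1, 4]:                       # toy token sequence
        h, next_probs = rnn_lm_step(tok, h)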

Tuesday 20 October 2015

Teaching Machines to Read and Comprehend. NIPS 2015.
Karl Moritz Hermann, Tomáš Kočiský, Edward Grefenstette, Lasse Espeholt, Will Kay, Mustafa Suleyman, Phil Blunsom

Slides (presented at LXMLS)

Background reading:

  • Understanding LSTMs
  • NAACL 2013 Tutorial "Deep Learning without Magic"
  • EMNLP 2014 Tutorial "Embedding Methods for NLP"

Related Work:

  • Entailment with Neural Attention (better description of attention models than in the NIPS paper in my opinion)
  • Memory Networks

Tuesday 13 October 2015

A Large Annotated Corpus for Learning Natural Language Inference. Proceedings of EMNLP 2015.
Samuel R. Bowman, Gabor Angeli, Christopher Potts, and Christopher D. Manning

It would be worth comparing this to work on (multilingual) textual similarity.