(taught in English)
The course focuses on modern, statistical approaches to NLP. Natural language processing (NLP) is present today in a great many applications because people communicate almost everything in language: posts on social media, web search, advertising, emails and SMS, customer service exchanges, language translation, etc. While NLP relies heavily on machine learning and the use of large corpora, the peculiarities and diversity of language data call for dedicated models that process linguistic information efficiently and capture the underlying computational properties of natural languages. Moreover, NLP is a fast-evolving field, in which cutting-edge research can now reach large-scale applications within a couple of years. The course covers the use of large corpora and statistical models for acquisition, disambiguation, parsing, understanding and translation, and an important part is dedicated to deep-learning models for NLP.
– Introduction to NLP: the main tasks, issues and peculiarities
– Sequence tagging: models and applications
– Computational Semantics
– Syntax and Parsing
– Deep Learning for NLP: introduction and basics
– Deep Learning for NLP: advanced architectures
– Deep Learning for NLP: Machine translation, a case study
Bibliography, recommended readings:
– Costa-jussà, M. R., Allauzen, A., Barrault, L., Cho, K., & Schwenk, H. (2017). Introduction to the special issue on deep learning approaches for machine translation. Computer Speech & Language, 46, 367-373.
– Dan Jurafsky and James H. Martin. Speech and Language Processing (3rd ed. draft): https://web.stanford.edu/~jurafsky/slp3/
– Yoav Goldberg. A Primer on Neural Network Models for Natural Language Processing: http://u.cs.biu.ac.il/~yogo/nnlp.pdf
– Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning: http://www.deeplearningbook.org/