Saturday, April 21, 2018

Lecture 14 (20/04/2018): syntactic parsing (2/2)

The Earley algorithm. Probabilistic CFGs. Probabilistic parsing. Neural transition-based dependency parsing with LSTMs; arc-factored dependency parsing.
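Since the Earley algorithm is central to this lecture, here is a minimal recognizer sketch in Python; the toy grammar and all names are illustrative, not taken from the lecture materials.

```python
# A minimal Earley recognizer sketch; the grammar maps a non-terminal
# to a list of right-hand sides (tuples), and terminals are any symbols
# not appearing as keys. Toy example only.

def earley_recognize(words, grammar, start="S"):
    # A chart state is (lhs, rhs, dot position, origin column).
    chart = [set() for _ in range(len(words) + 1)]
    for rhs in grammar[start]:
        chart[0].add((start, rhs, 0, 0))
    for i in range(len(words) + 1):
        added = True
        while added:  # iterate until column i stops growing
            added = False
            for lhs, rhs, dot, origin in list(chart[i]):
                if dot < len(rhs) and rhs[dot] in grammar:
                    # PREDICT: expand the upcoming non-terminal
                    for prod in grammar[rhs[dot]]:
                        new = (rhs[dot], prod, 0, i)
                        if new not in chart[i]:
                            chart[i].add(new); added = True
                elif dot < len(rhs):
                    # SCAN: match the next input word against a terminal
                    if i < len(words) and words[i] == rhs[dot]:
                        chart[i + 1].add((lhs, rhs, dot + 1, origin))
                else:
                    # COMPLETE: advance states waiting for this lhs
                    for l2, r2, d2, o2 in list(chart[origin]):
                        if d2 < len(r2) and r2[d2] == lhs:
                            new = (l2, r2, d2 + 1, o2)
                            if new not in chart[i]:
                                chart[i].add(new); added = True
    return any((start, rhs, len(rhs), 0) in chart[len(words)]
               for rhs in grammar[start])

grammar = {"S": [("NP", "VP")], "NP": [("she",), ("book",)],
           "VP": [("reads", "NP"), ("reads",)]}
print(earley_recognize(["she", "reads", "book"], grammar))  # True
```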

Lecture 13 (19/04/2018): syntactic parsing (1/2)

Introduction to syntax. Context-free grammars and languages. Treebanks. Normal forms. Dependency grammars. Syntactic parsing: top-down and bottom-up. Structural ambiguity. Backtracking vs. dynamic programming for parsing. The CKY algorithm.
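As a companion to the CKY discussion, a minimal recognizer sketch for a grammar in Chomsky Normal Form; the toy rules are illustrative, not from the lecture.

```python
from collections import defaultdict

# A minimal CKY recognizer sketch for a CNF grammar, stored as two
# illustrative rule lists: binary rules A -> B C and lexical rules A -> word.

def cky_recognize(words, binary_rules, lexical_rules, start="S"):
    n = len(words)
    table = defaultdict(set)  # table[(i, j)] = non-terminals spanning words[i:j]
    for i, w in enumerate(words):
        table[(i, i + 1)] = {a for a, word in lexical_rules if word == w}
    for span in range(2, n + 1):              # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):         # try every split point
                for a, b, c in binary_rules:
                    if b in table[(i, k)] and c in table[(k, j)]:
                        table[(i, j)].add(a)
    return start in table[(0, n)]

binary_rules = [("S", "NP", "VP"), ("VP", "V", "NP")]
lexical_rules = [("NP", "she"), ("V", "reads"), ("NP", "books")]
print(cky_recognize(["she", "reads", "books"], binary_rules, lexical_rules))  # True
```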

Friday, April 6, 2018

Lecture 10 (06/04/2018): Recurrent Neural Networks and LSTM; POS tagging with LSTMs in TensorFlow

Introduction to Recurrent Neural Networks (RNNs): definitions and configurations. Simple RNN, CBOW as RNN, gated architectures, Long Short-Term Memory networks (LSTMs). Neural POS tagging with LSTMs in TensorFlow.
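A minimal sketch of an LSTM POS tagger; it uses the current tf.keras API rather than the 2018-era low-level TensorFlow code shown in class, and all sizes below are illustrative.

```python
import tensorflow as tf

# Illustrative sizes, not from the lecture.
VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM, NUM_TAGS = 10000, 100, 128, 17

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True),
    tf.keras.layers.LSTM(HIDDEN_DIM, return_sequences=True),  # one hidden state per token
    tf.keras.layers.Dense(NUM_TAGS, activation="softmax"),    # tag distribution per token
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(padded_word_ids, padded_tag_ids, epochs=3)  # ids built from a treebank
```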

Lecture 9 (05/04/2018): part-of-speech tagging

Stochastic part-of-speech tagging. Hidden Markov models. Deleted interpolation. Linear and logistic regression: Maximum Entropy models. Transformation-based POS tagging. Handling out-of-vocabulary words. The Stanford POS tagger.
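HMM taggers are standardly decoded with the Viterbi algorithm; a minimal sketch follows, with toy probabilities that are illustrative rather than taken from the lecture.

```python
import math

def viterbi(words, tags, trans, emit, start):
    # trans[(t_prev, t)] = P(t | t_prev), emit[(t, w)] = P(w | t), start[t] = P(t | <s>)
    V = [{t: math.log(start.get(t, 1e-12)) + math.log(emit.get((t, words[0]), 1e-12))
          for t in tags}]
    back = []
    for w in words[1:]:
        scores, ptr = {}, {}
        for t in tags:
            best = max(tags, key=lambda p: V[-1][p] + math.log(trans.get((p, t), 1e-12)))
            scores[t] = (V[-1][best] + math.log(trans.get((best, t), 1e-12))
                         + math.log(emit.get((t, w), 1e-12)))
            ptr[t] = best
        V.append(scores); back.append(ptr)
    path = [max(tags, key=lambda t: V[-1][t])]
    for ptr in reversed(back):        # follow backpointers to recover the best path
        path.append(ptr[path[-1]])
    return list(reversed(path))

tags = ["DET", "NOUN", "VERB"]
start = {"DET": 0.6, "NOUN": 0.3, "VERB": 0.1}
trans = {("DET", "NOUN"): 0.9, ("NOUN", "VERB"): 0.8, ("VERB", "DET"): 0.5}
emit = {("DET", "the"): 0.7, ("NOUN", "dog"): 0.4, ("VERB", "barks"): 0.3}
print(viterbi(["the", "dog", "barks"], tags, trans, emit, start))
# ['DET', 'NOUN', 'VERB']
```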



Friday, March 23, 2018

Lecture 8 (23/03/2018): more on word2vec; smoothing for language modeling; introduction to part-of-speech tagging

Use of word embeddings produced with word2vec. Smoothing techniques for probabilistic language modeling. Introduction to part-of-speech tagging: word classes; universal tag set.
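One of the simplest smoothing techniques for this setting is add-k (Laplace) smoothing; a minimal sketch with an illustrative corpus and k value (other techniques, e.g. interpolation and backoff, start from the same counts).

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()  # illustrative toy text
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)  # vocabulary size

def p_add_k(w_prev, w, k=1.0):
    # P(w | w_prev) = (count(w_prev, w) + k) / (count(w_prev) + k * V)
    return (bigrams[(w_prev, w)] + k) / (unigrams[w_prev] + k * V)

print(p_add_k("the", "cat"))  # seen bigram: discounted MLE estimate
print(p_add_k("the", "dog"))  # unseen bigram: small but non-zero
```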


Lecture 7 (22/03/2018): word2vec and its implementation

Word2vec: CBOW and skipgram; explanation, derivation of the loss function and implementation in TensorFlow.
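For one (center, context) pair, the skipgram loss is the cross-entropy of the true context word under a softmax over the vocabulary; a minimal numpy sketch, with illustrative dimensions and initialization (the lecture's TensorFlow version optimizes the same objective).

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 8, 4                                  # vocabulary size, embedding dimension
W_in = rng.normal(scale=0.1, size=(V, d))    # center-word embeddings
W_out = rng.normal(scale=0.1, size=(V, d))   # context-word embeddings

def skipgram_loss(center, context):
    v_c = W_in[center]                 # embedding of the center word
    scores = W_out @ v_c               # one score per vocabulary word
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()               # softmax over the vocabulary
    return -np.log(probs[context])     # cross-entropy for the true context word

print(skipgram_loss(center=2, context=5))
```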


Friday, March 16, 2018

Lecture 6 (16/03/2018): deep learning and word embeddings (1)

Introduction to neural networks. The perceptron. Neural units. Activation functions. MaxEnt and softmax. Word embeddings: rationale and word2vec. CBOW and skipgram. Homework 1 assignment!
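A minimal sketch of a single neural unit and the softmax function; the weights and inputs are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

x = np.array([1.0, 2.0, 0.5])             # input features
w, b = np.array([0.4, -0.2, 0.1]), 0.05   # unit weights and bias
print(sigmoid(w @ x + b))                 # one unit: weighted sum + nonlinearity
print(softmax(np.array([2.0, 1.0, 0.1]))) # probabilities over 3 classes
```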


Lecture 5 (15/03/2018): language modeling

We introduced N-gram models (unigrams, bigrams, trigrams), together with their probability modeling and issues. We discussed perplexity and its close relationship with entropy, and we introduced smoothing.
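A minimal sketch of a bigram model with maximum-likelihood estimates and perplexity, on an illustrative toy text; note that any unseen bigram would zero out the product, which is exactly the issue smoothing addresses.

```python
import math
from collections import Counter

train = "<s> the cat sat </s> <s> the cat ran </s>".split()  # toy training text
unigrams = Counter(train)
bigrams = Counter(zip(train, train[1:]))

def p_mle(w_prev, w):
    # Maximum-likelihood estimate P(w | w_prev) = count(w_prev, w) / count(w_prev)
    return bigrams[(w_prev, w)] / unigrams[w_prev]

test = "<s> the cat sat </s>".split()
log_prob = sum(math.log(p_mle(p, w)) for p, w in zip(test, test[1:]))
print(math.exp(-log_prob / (len(test) - 1)))  # perplexity of the test sequence
```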