Reading Club session 17 December 2020

We returned to a topic that has been the subject of several previous sessions: NLP and word embeddings. This time we discussed "Deep contextualized word representations" by Peters et al., the ELMo paper. We covered the historical context in which ELMo became popular: as a bidirectional language model it produces a different vector for a word depending on the sentence it appears in, whereas Word2vec assigns each word a single static vector and so cannot take word order or context into account.
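
To make that contrast concrete, here is a minimal sketch that embeds the same word in two different sentences using the ELMo module from TF Hub and compares the resulting vectors. The module URL and version, the "default" signature, the "elmo" output key, and the token positions of "bank" are assumptions based on the publicly documented google/elmo module; this is an illustration, not code from the session.

    import tensorflow as tf
    import tensorflow_hub as hub

    # Assumed module URL/version: the standard google/elmo module on TF Hub.
    # signature/output_key select the per-token contextual embeddings.
    elmo = hub.KerasLayer("https://tfhub.dev/google/elmo/3",
                          signature="default", output_key="elmo")

    sentences = tf.constant([
        "I deposited cash at the bank",
        "We sat on the bank of the river",
    ])
    vectors = elmo(sentences)  # shape: (2, max_tokens, 1024)

    # "bank" is token 5 in the first sentence and token 4 in the second
    # (the module splits sentences on whitespace).
    bank_finance = vectors[0, 5]
    bank_river = vectors[1, 4]

    # Cosine similarity below 1.0: unlike a static Word2vec lookup, the two
    # occurrences of "bank" receive different contextual vectors.
    cos = tf.reduce_sum(bank_finance * bank_river) / (
        tf.norm(bank_finance) * tf.norm(bank_river))
    print(float(cos))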

For the second half of the session, we went over a tutorial on using pretrained models from TensorFlow Hub, in particular the BERT language model.
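
For reference, here is a minimal sketch in the spirit of that tutorial: the matching preprocessing model and BERT encoder are loaded as hub.KerasLayer objects and applied to raw strings. The specific module URLs and versions are assumptions (the standard bert_en_uncased modules on TF Hub), not necessarily the ones the tutorial used.

    import tensorflow as tf
    import tensorflow_hub as hub
    import tensorflow_text  # registers the ops the preprocessing model needs

    # Assumed module URLs/versions: the standard BERT-base English modules.
    preprocess = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
    encoder = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4",
        trainable=False)  # set True to fine-tune the encoder

    sentences = tf.constant(["We went over a TensorFlow Hub tutorial."])

    # Tokenize, add special tokens, and pad to the model's sequence length.
    encoder_inputs = preprocess(sentences)
    outputs = encoder(encoder_inputs)

    # pooled_output: one vector per sentence; sequence_output: one per token.
    print(outputs["pooled_output"].shape)    # (1, 768)
    print(outputs["sequence_output"].shape)  # (1, 128, 768)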