Natural language processing has undergone a series of innovations in recent years. Deep learning has achieved state-of-the-art results on many tasks and lowered the entry barrier for newcomers. This course focuses on modern word and meaning representations and compares them to classic linguistic approaches. We will cover statistical learning and language modeling, and demonstrate some of these methods' weak points.
The target audience of this course is data scientists and software developers who are proficient in other fields of machine learning and would like a crash course on natural language processing applications.
Familiarity with Machine Learning
Graduates of this course will be able to:
Model conversational interaction as a machine learning problem
Identify the topic of a document programmatically
Develop algorithms to augment and classify human-generated texts
Introduction to Natural Language Processing
Intro to Linguistics
Phonemes / Morphemes / Language Trees
Syntactic / Semantic Representation
Natural Language Tasks
Statistical Modeling of Language
The Bag-of-Words Model
Hypothesis Testing on Words and Sentences
Bayesian Model of a Language
Document / Sentence Representation
Syntax and Semantics
Intro to Parse Trees
Word2vec and extensions
Deep Learning for Natural Language Processing
Recurrent Neural Networks
Transformers and Intro to BERT
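As a small taste of the bag-of-words model listed in the syllabus, here is a minimal illustrative sketch (the function name and the naive whitespace tokenization are our own choices, not course material):

```python
# Illustrative sketch of the bag-of-words model: represent a document
# as word counts, discarding word order entirely.
from collections import Counter

def bag_of_words(text):
    """Map a document to a word-count representation (a Counter)."""
    tokens = text.lower().split()  # naive whitespace tokenization
    return Counter(tokens)

doc = "the cat sat on the mat"
print(bag_of_words(doc))
# Counter({'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1})
```

The course covers richer representations (word2vec, contextual embeddings) that address what this model throws away: word order and meaning.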
The course requires a minimum number of students to open.