Sequence Learning – Deprecated!
Elective for CS grad students at the Technical University of Applied Sciences Nuremberg.
⚠ Deprecated ⚠
Go to https://seqlrn.github.io for the most recent version!
⚠ Deprecated ⚠
Class Schedule and Credits
Time and Location: Mondays at 9.45 (online, Zoom link on Moodle)
Announcements and Discussions: Moodle Course #5312
Teams for discussion around assignments: 4fbxju8
Format
Each week, we will discuss algorithms and their theory before implementing them to get a better hands-on understanding.
The materials consist of a mix of required and recommended readings, slides, and a set of programming assignments.
These assignments are mandatory and must be completed in Python 3.
Pair-programming encouraged, BYOD strongly recommended!
Credits
Credits are earned through two components:
- All six assignments (dynamic programming, Markov chains, hidden Markov models, recurrent neural networks, attention, transformers) must be completed throughout the semester; assignments are pass/fail, pair programming encouraged (i.e. you can submit as teams of two).
- Oral exam (20’) covering theory and assignments (graded; individual exams).
Note: Materials will be (mostly) in English; the lectures/tutorials will be taught in German unless English speakers are present; the oral exam is in the language of your choice.
Important Dates
TBA
Recommended Textbooks
- Chao, K.-M. and Zhang, L.: Sequence Comparison (Springer). Available online through the Ohm Library.
- Sun, R., Giles, L. and van Leeuwen, J.: Sequence Learning: Paradigms, Algorithms and Applications (Springer). Available online through the Ohm Library.
- Niemann, H.: Klassifikation von Mustern. 2nd revised edition, 2003. (available online)
- Huang, X., Acero, A. and Hon, H.-W.: Spoken Language Processing: A Guide to Theory, Algorithm and System Development. (ISBN-13: 978-0130226167)
- Jurafsky, D. and Martin, J.: Speech and Language Processing. 2017. (available online)
- Manning, C., Raghavan, P. and Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, 2008. (available online)
- Goodfellow, I., Bengio, Y. and Courville, A.: Deep Learning. 2016. (available online)
Syllabus
The syllabus is currently being updated… (as of Jan 21, 2021)
- ✆ March 19: Introduction. (slides, exercise)
We’ll start with the general concepts of supervised vs. unsupervised learning and classification of independent observations vs. sequences of observations. To get you motivated, we’ll look at a list of recent “AI products” that utilize sequence learning.
- ✆ March 26: Comparing Sequences. (slides by Ben Langmead, exercise)
We’ll start with a classic implementation of auto-correcting misspelled words to bring dynamic programming back to memory. We’ll also look at how the computational and memory requirements scale.
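If you want to warm up before the exercise, here is a minimal sketch of the edit-distance DP behind such an auto-correct (the tiny vocabulary and the sample query are made up for illustration, not the exercise's data):

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming (Levenshtein) distance between two strings."""
    # dp[i][j] = minimal number of edits to turn a[:i] into b[:j]
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        dp[i][0] = i                # delete all remaining characters of a
    for j in range(len(b) + 1):
        dp[0][j] = j                # insert all characters of b
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # substitution / match
    return dp[len(a)][len(b)]

# Toy auto-correct: suggest the dictionary word with the smallest distance.
vocabulary = ["sequence", "sequences", "science", "silence"]        # illustrative only
print(min(vocabulary, key=lambda w: edit_distance("seqence", w)))   # -> "sequence"
```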
- ✆ April 2: States and Cost Functions. (slides, exercise)
Understand how DP can be used on an abstraction of distances and states. We’ll build a smarter, keyboard-layout-aware auto-correct and start looking into some applications in signal processing (isolated word and DTMF sequence classification).
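A rough sketch of how the substitution cost can be made layout-aware (the neighbour table below is a made-up fragment of a keyboard layout, not the exercise's actual cost model):

```python
# Illustrative fragment of a keyboard-neighbourhood table; the real exercise
# derives substitution costs from the full keyboard layout.
NEIGHBOURS = {"a": "qwsy", "s": "awedxy", "d": "serfcx", "e": "wsdr"}

def sub_cost(x: str, y: str) -> float:
    if x == y:
        return 0.0
    return 0.5 if y in NEIGHBOURS.get(x, "") else 1.0   # cheaper for adjacent keys

def weighted_edit_distance(a: str, b: str) -> float:
    dp = [[0.0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        dp[i][0] = float(i)
    for j in range(1, len(b) + 1):
        dp[0][j] = float(j)
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            dp[i][j] = min(dp[i - 1][j] + 1.0,
                           dp[i][j - 1] + 1.0,
                           dp[i - 1][j - 1] + sub_cost(a[i - 1], b[j - 1]))
    return dp[len(a)][len(b)]

print(weighted_edit_distance("dand", "sand"))   # "d" sits next to "s": a cheap typo
```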
April 9: Maundy Thursday (Gründonnerstag)
- ✆ April 16: Modeling Sequences. (slides, exercise)
Learn about n-grams, a simple yet effective approach to learning contexts of discrete symbols. We’ll use n-grams to improve our auto-correct by incorporating context and suggesting following words.
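As a quick illustration of the counting idea (a toy corpus, no smoothing; the assignment works with real text):

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus; the assignment uses a real text corpus.
corpus = "the cat sat on the mat the cat ate".split()

bigram_counts = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    bigram_counts[prev][word] += 1

def next_word(prev: str) -> str:
    """Suggest the most likely following word under the bigram model."""
    return bigram_counts[prev].most_common(1)[0][0]

print(next_word("the"))   # -> "cat" (seen twice after "the")
```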
- ✆ April 23: Hidden Markov Models. (slides, exercise)
We’ll take a close look at hidden Markov models and how to (efficiently) evaluate and train them. The Viterbi decoding algorithm gives us the likelihood of the best state sequence and the path that leads to it. We’ll use HMMs to build a proof-of-concept isolated word recognizer.
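A minimal Viterbi sketch over a toy two-state model (all probabilities and observation symbols are invented for illustration; the exercise works with real feature sequences):

```python
import numpy as np

# Toy HMM: 2 hidden states, 3 observation symbols (all numbers are invented).
pi = np.array([0.6, 0.4])                      # initial state probabilities
A  = np.array([[0.7, 0.3],                     # transition probabilities
               [0.4, 0.6]])
B  = np.array([[0.5, 0.4, 0.1],                # emission probabilities per state
               [0.1, 0.3, 0.6]])

def viterbi(obs):
    """Return the most likely state sequence (log domain for numerical stability)."""
    T, N = len(obs), len(pi)
    delta = np.full((T, N), -np.inf)           # best log-prob ending in state j at time t
    psi = np.zeros((T, N), dtype=int)          # backpointers
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] + np.log(A[:, j])
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] + np.log(B[j, obs[t]])
    # Backtrack from the best final state.
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

print(viterbi([0, 1, 2]))   # -> [0, 0, 1] for this toy model
```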
- ✆ April 30: Higher-Level Sequence Modeling with HMM. (slides, exercise)
Learn how to model complex sequences of arbitrary length that prohibit explicit modeling, such as speech recognition or choreographies in sports. Here we will combine what we’ve discussed so far: prefix trees, n-gram models and efficient search.
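For reference, a minimal prefix-tree (trie) sketch; the exercise combines such a structure with n-gram scores and an efficient search over a much larger vocabulary:

```python
# Minimal trie; the stored words below are only an illustrative vocabulary.
class TrieNode:
    def __init__(self):
        self.children = {}
        self.is_word = False

def insert(root, word):
    node = root
    for ch in word:
        node = node.children.setdefault(ch, TrieNode())
    node.is_word = True

def completions(root, prefix):
    """All stored words that start with the given prefix."""
    node = root
    for ch in prefix:
        if ch not in node.children:
            return []
        node = node.children[ch]
    found, stack = [], [(node, prefix)]
    while stack:
        node, word = stack.pop()
        if node.is_word:
            found.append(word)
        stack.extend((child, word + ch) for ch, child in node.children.items())
    return found

root = TrieNode()
for w in ["sequence", "sequential", "search"]:
    insert(root, w)
print(completions(root, "seq"))   # ['sequence', 'sequential'] (in any order)
```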
- ✆ May 7: Feed-Forward Neural Networks. (slides perceptron and nnets, fizzbuzz.py, exercise).
A brief introduction to neural networks: fundamentals, topologies and training. We’ll skip implementing the details and use PyTorch for the examples. Did you know that you could program fizzbuzz as a neural network? Please have Python with NumPy and PyTorch installed and operational on your machine!
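If you want to check your installation and try the fizzbuzz idea beforehand, here is a rough sketch (the binary input encoding, layer sizes and training schedule are arbitrary choices, not the contents of fizzbuzz.py):

```python
import torch
import torch.nn as nn

def encode(n: int, bits: int = 10) -> torch.Tensor:
    """Binary encoding of an integer as the network input."""
    return torch.tensor([(n >> i) & 1 for i in range(bits)], dtype=torch.float32)

def label(n: int) -> int:
    """0: number, 1: 'fizz', 2: 'buzz', 3: 'fizzbuzz'."""
    return (n % 3 == 0) + 2 * (n % 5 == 0)

# Simple feed-forward classifier; the layer sizes are illustrative choices.
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 4))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train on numbers 101..1023 so the small numbers stay unseen.
xs = torch.stack([encode(n) for n in range(101, 1024)])
ys = torch.tensor([label(n) for n in range(101, 1024)])

for _ in range(200):                        # a few full-batch training steps
    optimizer.zero_grad()
    loss = loss_fn(model(xs), ys)
    loss.backward()
    optimizer.step()

print(model(encode(15).unsqueeze(0)).argmax().item())   # ideally 3 ('fizzbuzz')
```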
- ✆ May 14: Recurrent Neural Networks. (slides cs231n: RNNs, exercise)
Recurrent neural networks use feedback loops to introduce temporal context or “memory” into the network. Attention is a modeling concept that lets the network learn an even better understanding of the context. We’ll study both using two examples: language modeling and sentiment analysis.
Also: Introduction to the class project!
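Coming back to the sentiment-analysis example above, a minimal sketch with an LSTM (vocabulary size, dimensions and the toy batch are invented for illustration):

```python
import torch
import torch.nn as nn

class SentimentRNN(nn.Module):
    """Embed tokens, run an LSTM over the sequence, classify from the last hidden state."""
    def __init__(self, vocab_size=1000, embed_dim=32, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):              # tokens: (batch, seq_len) of word ids
        emb = self.embed(tokens)            # (batch, seq_len, embed_dim)
        _, (h, _) = self.rnn(emb)           # h: (1, batch, hidden_dim)
        return self.out(h[-1])              # logits: (batch, num_classes)

model = SentimentRNN()
toy_batch = torch.tensor([[12, 7, 256, 3]])  # made-up word ids for one sentence
print(model(toy_batch).shape)                # torch.Size([1, 2])
```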
May 21: Ascension Day (Christi Himmelfahrt)
⚠ May 28: Project proposals due! ⚠
- ✆ May 28: Embeddings and Sequence-to-Sequence Learning (embeddings, s2s, literature and exercise)
Previous algorithms explicitly modeled the sequence, either as a graph-like structure such as an HMM or by concatenating observations into a single data point. Embeddings are learned feature representations that can incorporate large quantities of unlabeled data. Encoder-decoder networks are a special kind of recurrent network topology that can model sequence-to-sequence mappings, as found in end-to-end speech recognition, machine translation or automatic summarization – without explicitly modeling states! We’ll also look at transformers, which capture temporal structure without recurrence.
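A rough sketch of the encoder-decoder idea in PyTorch (the vocabulary sizes, dimensions and the choice of a GRU are illustrative assumptions, not the exercise's setup):

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Encoder reads the source sequence into a context vector; the decoder
    generates the target sequence conditioned on that context."""
    def __init__(self, src_vocab=100, tgt_vocab=100, dim=64):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src, tgt):
        _, context = self.encoder(self.src_embed(src))    # last encoder hidden state
        dec_out, _ = self.decoder(self.tgt_embed(tgt), context)
        return self.out(dec_out)                          # per-step target logits

model = TinySeq2Seq()
src = torch.randint(0, 100, (1, 6))   # one source sequence of 6 made-up token ids
tgt = torch.randint(0, 100, (1, 5))   # teacher-forced target prefix
print(model(src, tgt).shape)          # torch.Size([1, 5, 100])
```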
- ✆ June 4: Project Check-In 1
No class; teams will meet individually with the instructor to discuss their projects. Plan for a 20-minute discussion and bring 5 slides: rough outline of the related work section (3 slides), baseline results (1) and experiments outline (1).
Book your time slot: DFN Terminplaner
June 11: Corpus Christi (Fronleichnam)
- ✆ June 18: Project Check-In 2
No class; teams will meet individually with the instructor to discuss their projects. Plan for 20 minutes to talk about your implementation and experiments, and a rough outline (bullet points) of the method and experiments sections (slides as needed).
Book your time slot: DFN Terminplaner
⚠ June 25: Papers due! ⚠
- ✆ June 25: How-To Peer Review, Sequence Kernels (slides, SVM slides, seq. kernels, assignment)
In many cases, classifying a sequence into a discrete class does not quite work with recurrent networks. We’ll learn about support vector machines, sequence kernels and methods to map sequences into a single observation of a continuous space.
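One simple way to map a variable-length sequence to a single fixed-length observation, sketched with scikit-learn (the character-3-gram features and toy labels are illustrative; the assignment's kernels differ):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import SVC

# Toy data: classify strings as "DNA-like" or not; purely illustrative.
sequences = ["acgtacgt", "ttgacaca", "hello world", "sequence learning"]
labels = [1, 1, 0, 0]

# Map each variable-length sequence to a fixed-length vector of character-3-gram counts.
vectorizer = CountVectorizer(analyzer="char", ngram_range=(3, 3))
X = vectorizer.fit_transform(sequences)

clf = SVC(kernel="linear").fit(X, labels)
print(clf.predict(vectorizer.transform(["gattaca"])))
```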
⚠ July 2: Reviews due! ⚠
- ✆ July 2: Machine Learning in Production (slides)
We’ll look at architectural challenges when training and deploying machine learning models for production.
⚠ July 9: Projects due! ⚠
- ✆ July 15-16: Project Colloquium
No class; teams will meet individually with the instructor to present and discuss their paper and code. Plan for 20 minutes total to discuss: data usage, baseline, method, experiments and conclusions (slides to support the colloquium, not to present).
Timeslots will be coordinated in Project Check-In #2.
Subscribe to https://github.com/sikoried/sequence-learning/ to follow updates.