Beyond Chatbots: Concepts, Code, and Practical Use Cases for Transformers that Really Get Work Done
Thursday, 24 April 2025 | 9:00 – 17:00
Description
Although most people are familiar with generative AI tools built on Decoder-Only Transformers, like ChatGPT, Encoder-Only Transformers, like BERT, are often more useful in practice. Encoder-Only Transformers are the backbone of RAG (retrieval-augmented generation), sentiment analysis, classification, and clustering. In this workshop, we'll cover the main ideas behind the different types of Transformers, learn how to code various components from scratch in PyTorch, and then use pipelines to illustrate practical use cases for Encoder-Only Transformers.
Topics
A step-by-step overview of how Encoder-Only Transformers (like BERT) work, and how they compare and contrast with Decoder-Only Transformers (like ChatGPT). This includes learning about Word Embedding, Positional Encoding, and Attention.
Walk through the PyTorch code for Word Embedding.
Use Word Embedding to generate new music playlists based on your favorite songs.
Walk through the PyTorch code for Positional Encoding.
Walk through the PyTorch code for Attention.
Use RAG (retrieval augmented generation) to ask questions about your favorite Wikipedia article.
Use pre-trained Encoder- and Decoder-Only Transformers to cluster 45,000 documents and give each cluster a label.
Learn how we can have confidence in the output from neural networks in general.
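To give a flavour of the code we will walk through, here is a minimal, self-contained PyTorch sketch of the three components named above: Word Embedding, sinusoidal Positional Encoding, and scaled dot-product Attention. The dimensions and variable names are illustrative only, not the workshop's actual code.

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

vocab_size, d_model, seq_len = 10, 4, 3

# Word Embedding: map each token id to a learned dense vector.
embedding = nn.Embedding(vocab_size, d_model)
tokens = torch.tensor([1, 5, 7])            # a toy 3-token sequence
x = embedding(tokens)                       # shape: (3, 4)

# Positional Encoding: sinusoidal, as in "Attention Is All You Need".
pos = torch.arange(seq_len).unsqueeze(1).float()
div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
pe = torch.zeros(seq_len, d_model)
pe[:, 0::2] = torch.sin(pos * div)          # even dimensions get sine
pe[:, 1::2] = torch.cos(pos * div)          # odd dimensions get cosine
x = x + pe                                  # inject word order into the embeddings

# Self-Attention: scaled dot-product with learned query/key/value projections.
W_q, W_k, W_v = (nn.Linear(d_model, d_model, bias=False) for _ in range(3))
q, k, v = W_q(x), W_k(x), W_v(x)
scores = q @ k.transpose(0, 1) / math.sqrt(d_model)  # (3, 3) similarity scores
weights = torch.softmax(scores, dim=-1)              # each row sums to 1
out = weights @ v                                    # context-aware vectors, (3, 4)
print(out.shape)  # torch.Size([3, 4])
```

Each output row is a mixture of all three value vectors, weighted by how strongly the tokens attend to one another; stacking this pattern with feed-forward layers is essentially what BERT does.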
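The playlist and RAG exercises both rest on the same idea: retrieve the items whose embeddings are most similar to a query. A hedged sketch of that retrieval step, using random stand-in vectors (in the workshop, these would come from a trained embedding layer or a pre-trained model):

```python
import torch

torch.manual_seed(1)

# Toy catalogue: pretend each song already has a learned embedding vector.
# The vectors below are random stand-ins so the sketch is self-contained.
songs = ["Song A", "Song B", "Song C", "Song D"]
song_vecs = torch.randn(4, 8)

def most_similar(query_vec, vecs, k=2):
    """Return indices of the k vectors closest to query_vec by cosine similarity."""
    sims = torch.nn.functional.cosine_similarity(query_vec.unsqueeze(0), vecs)
    return sims.topk(k).indices.tolist()

favorite = song_vecs[0]                     # pick "Song A" as the seed song
picks = most_similar(favorite, song_vecs, k=2)
print([songs[i] for i in picks])            # "Song A" itself ranks first
```

Swap songs for Wikipedia passages and feed the retrieved text to a generative model, and you have the core of RAG.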
Requirements
Basic coding skills in Python.
Target audience
Anyone who wants to learn how artificial intelligence (AI) like ChatGPT works. No previous experience with AI is necessary.