This course will teach you the theory behind large language models (LLMs) and give you the tools to implement LLMs in your applications.
Dr. Bruno Yun
Send your queries by email. If you would like a face-to-face meeting, email me to arrange one.
This part of the course comprises 18 hours of lectures and 12 hours of practicals.
The teaching timetable is organised into the following sessions.
Part 1: A close look at the transformer architecture (17th of November 2025)
Practical 0: Background on PyTorch (if needed)
Practical 1: Tokenizers, embeddings, and multi-head attention from scratch
Continuous assessment 1 (written test, 30%) → Deadline: 30th of November 2025, 23:59
Part 2: A short history of large language models (24th of November 2025)
Part 3: Pre-training and post-training (1st of December 2025)
Part 4: Practical techniques and tools for large language models (8th of December 2025)
Retrieval Augmented Generation
Useful tools for large language models
Continuous assessment 2 (oral presentation, 30%) → Deadline: 11th of January 2026, 23:59
Part 5: Evaluation and ethical questions of large language models (15th of December 2025)
Practical 3: Fine-tuning GPT-2 and simple LLM-based evaluation
Part 6: Exciting research on large language models (5th of January 2026)
Part 7: Additional practicals and project support (12th & 19th of January 2026)
Practical 4: Building applications with LLMs
Continuous assessment 3 (code and report, 40%) → Deadline: 25th of January 2026, 23:59