This course will teach you the theory behind large language models (LLMs) and give you the tools to implement LLMs in your applications.
Dr. Bruno Yun
Send your queries by email. To arrange a face-to-face meeting, contact me by email.
This part of the course will have 18 hours of lectures and 12 hours of practicals.
The teaching timetable is organised as follows.
Part 1: A close look at the transformer architecture (9th of November 2026)
Continuous assessment 1 (written test, 30%) → Deadline 20th of November 2026 23:59
Part 2: A short history of large language models (20th of November 2026)
Part 3: Pre-training and Post-training (25th of November 2026)
Part 4: Practical techniques and tools for large language models (30th of November 2026)
Retrieval-augmented generation (RAG) and context engineering
Useful tools for large language models
Continuous assessment 2 (oral presentation, 30%) → Deadline 13th of December 2026 23:59
Part 5: Evaluation and ethical questions of large language models (1st of December 2026)
Part 6: Discussion of exciting research on large language models (2nd of December 2026)
Part 7: Additional practical, project support, and oral presentations (4th of December 2026)
Continuous assessment 3 (code and report, 40%) → Deadline 13th of December 2026 23:59
Practicals 1 (10th, 20th, 24th of November 2026, 2 sessions)
Practical 0: Background on PyTorch
Practical 1: Tokenizers, embeddings, and multi-head attention from scratch
Practicals 2 (24th and 27th of November 2026, 2 sessions)