# How I Trained a High-Performance Coding Model on a Single GPU

Meet Anni: a 14B-parameter coding LLM built on a student budget. Using progressive training and data distillation on a single GPU, we overcame hardware limits to achieve SOTA-tier efficiency and performance.