Zero-Shot Learning
Definition
A model's ability to perform a task it was not explicitly trained on, using only a natural language description of the task.
Zero-shot learning leverages the general knowledge encoded in large language models to perform tasks without any task-specific training examples. The model receives only a description of what to do and applies its pre-trained knowledge to produce output.
This capability is fundamental to how Ummless uses language models for text refinement: the model has not been fine-tuned for each refinement preset, but can follow the preset's natural language instructions to transform text in the desired way. Zero-shot performance generally improves with model size and training data quality.
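In practice, a zero-shot prompt contains only the task instruction and the input text, with no worked examples. A minimal sketch of how a preset's instruction might be composed into such a prompt (the preset wording, function name, and prompt layout here are illustrative assumptions, not Ummless's actual format):

```python
def build_zero_shot_prompt(instruction: str, text: str) -> str:
    """Compose a zero-shot prompt: a task description plus the input,
    with no demonstration examples."""
    return (
        f"{instruction}\n\n"
        f"Text:\n{text}\n\n"
        "Refined text:"
    )

# Hypothetical preset instruction for illustration only.
preset = "Remove filler words (um, uh, like) and tighten the phrasing."
prompt = build_zero_shot_prompt(preset, "So, um, I think we should, like, ship it.")
print(prompt)
```

Because the prompt carries the full task description, swapping presets requires only changing the instruction string; no retraining or examples are involved.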