Capabilities that appear suddenly as language models reach certain sizes. A model might go from zero to strong performance on a task just by adding more parameters. Examples include arithmetic, code generation, and multi-step reasoning. Whether these are truly emergent or just hard to measure at small scales is debated.
Mathematical relationships showing how AI model performance improves predictably with more data, compute, and parameters.
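A common form of such a relationship is a power law in parameter count and training tokens, in the spirit of the Chinchilla fit. A minimal sketch, where the function name and the constants are illustrative placeholders rather than fitted values:

```python
def scaling_law_loss(n_params, n_tokens,
                     e=1.69, a=406.4, b=410.7,
                     alpha=0.34, beta=0.28):
    """Predicted loss L(N, D) = E + A / N^alpha + B / D^beta.

    E is the irreducible loss; the A and B terms shrink as the
    model (N parameters) and dataset (D tokens) grow.
    """
    return e + a / n_params**alpha + b / n_tokens**beta

# Loss falls predictably as either axis scales up:
small = scaling_law_loss(1e9, 1e10)    # 1B params, 10B tokens
large = scaling_law_loss(1e11, 1e12)   # 100B params, 1T tokens
```

The key property is predictability: with constants fitted on small runs, the same formula extrapolates to budgets far beyond anything trained so far, which is what makes it useful for planning compute.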
Capabilities that appear in AI models at scale without being explicitly trained for.
An AI model with billions of parameters trained on massive text datasets.
A mathematical function applied to a neuron's output that introduces non-linearity into the network.
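Two widely used activation functions, sketched as plain scalar functions for illustration:

```python
import math

def relu(x):
    """ReLU: passes positive inputs through, zeroes out negatives."""
    return max(0.0, x)

def sigmoid(x):
    """Sigmoid: squashes any real input into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Without a non-linearity like these, stacked layers collapse
# into a single linear map, so depth would add no expressive power.
```

For example, `relu(-2.0)` returns `0.0` while `relu(3.0)` returns `3.0`, and `sigmoid(0.0)` returns exactly `0.5`.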
An optimization algorithm, Adam, that combines the strengths of two earlier methods: momentum-style averaging of gradients and RMSProp-style per-parameter learning rates based on squared gradients.
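The update rule can be sketched for a single scalar parameter as follows; the function name is illustrative and the hyperparameter defaults are the commonly cited ones:

```python
def adam_step(theta, grad, m, v, t, lr=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter theta at step t (t >= 1)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (momentum-like EMA)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment (RMSProp-like EMA)
    m_hat = m / (1 - beta1**t)               # bias correction for zero init
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (v_hat**0.5 + eps)
    return theta, m, v

# Minimizing f(x) = x^2 from x = 5.0:
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2.0 * theta, m, v, t, lr=0.1)
```

The bias-correction terms matter early in training: because `m` and `v` start at zero, their raw averages underestimate the true moments for small `t`.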
Artificial General Intelligence: a hypothetical AI system able to match or exceed human performance across virtually all cognitive tasks, rather than excelling only in a narrow domain.