
Knowledge Distillation

Training a smaller model to replicate the behavior of a larger one.

Definition

A smaller "student" model is trained to replicate the behavior of a larger "teacher" model. Rather than learning from hard labels alone, the student learns from the teacher's output probability distributions, which encode nuanced knowledge more efficiently (for instance, which incorrect answers the teacher considers nearly plausible). This is how companies create fast, cheap models that still perform well.
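In practice, this is often implemented by matching the student's softened output distribution to the teacher's with a KL-divergence term, blended with the usual hard-label cross-entropy. Below is a minimal PyTorch sketch; the temperature, loss weighting, and stand-in linear models are illustrative assumptions, not a fixed recipe.

# Minimal knowledge-distillation sketch (PyTorch). The temperature,
# alpha weighting, and tiny stand-in models are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Soften both distributions with the temperature, then compare them.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor keeps the soft-target gradients on a scale
    # comparable to the hard-label term.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# One training step: the teacher runs in inference mode; only the
# student's parameters are updated.
teacher = nn.Linear(128, 10)   # stand-in for a large pretrained model
student = nn.Linear(128, 10)   # smaller model being trained
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))
with torch.no_grad():
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
loss.backward()
optimizer.step()

A higher temperature spreads the teacher's probability mass across more classes, exposing more of its "dark knowledge" about near-miss answers; alpha trades that signal off against the ordinary labeled loss.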


Related Terms

Distillation

A technique where a smaller 'student' model learns to mimic a larger 'teacher' model.

Activation Function

A mathematical function applied to a neuron's output that introduces non-linearity into the network.

Adam Optimizer

An optimization algorithm that combines the strengths of two other methods, AdaGrad and RMSProp.

AGI

Artificial General Intelligence: a hypothetical AI system capable of matching or exceeding human performance across a wide range of tasks.

AI Alignment

The research field focused on making sure AI systems do what humans actually want them to do.

AI Safety

The broad field studying how to build AI systems that are safe, reliable, and beneficial.
