GELU (Gaussian Error Linear Unit). An activation function used in most modern transformers, including GPT and BERT. Smoother than ReLU, it weights each input by the probability that a standard Gaussian falls below it, so small negative values pass through scaled down rather than being zeroed. Empirically it outperforms ReLU on NLP tasks, though the theoretical reasons aren't fully understood.
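A minimal sketch of both the exact form and the common tanh approximation, using only Python's math module (function names here are illustrative):

```python
import math

def gelu_exact(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # The tanh approximation popularized by the BERT/GPT codebases.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

print(gelu_exact(-1.0))  # ~ -0.1587: a small negative value leaks through
print(gelu_tanh(-1.0))   # ~ -0.1588: the approximation tracks it closely
```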
Activation Function. A mathematical function applied to a neuron's output that introduces non-linearity into the network; without it, any stack of layers would collapse into a single linear transformation.
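A toy illustration with made-up weights and inputs, showing the raw linear output and the activated output (a sketch, not real training code):

```python
import numpy as np

w = np.array([0.5, -0.3, 0.8])   # hypothetical weights
x = np.array([1.0, 2.0, -1.0])   # hypothetical inputs
b = 0.1                          # hypothetical bias

z = np.dot(w, x) + b   # linear neuron output
a = np.tanh(z)         # the activation bends it, enabling non-linear models
print(z, a)            # -0.8 -> ~ -0.664
```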
ReLU (Rectified Linear Unit). An activation function that outputs its input when positive and zero otherwise. Cheap to compute and the long-standing default in deep networks.
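The whole function fits on one line; a quick numpy sketch:

```python
import numpy as np

def relu(x):
    # Zeroes out negatives, passes positives through unchanged.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]
```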
Transformer. The neural network architecture behind virtually all modern AI language models, introduced in the 2017 paper "Attention Is All You Need." Its core operation is self-attention, which lets every token in a sequence weigh every other token in parallel.
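A bare-bones sketch of single-head scaled dot-product self-attention in numpy (the shapes and weight matrices are made up for illustration; real transformers add multiple heads, masking, and learned projections):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # X: (tokens, dim). Project into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # pairwise similarities
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w = w / w.sum(-1, keepdims=True)                 # row-wise softmax
    return w @ V                                     # attention-weighted mix

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                          # 4 tokens, 8-dim each
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)           # (4, 8)
```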
Adam (Adaptive Moment Estimation). An optimization algorithm that combines ideas from two earlier methods, AdaGrad and RMSProp: it maintains exponentially decaying averages of past gradients and of past squared gradients, and uses them to set a separate adaptive learning rate for each parameter.
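A minimal sketch of one Adam update step with the paper's default hyperparameters (all names here are illustrative, not any particular library's API):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad           # decaying average of gradients
    v = b2 * v + (1 - b2) * grad ** 2      # decaying average of squares
    m_hat = m / (1 - b1 ** t)              # bias correction for zero init
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta = np.array([1.0, -2.0])              # toy parameters minimizing x^2
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 4):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
print(theta)                               # nudged toward zero each step
```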
AGI (Artificial General Intelligence). A hypothetical AI system able to match or exceed human performance across essentially any intellectual task, rather than within a single narrow domain.
AI Alignment. The research field focused on making sure AI systems do what humans actually intend, not merely what their objectives literally specify.