A self-supervised learning approach where the model learns by comparing similar and dissimilar pairs of examples. It pulls representations of similar items closer together and pushes different items apart in the embedding space. The core idea behind models like CLIP and SimCLR.
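The pull-together/push-apart idea can be sketched with an InfoNCE-style scoring step. This is a minimal numpy sketch, not the exact loss used by CLIP or SimCLR; the function name, temperature value, and toy vectors are illustrative.

```python
import numpy as np

def contrastive_scores(anchor, positive, negatives, temperature=0.1):
    """Score one positive pair against negatives (InfoNCE-style sketch).

    All vectors are L2-normalized so dot products are cosine similarities.
    A high probability on index 0 means the positive pair is "pulled together"
    relative to the negatives.
    """
    def norm(v):
        return v / np.linalg.norm(v)

    anchor = norm(anchor)
    candidates = np.stack([norm(positive)] + [norm(n) for n in negatives])
    logits = candidates @ anchor / temperature
    probs = np.exp(logits - logits.max())  # softmax, shifted for stability
    probs /= probs.sum()
    return probs  # probs[0] is the positive's probability

# Toy example: the positive points the same way as the anchor,
# the negatives point elsewhere.
anchor = np.array([1.0, 0.0])
positive = np.array([0.9, 0.1])
negatives = [np.array([0.0, 1.0]), np.array([-1.0, 0.2])]
probs = contrastive_scores(anchor, positive, negatives)
```

Training minimizes the negative log of `probs[0]`, which simultaneously raises the positive's similarity and lowers the negatives'.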
A training approach in which the model derives its own supervision labels from the raw data, with no human annotation required.
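A common way to create labels from the data itself is next-word prediction: every position in a sentence yields a free (context, target) training pair. A tiny illustrative sketch:

```python
# Self-supervision sketch: derive (input, label) pairs from raw text alone,
# here via next-word prediction. No human-written labels are involved;
# the text is both the input and the source of the targets.
text = "the cat sat on the mat"
tokens = text.split()

# Each prefix of the sentence predicts the word that follows it.
pairs = [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]
```

Masked-word prediction and the contrastive pairing described above are other instances of the same idea.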
Contrastive Language-Image Pre-training: a model from OpenAI that learns a shared embedding space for images and text by contrastive training on image-caption pairs.
A dense numerical representation of data (words, images, etc.) as a vector of real numbers, positioned so that semantically similar items lie close together in the vector space.
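"Similar items lie close together" can be checked with cosine similarity. The 4-dimensional vectors below are a made-up toy embedding table, not output from a real model:

```python
import numpy as np

# Hypothetical embedding table for a toy vocabulary; real embeddings
# are learned and typically have hundreds of dimensions.
embeddings = {
    "cat": np.array([0.9, 0.1, 0.0, 0.3]),
    "dog": np.array([0.8, 0.2, 0.1, 0.35]),
    "car": np.array([0.0, 0.9, 0.8, 0.1]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal ones.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words get nearby vectors, so their cosine is higher.
sim_cat_dog = cosine(embeddings["cat"], embeddings["dog"])
sim_cat_car = cosine(embeddings["cat"], embeddings["car"])
```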
A mathematical function applied to a neuron's output that introduces non-linearity into the network.
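Two of the most common activation functions, sketched in plain Python:

```python
import math

def relu(x):
    # ReLU: passes positive values through unchanged, zeroes out negatives.
    # The kink at zero is what makes it non-linear.
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))
```

Without a non-linearity like these, stacking layers would collapse into a single linear transformation, no matter how deep the network.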
An optimization algorithm (Adam) that combines the strengths of two earlier methods, AdaGrad and RMSProp, adapting the learning rate for each parameter individually.
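The standard Adam update for a single scalar parameter can be sketched as follows; the hyperparameter defaults match the original paper, but the function itself is an illustrative sketch rather than a library implementation:

```python
import math

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    m is a running mean of gradients (momentum-like first moment);
    v is a running mean of squared gradients, which scales the step
    per parameter in the RMSProp/AdaGrad spirit.
    """
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)  # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# One step from param=1.0 with gradient 0.5 moves the parameter
# by roughly the learning rate, regardless of the gradient's scale.
p, m, v = adam_step(1.0, 0.5, m=0.0, v=0.0, t=1)
```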
Artificial General Intelligence: a hypothetical AI system able to match or exceed human performance across a broad range of cognitive tasks, rather than in a single narrow domain.