Contrastive Learning

  • Contrastive learning can be used in supervised, unsupervised, and semi-supervised settings
  • Its main advantage is that it can learn useful representations from unlabeled data
  • In contrastive learning, each training example consists of an anchor, a positive, and one or more (in-batch) negative points
    • The main goal is to pull the anchor closer to the positive in the embedding space
    • And to push the anchor away from the negative(s)
    • In doing so, the model learns representations for the anchor, the positive, and the negative(s)
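The anchor/positive/negative geometry above can be sketched with toy 2-D embeddings (the vectors below are hand-picked and purely illustrative, not learned):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity: 1 means same direction, -1 means opposite."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

anchor   = np.array([1.0, 0.0])
positive = np.array([0.9, 0.1])   # nearby point: should end up close to the anchor
negative = np.array([-1.0, 0.2])  # dissimilar point: should end up far away

# Training would adjust the embeddings so this gap widens.
assert cosine(anchor, positive) > cosine(anchor, negative)
```

A contrastive loss is simply a function that is small when the first similarity is high and the second is low.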

Things to remember:

  1. Batch size is a very important hyperparameter for contrastive learning. A larger batch size is generally better, as it provides more diverse negative samples
  2. We want hard negatives (samples that are genuinely different from the anchor but close to it in embedding space), not false negatives (samples that are actually similar to the anchor but mistakenly treated as negatives)
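To make point 1 concrete: assuming a SimCLR-style setup (two augmented views per example, with every other in-batch embedding used as a negative — an assumption about the setup, not stated above), the number of negatives per anchor grows linearly with batch size:

```python
def num_in_batch_negatives(batch_size: int) -> int:
    """Per-anchor negative count in a two-view, in-batch-negative setup:
    the 2*batch_size views minus the anchor itself and its positive."""
    return 2 * (batch_size - 1)

print(num_in_batch_negatives(32))   # 62
print(num_in_batch_negatives(256))  # 510
```

This is why doubling the batch size roughly doubles the pool of negatives each anchor is contrasted against.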


Common contrastive losses:

  1. Standard Contrastive Loss
  2. Triplet Loss
  3. InfoNCE Loss
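Two of the losses above can be sketched in a few lines of NumPy (a minimal single-anchor sketch with my own function names, not a library implementation; real training code operates on batches):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet loss: hinge on the gap between the anchor-positive and
    anchor-negative Euclidean distances. Zero once the negative is at
    least `margin` farther away than the positive."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE: cross-entropy that tries to pick the positive out of
    the set {positive} + negatives by cosine similarity."""
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()                      # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])                    # positive sits at index 0
```

Both losses shrink as the anchor moves toward the positive and away from the negatives; InfoNCE additionally benefits from many negatives, which is why it pairs well with large batches.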