Learning Through Comparison: Use Cases of Contrastive Learning
ML+X: Machine Learning Beyond Traditional CS Fields
Event Details
ML+X Forum on Contrastive Learning - Yin Li & Chris Endemann
Summary: Contrastive learning is more than just a tool for big tech: it's a practical approach that improves models across a variety of machine learning tasks, from structured data to vision and NLP. At its core, contrastive learning transforms how models learn by focusing on relationships rather than rigid class assignments. Instead of training models to classify data into fixed categories, contrastive learning encourages them to structure representations based on similarity and dissimilarity between pairs of observations. For example, a positive pair might be two views of the same image, while a negative pair could be two unrelated images. This relational approach has been key to advances in feature learning, clustering, multimodal models, and out-of-distribution detection, helping models make better use of unlabeled data and generalize more effectively.

This forum will explore how contrastive learning works, why it's useful, and how you can integrate it into your workflows. We'll cover key methods, real-world applications, and practical ways to get started. Bring your challenges, ideas, and use cases, and let's discuss how contrastive learning can help solve real ML problems.
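To make the positive/negative-pair idea concrete, here is a minimal NumPy sketch of one common contrastive objective, the InfoNCE loss (the names `info_nce_loss`, `anchors`, `positives`, and the temperature value are illustrative choices, not something prescribed by the speakers): each row of `anchors` is paired with the matching row of `positives` (e.g., two views of the same image), and every other row in the batch serves as a negative.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Minimal InfoNCE contrastive loss (illustrative sketch).

    Row i of `anchors` and row i of `positives` form a positive pair;
    all other rows in the batch act as negatives for row i.
    """
    # L2-normalize embeddings so dot products are cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)

    # (N, N) similarity matrix; entry [i, j] compares anchor i to positive j.
    logits = a @ p.T / temperature

    # Cross-entropy where the "correct class" for row i is column i:
    # pull positive pairs together, push all other pairs apart.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
# Positives that are slightly perturbed views of the anchors...
aligned = info_nce_loss(x, x + 0.01 * rng.normal(size=(8, 16)))
# ...versus "positives" that are actually unrelated.
random_pairs = info_nce_loss(x, rng.normal(size=(8, 16)))
print(aligned, random_pairs)  # aligned views should yield a much lower loss
```

Nothing about this sketch is specific to images: the same loss applies whenever you can define positive pairs, which is what makes the approach portable across structured data, vision, and NLP.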
Registration: Please register for this event.
Miss a past forum? Most forums are recorded! Visit ML+X Nexus to get caught up: we regularly upload recordings of past ML+X forums and other AI/ML talks from around campus. Nexus is the ML+X community's centralized hub for sharing machine learning (ML) resources (e.g., educational materials, applications and blogs, useful scripts, and more).