Unsupervised Learning for Agents (ULA)
Discovering hidden patterns and structures in unlabeled data to enhance agent understanding
🎯 30-Second Overview
Pattern: Adapt models to new domains using unlabeled data through self-supervised learning objectives
Why: Leverages abundant unlabeled data to learn domain-specific patterns without expensive annotation requirements
Key Insight: Self-supervised pretext tasks create supervisory signals from data structure, enabling effective domain adaptation
⚡ Quick Implementation
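A minimal sketch of the most common instance of this pattern for text-based agents: continued masked-language-model (MLM) pretraining on unlabeled, in-domain text, here using the Hugging Face `transformers` library. The checkpoint name, sample corpus, output path, and hyperparameters below are illustrative assumptions, not prescriptions.

```python
# Minimal sketch: domain-adaptive pretraining with masked language modeling (MLM).
# Assumes `torch` and `transformers`; checkpoint, corpus, and hyperparameters are placeholders.
import torch
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

# Unlabeled, in-domain text -- the only "supervision" comes from masking it.
domain_texts = [
    "Patient presented with acute dyspnea and bilateral infiltrates.",
    "Dosage was titrated to effect over a 72-hour observation window.",
]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

class DomainCorpus(torch.utils.data.Dataset):
    """Wraps raw strings as tokenized examples for the MLM collator."""
    def __init__(self, texts):
        self.encodings = tokenizer(texts, truncation=True, max_length=128)
    def __len__(self):
        return len(self.encodings["input_ids"])
    def __getitem__(self, idx):
        return {key: values[idx] for key, values in self.encodings.items()}

# The collator randomly masks 15% of tokens; predicting them is the pretext task.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ula-domain-adapted",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=DomainCorpus(domain_texts),
    data_collator=collator,
)
trainer.train()  # the adapted encoder then backs the agent's domain understanding
```

The same skeleton carries over to other pretext tasks by swapping the collator (e.g., span masking); the key property is that the training labels are derived from the unlabeled data itself.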
📋 Do's & Don'ts
🚦 When to Use
Use When
- Large amounts of unlabeled domain data are available
- Labeled data is expensive or impossible to obtain
- Need to adapt to new domains with different distributions
- Self-supervised signals can be extracted from data structure
- Domain has rich inherent patterns and regularities
Avoid When
- High-quality labeled data is readily available
- Domain lacks clear self-supervised signals
- Computational resources are severely limited
- Immediate deployment is required, with no time for adaptation
- Simple transfer learning is sufficient
📊 Key Metrics
💡 Top Use Cases
References & Further Reading
Deepen your understanding with these curated resources
Foundational Self-Supervised Learning
Self-Supervised Learning: Generative or Contrastive (Liu et al., 2021)
A Simple Framework for Contrastive Learning of Visual Representations (Chen et al., 2020)
Momentum Contrast for Unsupervised Visual Representation Learning (He et al., 2020)
Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning (Grill et al., 2020)
Language Model Self-Supervision
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018)
RoBERTa: A Robustly Optimized BERT Pretraining Approach (Liu et al., 2019)
DeBERTa: Decoding-enhanced BERT with Disentangled Attention (He et al., 2020)
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (Clark et al., 2020)
Domain Adaptation Techniques
Contrastive Learning Methods
SimCLR: A Simple Framework for Contrastive Learning of Visual Representations (Chen et al., 2020)
SwAV: Unsupervised Learning of Visual Features by Contrasting Cluster Assignments (Caron et al., 2020)
SimCSE: Simple Contrastive Learning of Sentence Embeddings (Gao et al., 2021)
SupCon: Supervised Contrastive Learning (Khosla et al., 2020)
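The methods above differ in their augmentations and encoders, but they share one objective: pull two views of the same unlabeled example together and push the other examples in the batch apart. Below is a minimal sketch of that shared loss (NT-Xent / InfoNCE, the form used by SimCLR and SimCSE), assuming paired embeddings z1[i] and z2[i] have already been produced by some encoder.

```python
# Minimal sketch of the shared contrastive objective (NT-Xent / InfoNCE).
# Assumes z1[i] and z2[i] are embeddings of two views (augmentations or
# dropout passes) of the same unlabeled example; not tied to any one paper's code.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    z1 = F.normalize(z1, dim=-1)                           # unit vectors -> dot product = cosine similarity
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                     # (N, N) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)   # positives sit on the diagonal
    return F.cross_entropy(logits, targets)                # other rows act as in-batch negatives

# Example: 32 pairs of 256-dim embeddings from any encoder
loss = nt_xent_loss(torch.randn(32, 256), torch.randn(32, 256))
```

Temperature and batch size are the main knobs: a larger batch supplies more in-batch negatives, and a lower temperature sharpens the similarity distribution.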
Masked Language Modeling
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018)
SpanBERT: Improving Pre-training by Representing and Predicting Spans (Joshi et al., 2020)
ERNIE: Enhanced Representation through Knowledge Integration (Sun et al., 2019)
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (Lan et al., 2019)
Recent Advances
Multimodal Self-Supervision
CLIP: Learning Transferable Visual Models from Natural Language Supervision (Radford et al., 2021)
ALIGN: Scaling Up Visual and Vision-Language Representation Learning with Noisy Text Supervision (Jia et al., 2021)
Florence: A New Foundation Model for Computer Vision (Yuan et al., 2021)
DALL-E 2: Hierarchical Text-Conditional Image Generation with CLIP Latents (Ramesh et al., 2022)
Cross-Domain Adaptation
Domain Adaptation for Neural Networks: A Review (Farahani et al., 2021)
Unsupervised Cross-domain Representation Learning (Hoffman et al., 2018)
Deep Transfer Learning: A New Deep Learning Research Direction (Tan et al., 2018)
A Survey on Deep Transfer Learning and Domain Adaptation (Wilson & Cook, 2020)
Contribute to this collection
Know a great resource? Submit a pull request to add it.