PromptPapers is a curated repository that collects and organizes influential research papers on prompt-based tuning for large-scale pre-trained language models. It serves as a centralized reading list for researchers and practitioners interested in prompt learning, which differs from traditional fine-tuning by reformulating downstream tasks in the form of the model's pre-training objective (e.g., cloze-style prediction), so the pre-trained model can produce predictions directly instead of relying on a separately trained task head. Papers are grouped into sections such as overview, pilot work, basics, analysis, improvements, and specializations, helping readers trace the evolution of the field systematically. Keyword conventions describe each paper's methods, tasks, and properties, making it easier to compare different approaches. PromptPapers emphasizes community contribution, encouraging researchers to submit updates and add new papers to keep the list current.
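The cloze-style reformulation mentioned above can be sketched in a few lines. This is a minimal, model-free illustration: the template string, the verbalizer words, and the function names are illustrative assumptions, not taken from any specific paper in the list.

```python
# Minimal illustration of prompt-based classification:
# wrap the input in a cloze template, then map the word a
# masked language model would predict back to a task label.

def build_prompt(text: str) -> str:
    """Wrap an input sentence in a cloze template (illustrative)."""
    return f"{text} It was [MASK]."

# Verbalizer: maps label words the model might predict to task labels.
VERBALIZER = {"great": "positive", "terrible": "negative"}

def label_from_prediction(predicted_word: str) -> str:
    """Translate the model's filled-in word into a class label."""
    return VERBALIZER.get(predicted_word, "unknown")

prompt = build_prompt("The plot kept me hooked.")
print(prompt)                          # The plot kept me hooked. It was [MASK].
# Assume a masked LM filled [MASK] with "great":
print(label_from_prediction("great"))  # positive
```

In a real setup, the `[MASK]` slot would be filled by a pre-trained masked language model, and the verbalizer would cover the full label space; many papers in the list study how to choose or learn these templates and verbalizers.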
Features
- Curated list of prompt-based tuning research papers
- Organized sections covering trends, basics, and advanced topics
- Keyword conventions describing methods and research focus
- Includes both foundational and recent works in prompt learning
- Community-driven updates via pull requests
- Highlights related areas like parameter-efficient tuning