Author:
Akari Asai, Mohammadreza Salehi, Matthew Peters, Hannaneh Hajishirzi
Publisher:
Association for Computational Linguistics
Cited by:
9 articles.
1. PanDa: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation;IEEE Transactions on Knowledge and Data Engineering;2024-09
2. Multi-Task Learning in Natural Language Processing: An Overview;ACM Computing Surveys;2024-07-25
3. When MOE Meets LLMs: Parameter Efficient Fine-tuning for Multi-task Medical Applications;Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval;2024-07-10
4. Device-Edge Cooperative Fine-Tuning of Foundation Models as a 6G Service;IEEE Wireless Communications;2024-06
5. Snapshot Prompt Ensemble for Parameter-Efficient Soft Prompt Transfer;ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP);2024-04-14