Rethinking Resource Management in Edge Learning: A Joint Pre-Training and Fine-Tuning Design Paradigm
Abstract
Publisher
IEEE Press
Qualifiers
- Research-article