Transfer learning is important for adapting foundation models to downstream tasks. However, many foundation models are proprietary, so users must share their data with the model owners to fine-tune the models, which is costly and raises privacy concerns. Moreover, fine-tuning large foundation models is computation-intensive and impractical for most downstream users. In this paper, we propose Offsite-Tuning, a privacy-preserving and efficient transfer learning framework that can adapt billion-parameter foundation models to downstream data without access to the full model.
2023: Guangxuan Xiao, Ji Lin, Song Han
https://arxiv.org/pdf/2302.04870v1.pdf