Transfer Learning with Kernel Methods

Nature Communications
Abstract

Transfer learning refers to the process of adapting a model trained on a source task to a target task. While kernel methods are conceptually and computationally simple models that are competitive on a variety of tasks, it has been unclear how to develop scalable kernel-based transfer learning methods across general source and target tasks with possibly differing label dimensions. In this work, we propose a transfer learning framework for kernel methods by projecting and translating the source model to the target task. We demonstrate the effectiveness of our framework in applications to image classification and virtual drug screening. For both applications, we identify simple scaling laws that characterize the performance of transfer-learned kernels as a function of the number of target examples. We explain this phenomenon in a simplified linear setting, where we are able to derive the exact scaling laws.
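The abstract describes transferring a kernel model by projecting and translating it to the target task. As an illustration only (not the authors' published method), the sketch below realizes one plausible reading of those two steps with kernel ridge regression: "project" the source model's outputs onto the target label space with a learned linear map (which handles differing label dimensions), then "translate" by fitting a small kernel model to the remaining residuals on the target data. All function names and the synthetic data are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, Y, reg=1e-3, gamma=1.0):
    # Solve (K + reg*I) A = Y for the dual coefficients A.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + reg * np.eye(len(X)), Y)

def kernel_ridge_predict(A, X_train, X, gamma=1.0):
    return rbf_kernel(X, X_train, gamma) @ A

rng = np.random.default_rng(0)

# Synthetic example: source task has 3-dimensional labels,
# target task has 2-dimensional labels (differing label dimensions).
Xs, Ys = rng.normal(size=(200, 5)), rng.normal(size=(200, 3))
Xt, Yt = rng.normal(size=(40, 5)), rng.normal(size=(40, 2))

# 1) Train the source kernel model.
As = kernel_ridge_fit(Xs, Ys)

# 2) Projection: learn a linear map from source-model outputs
#    to target labels via least squares.
Fs = kernel_ridge_predict(As, Xs, Xt)       # source predictions on target inputs
W, *_ = np.linalg.lstsq(Fs, Yt, rcond=None)

# 3) Translation: fit a small kernel model on the target residuals.
At = kernel_ridge_fit(Xt, Yt - Fs @ W)

def transfer_predict(X):
    # Projected source predictions plus the residual correction.
    return kernel_ridge_predict(As, Xs, X) @ W + kernel_ridge_predict(At, Xt, X)
```

In this reading, the projection step alone already adapts the source model to a new label dimension, and the translation step lets target data correct what the source model cannot explain; the paper's scaling laws describe how performance improves as the number of target examples grows.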

Year of Publication
2023
Journal
Nature Communications
Volume
14
Issue
1
Pages
5570
Date Published
09/2023
ISSN
2041-1723
DOI
10.1038/s41467-023-41215-8
PubMed ID
37689796