Affiliation:
1. Faculty of Science and Technology of Fez, Department of Mathematics, Modelling and Mathematical Structures Laboratory, Sidi Mohamed Ben Abdellah University, Fez, Morocco
Abstract
Nowadays, transfer learning has shown promising results in many applications. However, most deep transfer-learning methods, such as parameter sharing and fine-tuning, still suffer from the lack of a parameter-transmission strategy. In this paper, we propose a new optimization model for parameter-based transfer learning in convolutional neural networks, named STP-CNN. Specifically, we propose a Lasso transfer model supported by a regularization term that controls transferability, and we solve the proposed model with the proximal gradient descent method. Under certain conditions, the suggested technique controls exactly which parameters in each convolutional layer of the source network are used directly or adjusted in the target network. Several experiments demonstrate the performance of our model in locating the transferable parameters as well as in improving data classification.
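To make the mechanism concrete, the sketch below shows, under stated assumptions, one way a Lasso-style transfer penalty can be optimized with proximal gradient descent in PyTorch. The abstract does not reproduce the exact STP-CNN formulation, so this sketch assumes an L1 penalty on the deviation of the target weights from frozen source weights; the proximal operator of that penalty soft-thresholds the deviation, so any parameter whose update stays within the threshold is snapped back exactly to its source value (transferred directly), while the rest are adjusted. The names `prox_l1_to_source`, `proximal_sgd_step`, and `source_params` are illustrative assumptions, not the authors' code.

```python
import torch

def prox_l1_to_source(w, w_src, thresh):
    """Proximal operator of thresh * ||w - w_src||_1 (assumed penalty form).

    Soft-thresholds the deviation from the source weights: entries whose
    deviation is below `thresh` are set exactly to the source value,
    i.e. transferred unchanged; larger deviations are shrunk toward it.
    """
    diff = w - w_src
    return w_src + torch.sign(diff) * torch.clamp(diff.abs() - thresh, min=0.0)


def proximal_sgd_step(model, source_params, loss, lr, lam):
    """One proximal gradient step on the target network (illustrative).

    `source_params` maps the names of penalized convolutional parameters to
    the frozen source-network tensors; `lam` controls how strongly the
    target parameters are pulled toward (and frozen at) their source values.
    """
    model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for name, p in model.named_parameters():
            p -= lr * p.grad                      # ordinary gradient step on the data loss
            if name in source_params:             # layer subject to the transfer penalty
                p.copy_(prox_l1_to_source(p, source_params[name], lr * lam))
```

In this reading, the threshold `lr * lam` is what makes transferability controllable: raising `lam` forces more source parameters to be reused exactly, while lowering it lets more parameters be fine-tuned on the target data.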