FedDyn Combined with Dynamic Federated Distillation Approach on Recommender System

Authors

  • Yu Sun

DOI:

https://doi.org/10.61173/vrn16s52

Keywords:

FedDyn Algorithm, Federated Learning, Recommender System, Knowledge Distillation

Abstract

This article introduces the background of federated learning, which emerged because distributed data and privacy requirements limit traditional centralized training. Federated learning itself, however, faces issues such as bandwidth constraints and communication power consumption, and existing optimization algorithms each have their own shortcomings. The main focus of this paper is to study federated learning algorithms and to optimize federated recommendation algorithms so as to address data privacy and communication overhead. Two reference algorithms are highlighted: the Federated Dynamic Regularizer (FedDyn), which dynamically adjusts the local loss function so that local models converge toward the global optimum, and Dynamic Federated Distillation, which compresses models to reduce communication cost while improving the reliability of the transferred knowledge and safeguarding privacy. By combining the two algorithms orthogonally, the proposed approach leverages the advantages of both. Experiments on three datasets, MovieLens-1M, MovieLens-100K, and Pinterest, compare the optimized algorithm FedDyn-DF with baseline methods such as FedAvg and FedProx, as well as with the original algorithms. The results show that FedDyn-DF outperforms these methods, converging faster while also protecting privacy. Finally, the paper discusses the limitations of the experiments and future research directions.
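To make the combination described in the abstract concrete, the sketch below shows one way a FedDyn-style client objective (task loss, linear correction term, and proximal term toward the last global model) could be combined with a knowledge-distillation term against an aggregated teacher model. This is a minimal illustration, not the authors' code: the names `client_model`, `teacher_model`, `global_params`, `h_state`, `alpha`, and `distill_weight` are assumptions introduced here, and the actual FedDyn-DF implementation may differ.

```python
# Illustrative sketch only (assumes a PyTorch classification/recommendation model).
# FedDyn local objective: f_k(theta) - <h_k, theta> + (alpha/2) * ||theta - theta_global||^2,
# plus a KL distillation term toward a frozen teacher as a stand-in for the
# federated distillation component described in the abstract.
import torch
import torch.nn.functional as F

def feddyn_df_local_loss(client_model, teacher_model, global_params, h_state,
                         batch_x, batch_y, alpha=0.01, distill_weight=0.5,
                         temperature=2.0):
    logits = client_model(batch_x)
    task_loss = F.cross_entropy(logits, batch_y)

    # FedDyn dynamic regularizer: linear correction <h_k, theta> plus a
    # proximal term pulling the local parameters toward the last global model.
    flat_theta = torch.cat([p.view(-1) for p in client_model.parameters()])
    flat_global = torch.cat([p.view(-1) for p in global_params])
    linear_term = torch.dot(h_state, flat_theta)
    prox_term = 0.5 * alpha * torch.sum((flat_theta - flat_global) ** 2)

    # Distillation term: soften the client's predictions toward the frozen
    # teacher (e.g., the aggregated global model) at a given temperature.
    with torch.no_grad():
        teacher_logits = teacher_model(batch_x)
    distill_loss = F.kl_div(
        F.log_softmax(logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    return task_loss - linear_term + prox_term + distill_weight * distill_loss
```

After local training, FedDyn updates the client state as h_k <- h_k - alpha * (theta_k - theta_global), which is what drives local solutions toward the global optimum across rounds; the distillation term additionally lets clients exchange compressed knowledge (soft predictions) rather than full models, reducing communication. The exact schedule and weighting used in FedDyn-DF are specified in the paper, not here.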


Published

2024-12-31

Section

Articles