Federated learning aims to protect data privacy by collaboratively learning a model without sharing private data among users. However, an adversary may still be able to infer the private training data by attacking the released model. Differential privacy (DP) provides a statistical guarantee against such attacks, at the price of possibly degrading the accuracy or utility of the trained models. In this paper, we apply a utility enhancement scheme based on Laplacian smoothing to differentially private federated learning (DP-Fed-LS), which improves the statistical precision of parameter aggregation with injected Gaussian noise. We provide tight closed-form privacy bounds for both uniform and Poisson subsampling, and derive corresponding DP guarantees for differentially private federated learning, with or without Laplacian smoothing. Experiments on the MNIST, SVHN and Shakespeare datasets show that the proposed method can improve model accuracy with a DP guarantee under both subsampling mechanisms.
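The following minimal Python sketch illustrates the aggregation step described above, under assumptions of ours rather than details from the paper: the function and parameter names (laplacian_smooth, dp_fed_ls_aggregate, clip_norm, noise_mult, sigma) are illustrative, and the smoothing operator is taken to be the standard circulant Laplacian of a 1-D cycle, inverted via FFT. Each client update is clipped, the clipped updates are averaged, Gaussian noise is injected for DP, and the noisy average is denoised by Laplacian smoothing.

```python
import numpy as np

def laplacian_smooth(v, sigma=1.0):
    """Apply (I + sigma * L)^{-1} v, where L is the graph Laplacian of a
    1-D cycle. Since I + sigma*L is circulant, the solve reduces to a
    division in Fourier space (all eigenvalues are >= 1, so this is stable)."""
    d = len(v)
    c = np.zeros(d)            # first column of I + sigma * L
    c[0] = 1.0 + 2.0 * sigma
    c[1] = -sigma
    c[-1] = -sigma
    return np.real(np.fft.ifft(np.fft.fft(v) / np.fft.fft(c)))

def dp_fed_ls_aggregate(client_updates, clip_norm, noise_mult, sigma=1.0):
    """One illustrative round of DP federated averaging with Laplacian
    smoothing: clip each client's update to bound sensitivity, average,
    inject Gaussian noise, then smooth the noisy average."""
    clipped = [u * min(1.0, clip_norm / (np.linalg.norm(u) + 1e-12))
               for u in client_updates]
    avg = np.mean(clipped, axis=0)
    noise_std = noise_mult * clip_norm / len(client_updates)
    noisy = avg + np.random.normal(0.0, noise_std, size=avg.shape)
    return laplacian_smooth(noisy, sigma)
```

Setting sigma=0 recovers plain DP federated averaging; the smoothing step trades a small bias for reduced variance in the noisy aggregate, which is the utility enhancement the abstract refers to.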
14 May 2020
11am - 12pm
Where
https://hkust.zoom.us/j/91364836963
Speakers/Performers
Mr. Zhicong LIANG
HKUST
Organizer(s)
Department of Mathematics
Contact/Enquiries
mathseminar@ust.hk
Audience
Alumni, Faculty and Staff, PG Students, UG Students
Language(s)
English