2018-10-10
Stochastic Gradient Descent with Differentially Private Updates
Publication
In recent decades, the amount of available data has grown immensely, and much of it is private or sensitive. Protecting the privacy of this data is essential, which is why algorithms that can operate on it without violating privacy have become crucial. Differential privacy provides a framework for designing such algorithms. In this paper we propose differentially private versions of single-point and mini-batch stochastic gradient descent (SGD) and use them to optimize the logistic regression objective. We test the algorithms on several data sets of varying sizes. We conclude that the performance of mini-batch differentially private SGD is very close to that of non-private SGD, in contrast to single-point differentially private SGD, which does not converge and has high variance. This holds for both low- and high-dimensional problems. We also conclude that choosing the hyperparameters is not straightforward. All of the results above are obtained with a single pass through the data sets. We also test the effect of making multiple passes through the data set for single-point differentially private SGD; this lowers the level of privacy and does not improve performance as much as mini-batching does.
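For readers unfamiliar with the technique summarized in the abstract, the sketch below illustrates one common formulation of mini-batch differentially private SGD for L2-regularized logistic regression, in which spherical noise is added to the batch-averaged gradient. The function name, parameter values, noise mechanism, and the assumptions that labels are in {-1, +1} and features are scaled to unit norm are illustrative choices made here, not details taken from the thesis.

```python
import numpy as np

def dp_sgd_logistic(X, y, epsilon, batch_size=50, learning_rate=0.1, lam=1e-3, seed=0):
    """Minimal sketch of mini-batch differentially private SGD for
    L2-regularized logistic regression.

    Assumptions (not taken from the thesis): labels y are in {-1, +1} and
    rows of X are scaled to unit norm, so each per-example gradient of the
    logistic loss has norm at most 1. The noise has density proportional
    to exp(-(epsilon / 2) * ||z||), a standard choice for this setting.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    indices = rng.permutation(n)  # single pass over the data, as in the abstract
    for start in range(0, n - batch_size + 1, batch_size):
        batch = indices[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        # Gradient of the logistic loss log(1 + exp(-y w^T x)), averaged over the batch
        margins = yb * (Xb @ w)
        grad = -(Xb * (yb / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        # Spherical noise: uniform direction, norm drawn from Gamma(d, 2 / epsilon)
        direction = rng.normal(size=d)
        direction /= np.linalg.norm(direction)
        noise = rng.gamma(shape=d, scale=2.0 / epsilon) * direction
        # Averaging over the batch lets the noise be scaled down by the batch size
        w -= learning_rate * (lam * w + grad + noise / batch_size)
    return w
```

Setting `batch_size=1` recovers the single-point variant, where the noise is not scaled down and, as the abstract reports, convergence suffers.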
Additional Metadata | |
---|---|
Thesis advisor | Birbil, S.I. |
Persistent URL | hdl.handle.net/2105/43548 |
Series | Econometrie |
Organisation | Erasmus School of Economics |
Citation | Hardwarsing, R. (2018, October 10). Stochastic Gradient Descent with Differentially Private Updates. Econometrie. Retrieved from http://hdl.handle.net/2105/43548 |