2020-01-21
Solving Linear Programs to Train Support Vector Machines Under Differential Privacy
Publication
Differential privacy is a framework for protecting sensitive data that can be used to produce machine learning models privately. The challenge is to manage the privacy-accuracy trade-off while still obtaining acceptable modelling results from private machine learning algorithms. Support Vector Machines are a class of machine learning methods for solving classification problems. Algorithms for training Support Vector Machines under differential privacy exist and have been shown to produce models that perform almost as well as non-private models. Unlike these existing private solutions, which rely on quadratic programming formulations, Support Vector Machines can also be trained by solving accompanying linear programming formulations. Solving these privately is an interesting application for both the machine learning and linear optimization fields. This research explores privately solving linear programs to train Support Vector Machines by putting a theoretical framework to the test. Although the resulting algorithms do not outperform the state of the art in private machine learning modelling, this research paves the way towards solving different kinds of problems adequately while maintaining differential privacy.
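The abstract refers to training a Support Vector Machine through a linear programming formulation rather than the usual quadratic program. As a rough illustration of what such a formulation can look like (not the algorithm studied in the thesis), the sketch below solves the L1-regularised hinge-loss SVM as a linear program with `scipy.optimize.linprog` and then adds Laplace noise to the solution as a toy output-perturbation step. The function names and the `noise_scale` parameter are illustrative assumptions, and the noise is not calibrated to any formal privacy guarantee.

```python
# Illustrative sketch only: an L1-SVM trained as a linear program, followed by
# naive output perturbation. Not the thesis's algorithm.
import numpy as np
from scipy.optimize import linprog


def lp_svm_fit(X, y, C=1.0):
    """Solve the L1-regularised hinge-loss SVM as a linear program.

    Decision variables are stacked as [w (d), b (1), u (d), xi (n)],
    where u bounds |w| componentwise and xi are the hinge-loss slacks.
    Labels y must be in {-1, +1}.
    """
    n, d = X.shape
    n_vars = d + 1 + d + n

    # Objective: minimise sum(u) + C * sum(xi); w and b carry no direct cost.
    c = np.concatenate([np.zeros(d + 1), np.ones(d), C * np.ones(n)])

    # Margin constraints: -y_i (w . x_i + b) - xi_i <= -1
    A_margin = np.zeros((n, n_vars))
    A_margin[:, :d] = -y[:, None] * X
    A_margin[:, d] = -y
    A_margin[np.arange(n), d + 1 + d + np.arange(n)] = -1.0
    b_margin = -np.ones(n)

    # |w_j| <= u_j, encoded as w_j - u_j <= 0 and -w_j - u_j <= 0
    A_abs = np.zeros((2 * d, n_vars))
    A_abs[:d, :d] = np.eye(d)
    A_abs[d:, :d] = -np.eye(d)
    A_abs[:d, d + 1:d + 1 + d] = -np.eye(d)
    A_abs[d:, d + 1:d + 1 + d] = -np.eye(d)
    b_abs = np.zeros(2 * d)

    A_ub = np.vstack([A_margin, A_abs])
    b_ub = np.concatenate([b_margin, b_abs])

    # w and b are free; u and xi are non-negative.
    bounds = [(None, None)] * (d + 1) + [(0, None)] * (d + n)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, b = res.x[:d], res.x[d]
    return w, b


def output_perturbation(w, b, noise_scale, seed=None):
    """Add Laplace noise to the learned parameters (toy output perturbation).

    A real privacy claim would require calibrating noise_scale to the
    sensitivity of the training procedure, which this sketch does not attempt.
    """
    rng = np.random.default_rng(seed)
    return (w + rng.laplace(scale=noise_scale, size=w.shape),
            b + rng.laplace(scale=noise_scale))


# Example usage on toy data (labels in {-1, +1}):
#   w, b = lp_svm_fit(X_train, y_train, C=1.0)
#   w_priv, b_priv = output_perturbation(w, b, noise_scale=0.1)
```

Output perturbation is only one of several mechanisms used in the differential privacy literature (objective perturbation and noisy solvers are others); which mechanism and which noise calibration apply here is determined by the thesis itself, not by this sketch.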
Additional Metadata | |
---|---|
Thesis Advisor | Birbil, S.I. |
Persistent URL | hdl.handle.net/2105/51698 |
Series | Econometrie |
Organisation | Erasmus School of Economics |
Reijden, P. van der. (2020, January 21). Solving Linear Programs to Train Support Vector Machines Under Differential Privacy. Econometrie. Retrieved from http://hdl.handle.net/2105/51698