Solving Linear Programs to Train Support Vector Machines Under Differential Privacy
Differential privacy is a framework for protecting sensitive data that can be used to obtain privately trained machine learning models. The challenge is to balance the privacy-accuracy trade-off while still obtaining acceptable modelling results from private machine learning algorithms. Support Vector Machines are a class of machine learning methods for solving classification problems. Algorithms for training Support Vector Machines under differential privacy exist and have been shown to produce models that perform almost as well as non-private models. Unlike these existing private solutions, which rely on quadratic programming formulations, Support Vector Machines can also be trained by solving accompanying linear programming formulations. Solving these privately is an interesting application for both the machine learning and linear optimization fields. This research explores privately solving linear programs to train Support Vector Machines by putting a theoretical framework to the test. Although the resulting algorithms do not outperform the state of the art in private machine learning modelling, this research paves a way towards solving different kinds of problems adequately while maintaining differential privacy.
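The linear programming route mentioned above can be illustrated without the privacy machinery: a soft-margin linear SVM with an L1-norm objective is itself a linear program. The sketch below is a hypothetical, non-private illustration (it is not the thesis's algorithm): the weight vector is split as w = u - v with u, v >= 0 so the 1-norm becomes a linear objective, and the margin constraints y_i(w·x_i + b) >= 1 - xi_i are handled with slack variables xi, solved here with SciPy's `linprog`.

```python
import numpy as np
from scipy.optimize import linprog

def train_lp_svm(X, y, C=1.0):
    """Train a linear soft-margin SVM via a linear program
    (illustrative sketch, not the thesis's private method):
        min  sum(u) + sum(v) + C * sum(xi)
        s.t. y_i ((u - v) . x_i + b) >= 1 - xi_i,   u, v, xi >= 0.
    """
    n, d = X.shape
    # Decision vector layout: [u (d entries), v (d entries), b (1), xi (n entries)]
    c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
    # Margin constraints rewritten in A_ub @ z <= b_ub form:
    # -y_i x_i . u + y_i x_i . v - y_i b - xi_i <= -1
    Yx = y[:, None] * X
    A_ub = np.hstack([-Yx, Yx, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    # u, v, xi are nonnegative; the bias b is free.
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    z = res.x
    w = z[:d] - z[d:2 * d]
    b = z[2 * d]
    return w, b

# Tiny linearly separable example.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_lp_svm(X, y, C=10.0)
pred = np.sign(X @ w + b)
```

A differentially private version would not solve this LP exactly; instead it would perturb or iteratively approximate the solution, which is where the accuracy loss discussed in the abstract arises.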
Keywords: support vector machines, differential privacy, machine learning, linear programming, dense multiplicative weights, linear classification
Thesis Advisor: Birbil, S.I.
Reijden, P. van der. (2020, January 21). Solving Linear Programs to Train Support Vector Machines Under Differential Privacy. Econometrie. Retrieved from http://hdl.handle.net/2105/51698