Differential privacy is a framework for protecting sensitive data that can be used to produce machine learning models privately. The challenge is to manage the privacy-accuracy trade-off so that private machine learning algorithms still yield acceptable modelling results. Support Vector Machines are a class of machine learning methods for solving classification problems. Algorithms for training Support Vector Machines under differential privacy exist and have been shown to produce models that perform almost as well as non-private ones. Unlike these existing private solutions, which rely on quadratic programming formulations, Support Vector Machines can also be trained by solving accompanying linear programming formulations. Solving these privately is an interesting application for both the machine learning and linear optimization fields. This research explores privately solving linear programs to train Support Vector Machines by putting a theoretical framework to the test. Although the resulting algorithms do not outperform the state of the art in private machine learning modelling, this research paves the way towards solving other kinds of problems adequately while maintaining differential privacy.
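To illustrate the linear programming route mentioned above (without any privacy mechanism), the following sketch trains a linear SVM by solving the standard L1-regularised soft-margin formulation as an LP with `scipy.optimize.linprog`. This is a generic textbook formulation, not the specific algorithm studied in the thesis; the dataset, the regularisation constant `C`, and the helper name `lp_svm` are illustrative assumptions.

```python
# Minimal sketch: L1-norm soft-margin SVM as a linear program.
# Not the thesis's private algorithm -- a plain, non-private LP-SVM.
import numpy as np
from scipy.optimize import linprog

def lp_svm(X, y, C=1.0):
    """Solve  min  sum_j a_j + C * sum_i xi_i
       s.t.   y_i (w . x_i + b) >= 1 - xi_i,  -a_j <= w_j <= a_j,  xi >= 0.
    Variable vector layout: [w (d), a (d), b (1), xi (n)]."""
    n, d = X.shape
    c = np.concatenate([np.zeros(d), np.ones(d), [0.0], C * np.ones(n)])
    # Margin constraints rewritten as  -y_i (w . x_i + b) - xi_i <= -1.
    A1 = np.hstack([-y[:, None] * X, np.zeros((n, d)), -y[:, None], -np.eye(n)])
    b1 = -np.ones(n)
    # |w_j| <= a_j split into  w_j - a_j <= 0  and  -w_j - a_j <= 0.
    A2 = np.hstack([np.eye(d), -np.eye(d), np.zeros((d, 1 + n))])
    A3 = np.hstack([-np.eye(d), -np.eye(d), np.zeros((d, 1 + n))])
    A_ub = np.vstack([A1, A2, A3])
    b_ub = np.concatenate([b1, np.zeros(2 * d)])
    bounds = ([(None, None)] * d    # w free
              + [(0, None)] * d     # a >= 0 (bounds |w|)
              + [(None, None)]      # b free
              + [(0, None)] * n)    # slacks xi >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, b = res.x[:d], res.x[2 * d]
    return w, b

# Toy linearly separable data (illustrative only).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = lp_svm(X, y)
pred = np.sign(X @ w + b)
```

A private variant would perturb or privately solve this LP rather than optimise it exactly, which is where the privacy-accuracy trade-off discussed above enters.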

Additional Metadata
Keywords support vector machines, differential privacy, machine learning, linear programming, dense multiplicative weights, linear classification
Thesis Advisor Birbil, S.I.
Persistent URL hdl.handle.net/2105/51698
Series Econometrie
Citation
Reijden, P. van der. (2020, January 21). Solving Linear Programs to Train Support Vector Machines Under Differential Privacy. Econometrie. Retrieved from http://hdl.handle.net/2105/51698