Description
In recent years, gradient boosted decision tree learning has proven to be an effective method for training robust models. Moreover, collaborative learning among multiple parties can benefit all parties involved, but each party must be cautious about how it shares sensitive data due to regulatory, business, and liability concerns.
We propose Secure XGBoost, an oblivious gradient boosting system that enables multiparty computation. Secure XGBoost is the first system of its kind. It builds on XGBoost, a state-of-the-art gradient boosting library designed for the single-party setting without security guarantees, and executes queries agreed upon in advance by multiple parties inside hardware enclaves. Notably, Secure XGBoost introduces (i) a new system design that facilitates secure collaboration, tailored toward the outsourced computation model, and (ii) oblivious algorithms for gradient boosted decision tree training and inference.
We find that our implementation of Secure XGBoost, which provides data encryption, authentication, and computation integrity, is 0.23x to 12.5x slower than XGBoost; obliviousness comes with 2-3 orders of magnitude of overhead.