Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data

Sergei Popov, Stanislav Morozov, Artem Babenko

Keywords: dnn, ensembles, optimization, representation learning, tabular data

Tuesday Session 2 (08:00-10:00 GMT)
Tuesday Session 3 (12:00-14:00 GMT)

Abstract: Nowadays, deep neural networks (DNNs) have become the main instrument for machine learning tasks within a wide range of domains, including vision, NLP, and speech. Meanwhile, in an important case of heterogeneous tabular data, the advantage of DNNs over shallow counterparts remains questionable. In particular, there is no sufficient evidence that deep learning machinery allows constructing methods that outperform gradient boosting decision trees (GBDT), which are often the top choice for tabular problems. In this paper, we introduce Neural Oblivious Decision Ensembles (NODE), a new deep learning architecture, designed to work with any tabular data. In a nutshell, the proposed NODE architecture generalizes ensembles of oblivious decision trees, but benefits from both end-to-end gradient-based optimization and the power of multi-layer hierarchical representation learning. With an extensive experimental comparison to the leading GBDT packages on a large number of tabular datasets, we demonstrate the advantage of the proposed NODE architecture, which outperforms the competitors on most of the tasks. We open-source the PyTorch implementation of NODE and believe that it will become a universal framework for machine learning on tabular data.
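
The abstract describes NODE as a differentiable generalization of oblivious decision tree ensembles trained end-to-end by gradient descent. The PyTorch snippet below is a minimal sketch of that idea, not the authors' implementation: it uses softmax and sigmoid relaxations where the released code relies on entmax-style transformations, and all names (SoftObliviousTreeLayer, num_trees, depth, temperature) are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftObliviousTreeLayer(nn.Module):
    """Sketch of a differentiable ensemble of oblivious (symmetric) decision trees."""
    def __init__(self, in_features, num_trees=8, depth=4, temperature=1.0):
        super().__init__()
        self.depth, self.temperature = depth, temperature
        # One feature-selection logit vector and one threshold per tree per depth level.
        self.feature_logits = nn.Parameter(torch.randn(num_trees, depth, in_features))
        self.thresholds = nn.Parameter(torch.zeros(num_trees, depth))
        # A scalar response for each of the 2^depth leaves of every tree.
        self.leaf_values = nn.Parameter(torch.randn(num_trees, 2 ** depth))

    def forward(self, x):  # x: (batch, in_features)
        # Soft feature selection at every depth level (softmax as a stand-in for entmax).
        weights = F.softmax(self.feature_logits, dim=-1)           # (trees, depth, features)
        selected = torch.einsum("bf,tdf->btd", x, weights)         # (batch, trees, depth)
        # Soft split decision: probability of taking the "right" branch at each level.
        right = torch.sigmoid((selected - self.thresholds) / self.temperature)
        left = 1.0 - right
        # Probability of reaching each leaf = product of per-level branch probabilities.
        leaf_probs = torch.ones(x.shape[0], weights.shape[0], 1, device=x.device)
        for d in range(self.depth):
            step = torch.stack([left[..., d], right[..., d]], dim=-1)      # (batch, trees, 2)
            leaf_probs = (leaf_probs.unsqueeze(-1) * step.unsqueeze(-2)).flatten(-2)
        # Ensemble output: average over trees of the expected leaf responses.
        return torch.einsum("btl,tl->b", leaf_probs, self.leaf_values) / weights.shape[0]

# Usage: one layer acting as a small ensemble with a single regression-style output.
layer = SoftObliviousTreeLayer(in_features=10)
out = layer(torch.randn(32, 10))   # shape: (32,)

Because every operation above is differentiable, such a layer can be trained with standard backpropagation; stacking several layers (as the abstract's "multi-layer hierarchical representation learning" suggests) would then let later trees consume the outputs of earlier ones.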
