Part of Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Jessica Finocchiaro, Rafael M. Frongillo, Bo Waggoner
The convex consistency dimension of a supervised learning task is the lowest prediction dimension $d$ such that there exists a convex surrogate $L : \mathbb{R}^d \times \mathcal{Y} \to \mathbb{R}$ that is consistent for the given task. We present $d$-flats, a new tool based on property elicitation, for lower-bounding convex consistency dimension. This tool unifies approaches from a variety of domains, including continuous and discrete prediction problems. We use $d$-flats to obtain a new lower bound on the convex consistency dimension of risk measures, resolving an open question due to Frongillo and Kash (NeurIPS 2015). In discrete prediction settings, we show that the $d$-flats approach recovers and even tightens previous lower bounds using feasible subspace dimension.
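To make the opening definition concrete, the display below formalizes convex consistency dimension in standard surrogate-risk notation. This is a sketch only: the target loss $\ell$, the target report space $\mathcal{R}$, the link function $\psi$, and the name $\operatorname{cc\text{-}dim}$ are notational assumptions for illustration, not taken from this page.

% A sketch, assuming amsmath; \ell (target loss), \mathcal{R} (report space),
% \psi (link), and the name cc-dim are illustrative notation, not from this page.
\[
  \operatorname{cc\text{-}dim}(\ell)
  \;=\;
  \min\Bigl\{\, d \in \mathbb{N} \;:\;
    \exists\ \text{convex } L : \mathbb{R}^d \times \mathcal{Y} \to \mathbb{R}
    \ \text{and a link } \psi : \mathbb{R}^d \to \mathcal{R}
    \ \text{such that } (L, \psi) \text{ is consistent for } \ell
  \,\Bigr\}.
\]

Under this reading, a lower bound on $\operatorname{cc\text{-}dim}(\ell)$, such as those the paper derives via $d$-flats, rules out consistent convex surrogates in all prediction dimensions below the bound.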