Talk information

Date: Sunday, May 4, 2025
Time: 10:10–11:00
Place: Schreiber 309
Speaker: Tom Waknine (Technion)
Title: On Reductions of Learning Problems and the Borsuk-Ulam Theorem


Abstract:

Many practical prediction algorithms represent inputs in Euclidean space and replace the discrete $0/1$ classification loss with a real-valued surrogate loss, effectively reducing classification tasks to stochastic optimization. In this talk, I will explore the expressiveness of such reductions in terms of key resources. I will formally define a general notion of reductions between learning problems, and relate it to some well known examples such as representations by half-spaces. I will then establish bounds on the minimum Euclidean dimension $D$ required to reduce a concept class with VC dimension $d$ to a Stochastic Convex Optimization (SCO) problem in a $D$-dimensional Euclidean space. This result provides a formal framework for understanding the intuitive interpretation of the VC dimension as the number of parameters needed to learn a given class. The proof leverages a clever application of the Borsuk-Ulam theorem, illustrating an interesting connection between topology and learning theory.