This is a talk given by Prof Ivana Ljubic as part of the Online Seminar Series Machine Learning NeEDS Mathematical Optimization, https://congreso.us.es/mlneedsmo/
Many important applications in machine learning involve submodular maximization. Finding a subset of data points on which to train a neural network, determining which data points to label when training a classifier, identifying the most influential nodes in a social network, and generating a representative set of diverse recommendations are all examples of problems that can be tackled with submodular-maximization techniques.
In this talk we focus on a family of discrete optimization problems asking for the maximization of the expected value of a concave, strictly increasing, and differentiable function composed with a set-union operator. The expected value is computed with respect to a set of coefficients taking values from a discrete set of scenarios. The function models the utility function of the decision maker, while the set-union operator models a covering relationship between two ground sets: a set of items and a set of metaitems. The goal is to find a subset of metaitems (satisfying certain budget constraints) that maximizes the total utility of the items it covers. The problem has applications to maximal covering location and influence maximization problems, among others.
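To make the problem structure concrete, the following is a minimal toy sketch (all data is hypothetical, not from the talk): metaitems cover items, item coefficients vary by scenario, the utility is a concave increasing function (here the square root), and the objective is the expected utility of the covered items. Because the objective is monotone and submodular in the chosen metaitem set, the standard (1 - 1/e)-greedy heuristic applies under a cardinality budget.

```python
import math

# Hypothetical toy instance: which items each metaitem covers,
# scenario-dependent item coefficients, and scenario probabilities.
cover = {"m1": {1, 2}, "m2": {2, 3}, "m3": {3, 4}}
weights = [{1: 1.0, 2: 2.0, 3: 1.0, 4: 3.0},   # scenario 0
           {1: 2.0, 2: 1.0, 3: 2.0, 4: 1.0}]   # scenario 1
prob = [0.5, 0.5]
f = math.sqrt  # concave, strictly increasing utility (illustrative choice)

def objective(S):
    """Expected utility of the items covered by the metaitem set S."""
    covered = set().union(*(cover[m] for m in S)) if S else set()
    return sum(p * f(sum(w[i] for i in covered))
               for p, w in zip(prob, weights))

def greedy(budget):
    """Classic greedy for monotone submodular maximization: repeatedly
    add the metaitem with the largest marginal gain, up to the budget."""
    S = set()
    for _ in range(budget):
        best = max((m for m in cover if m not in S),
                   key=lambda m: objective(S | {m}) - objective(S),
                   default=None)
        if best is None:
            break
        S.add(best)
    return S
```

The greedy heuristic only guarantees a (1 - 1/e)-approximation; the talk is about solving such problems to optimality via mixed-integer programming instead.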
The problem can be modeled as a mixed-integer nonlinear program (MINLP) involving binary decision variables associated with the items and the metaitems. We propose a double-hypograph decomposition that allows for projecting out the variables associated with the items by separately exploiting the structural properties of the utility function and of the set-union operator. Thanks to this decomposition, the utility function is linearized via an exact outer-approximation technique, whereas the set-union operator is linearized in two ways: either (i) via a reformulation based on submodular cuts, or (ii) via a Benders decomposition. We analyze from a theoretical perspective the strength of the inequalities of the resulting reformulations, and embed them into two branch-and-cut algorithms. The computational results reveal that, on our testbed of instances, the method combining the outer approximation with Benders cuts significantly outperforms the other alternatives.
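The outer-approximation idea mentioned above can be illustrated in isolation: for a concave function f, every tangent line lies above the graph, so the pointwise minimum of tangent cuts at sampled points gives an exact upper envelope at those points and an overestimate elsewhere. The sketch below uses f(x) = log(1 + x) purely as an illustrative concave utility, not the function from the talk.

```python
import math

# Tangent (gradient) cuts for a concave f: for any anchor x0,
#   f(x) <= f(x0) + f'(x0) * (x - x0)   for all x,
# so min over a set of anchors is a valid upper bound on f.
f  = lambda x: math.log1p(x)        # illustrative concave utility
df = lambda x: 1.0 / (1.0 + x)      # its derivative

cut_points = [0.0, 1.0, 3.0, 7.0]   # anchors where cuts were generated

def outer_approx(x):
    """Tightest upper bound on f(x) given the current tangent cuts."""
    return min(f(x0) + df(x0) * (x - x0) for x0 in cut_points)
```

In a branch-and-cut scheme such cuts are generated on the fly (e.g. as lazy constraints) at the points visited by the solver, so the linearization is exact at every candidate solution.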
Link to the talk: https://eu.bbcollab.com/guest/3953e5e4f39546019de1969c63656eda
The organizers of the Online Seminar Series Machine Learning NeEDS Mathematical Optimization
Emilio Carrizosa, IMUS-Instituto de Matemáticas de la Universidad de Sevilla
Thomas Halskov, Copenhagen Business School
Kseniia Kurishchenko, Copenhagen Business School
Cristina Molero-Río, IMUS-Instituto de Matemáticas de la Universidad de Sevilla
Jasone Ramírez-Ayerbe, IMUS-Instituto de Matemáticas de la Universidad de Sevilla
Dolores Romero Morales, Copenhagen Business School