Part of Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Paul Liu, Aviad Rubinstein, Jan Vondrak, Junyao Zhao
We consider the problem of maximizing submodular functions in the single-pass streaming and secretary-with-shortlists models, both with random arrival order. For cardinality-constrained monotone functions, Agrawal, Shadravan, and Stein [SMC19] gave a single-pass (1−1/e−ε)-approximation algorithm using only linear memory, but the exponential dependence on ε makes it impractical even for ε=0.1. We simplify both the algorithm and the analysis, obtaining an exponential improvement in the ε-dependence (in particular, O(k/ε) memory). Extending these techniques, we also give a simple (1/e−ε)-approximation for non-monotone functions in O(k/ε) memory. For the monotone case, we also give a corresponding unconditional hardness barrier of 1−1/e+ε for single-pass algorithms in randomly ordered streams, even assuming unlimited computation. Finally, we show that our algorithms are simple to implement and work well on real-world datasets.
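As a minimal illustration of the streaming setting described above (not the algorithm from this paper), the sketch below shows a generic threshold-based single-pass routine for cardinality-constrained monotone submodular maximization. The names `threshold_stream` and `coverage`, the fixed threshold `tau`, and the toy coverage instance are assumptions made purely for illustration.

```python
# Illustrative sketch only: a generic threshold-based single-pass streaming
# routine for monotone submodular maximization under a cardinality constraint k.
# This is NOT the algorithm from the paper; it only illustrates the setting.
# `f` is assumed to be a monotone submodular set function given as a value oracle.

def threshold_stream(stream, f, k, tau):
    """Keep an arriving element if its marginal gain is at least tau / k
    (hypothetical fixed-threshold variant; practical algorithms guess tau)."""
    S = []
    for e in stream:
        if len(S) >= k:
            break
        # marginal gain of e with respect to the current solution S
        gain = f(S + [e]) - f(S)
        if gain >= tau / k:
            S.append(e)
    return S


if __name__ == "__main__":
    # Toy coverage function over sets of integers (monotone and submodular).
    universe_sets = {
        "a": {1, 2, 3},
        "b": {3, 4},
        "c": {5},
        "d": {1, 5, 6},
    }

    def coverage(S):
        covered = set()
        for name in S:
            covered |= universe_sets[name]
        return len(covered)

    # Elements arrive one by one; in the random-order model the stream is a
    # uniformly random permutation of the ground set.
    solution = threshold_stream(["a", "b", "c", "d"], coverage, k=2, tau=6.0)
    print(solution, coverage(solution))
```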