NIPS Proceedings

Ohad Shamir

17 Papers

  • On the Power and Limitations of Random Features for Understanding Neural Networks (2019)
  • Are ResNets Provably Better than Linear Predictors? (2018)
  • Global Non-convex Optimization with Discretized Diffusions (2018)
  • Dimension-Free Iteration Complexity of Finite Sum Optimization Problems (2016)
  • Without-Replacement Sampling for Stochastic Gradient Methods (2016)
  • Communication Complexity of Distributed Convex Learning and Optimization (2015)
  • Fundamental Limits of Online and Distributed Algorithms for Statistical Learning and Estimation (2014)
  • On the Computational Efficiency of Training Neural Networks (2014)
  • Online Learning with Switching Costs and Other Adaptive Adversaries (2013)
  • Relax and Randomize: From Value to Algorithms (2012)
  • Better Mini-Batch Algorithms via Accelerated Gradient Methods (2011)
  • Efficient Learning of Generalized Linear and Single Index Models with Isotonic Regression (2011)
  • Efficient Online Learning via Randomized Rounding (2011)
  • From Bandits to Experts: On the Value of Side-Observations (2011)
  • Learning with the weighted trace-norm under arbitrary sampling distributions (2011)
  • On the Reliability of Clustering Stability in the Large Sample Regime (2008)
  • Cluster Stability for Finite Samples (2007)