NIPS Proceedings

Tengyu Ma

10 Papers

  • Data-dependent Sample Complexity of Deep Neural Networks via Lipschitz Augmentation (2019)
  • Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss (2019)
  • Regularization Matters: Generalization and Optimization of Neural Nets v.s. their Induced Kernel (2019)
  • Towards Explaining the Regularization Effect of Initial Large Learning Rate in Training Neural Networks (2019)
  • Verified Uncertainty Calibration (2019)
  • On the Optimization Landscape of Tensor Decompositions (2017)
  • A Non-generative Framework and Convex Relaxations for Unsupervised Learning (2016)
  • Matrix Completion has No Spurious Local Minimum (2016)
  • Sum-of-Squares Lower Bounds for Sparse PCA (2015)
  • On Communication Cost of Distributed Statistical Estimation and Dimensionality (2014)