NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID: 2920
Title: Regularized Gradient Boosting


This paper proposes Rademacher complexity-based generalization bounds for Regularized Gradient Boosting, a framework that encompasses various accelerated gradient boosting methods. Although some work remains to make the algorithm derived from the theoretical study faster, the theoretical contribution deserves publication.