NeurIPS 2020
### Smoothly Bounding User Contributions in Differential Privacy

### Meta Review

The paper considers a setting in which there are multiple users, each contributing a possibly different amount of data. The goal is to perform a certain task while ensuring user-level differential privacy. The paper proposes a method that reweights the data so that all of it is used while the sensitivity is reduced, thereby improving the accuracy-privacy tradeoff. The paper addresses the specific problems of mean estimation, quantile estimation, and logistic regression, and also discusses a general ERM framework.
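To make the reweighting idea concrete, here is a minimal illustrative sketch for user-level private mean estimation. It uses the simple capping baseline that the paper's smooth weighting improves upon, not the paper's actual scheme; the function and parameter names (`private_weighted_mean`, `cap`) are invented for illustration, and the total weight `W` is treated as public for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_weighted_mean(user_data, cap, eps, lo=0.0, hi=1.0):
    """Estimate the mean of all samples with (roughly) user-level eps-DP.

    Illustrative sketch only: each of user i's m_i samples gets weight
    min(cap, m_i) / m_i, so every user contributes at most `cap` effective
    samples, which bounds the user-level sensitivity of the weighted sum.
    """
    weights, values = [], []
    for samples in user_data:
        m = len(samples)
        w = min(cap, m) / m          # per-sample weight for this user
        weights += [w] * m
        values += [float(np.clip(x, lo, hi)) for x in samples]
    weights = np.array(weights)
    values = np.array(values)
    W = weights.sum()                # total effective sample count
    # One user moves the weighted sum by at most cap*(hi - lo), so
    # (treating W as public) the weighted mean has sensitivity
    # cap*(hi - lo)/W; calibrate Laplace noise accordingly.
    sensitivity = cap * (hi - lo) / W
    noise = rng.laplace(scale=sensitivity / eps)
    return float(values @ weights / W + noise)
```

With a hard cap, a user holding many samples is down-weighted to at most `cap` effective samples; the paper's contribution is to choose these weights smoothly rather than by truncation.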
The problem proposed is of interest to the community. The results are exciting in parts, particularly since the paper offers a new way of addressing this non-uniform-samples problem, compared to past approaches that simply bound the number of contributions from any user. The review team thus agrees that this paper should be accepted.
Various concerns were also raised regarding the computation of the weights, as well as which problems the proposed mechanism can actually solve. For instance, how can one compute the weights c's in Section 4? What about A (a point reviewer 4 raised that was not addressed in the rebuttal)? More generally, what problems can the proposed mechanism solve?
Please make these points very clear in the camera-ready version, clearly demarcating what the paper actually solves and what it does not.