Advances in Private and Nonparametric Learning

The field of machine learning is moving toward more private and nonparametric methods. Researchers are developing techniques that protect sensitive information while still permitting accurate modeling and analysis, including differentially private estimators, privacy wrappers for black-box functions, and purification methods that convert approximate differential privacy mechanisms into pure ones. There is also growing interest in nonparametric methods, such as nonparametric factor analysis, and in complexity measures, such as Rademacher complexity, that support generalization guarantees for rich model classes. Notable papers in this area include:

  • 'Nonparametric Factor Analysis and Beyond', which proposes a general framework for identifying latent variables in nonparametric noisy settings.
  • 'Privately Evaluating Untrusted Black-Box Functions', which introduces a novel setting for automated sensitivity detection and designs privacy wrappers with high accuracy and low query complexity.
  • 'Purifying Approximate Differential Privacy with Randomized Post-processing', which provides a systematic method for transforming approximate DP into pure DP while maintaining competitive accuracy and computational efficiency.
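To ground the idea of a pure-DP estimator mentioned above, the following is a minimal sketch of the standard Laplace mechanism for privately releasing a mean. This is a textbook construction, not the method of any of the papers listed here; the function name and parameters are illustrative.

```python
import numpy as np

def laplace_mechanism_mean(data, lower, upper, epsilon, rng=None):
    """Release a differentially private mean via the Laplace mechanism.

    Clamping each value to [lower, upper] bounds the sensitivity of the
    mean at (upper - lower) / n, so adding Laplace noise calibrated to
    that sensitivity yields pure epsilon-differential privacy.
    """
    rng = rng or np.random.default_rng()
    data = np.clip(np.asarray(data, dtype=float), lower, upper)
    n = len(data)
    sensitivity = (upper - lower) / n  # worst-case change from one record
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return data.mean() + noise
```

The clamping step is what makes the sensitivity analysis possible: without known bounds on the data, the mean has unbounded sensitivity, which is precisely the difficulty that automated sensitivity detection and privacy wrappers for untrusted black-box functions aim to address.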

Sources

Nonparametric Factor Analysis and Beyond

On Privately Estimating a Single Parameter

Privately Evaluating Untrusted Black-Box Functions

Lean Formalization of Generalization Error Bound by Rademacher Complexity

Approximating Opaque Top-k Queries

Reflex: Speeding Up SMPC Query Execution through Efficient and Flexible Intermediate Result Size Trimming

Purifying Approximate Differential Privacy with Randomized Post-processing

A Theoretical Framework for Distribution-Aware Dataset Search

Spend Your Budget Wisely: Towards an Intelligent Distribution of the Privacy Budget in Differentially Private Text Rewriting
