Preconditioned Gradient Descent for Sketched Mixture Learning
Abstract
Sketching consists of reducing the dimensionality of data samples by retaining a small number of their moments. In this paper, a Preconditioned Gradient Descent (PGD) algorithm is proposed to estimate the parameters of mixture models (MMs) in arbitrary dimension by minimizing the non-convex quadratic loss between the sketch and the characteristic function of an MM with varying parameters. Preconditioning is introduced to dynamically adapt the descent direction to the local landscape of the objective function, accelerating convergence with no per-iteration computational overhead compared to vanilla GD. An analysis of the linear convergence rate of PGD is conducted, and numerical simulations showcase the method's effectiveness, particularly when the class weights are unbalanced or when a substantial number of data samples is available.
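As a rough illustration of the pipeline the abstract describes, the sketch below compresses a dataset into its empirical characteristic function at random frequencies, then fits the means of a Gaussian mixture by preconditioned gradient descent on the quadratic sketch-matching loss. All concrete choices here (unit-covariance components, Gaussian frequency distribution, fixed mixture weights, a simple inverse-weight diagonal preconditioner) are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, m, n = 2, 3, 50, 10_000   # dimension, components, sketch size, samples

# Synthetic data from a ground-truth Gaussian mixture (unit covariance).
true_means = 3.0 * rng.normal(size=(k, d))
X = true_means[rng.integers(k, size=n)] + rng.normal(size=(n, d))

# Sketch: empirical characteristic function at m random frequencies.
W = 0.3 * rng.normal(size=(m, d))                 # frequency scale is a guess
z = np.exp(1j * X @ W.T).mean(axis=0)             # z_j = (1/n) sum_i e^{i<w_j, x_i>}
envelope = np.exp(-0.5 * np.sum(W**2, axis=1))    # CF of N(0, I) at each w_j

def residual(means, weights):
    """A(theta) - z, where A(theta) is the mixture's characteristic function."""
    phase = np.exp(1j * means @ W.T)              # (k, m)
    return (weights[:, None] * phase).sum(axis=0) * envelope - z

def grad_means(means, weights):
    """Gradient of (1/m) * ||A(theta) - z||^2 with respect to the means."""
    phase = np.exp(1j * means @ W.T)
    r = (weights[:, None] * phase).sum(axis=0) * envelope - z
    # d r_j / d mu_c = i * w_j * weights_c * phase[c, j] * envelope[j]
    g = 1j * ((weights[:, None] * phase) * (envelope * r.conj())) @ W
    return (2.0 / m) * np.real(g)                 # (k, d)

weights = np.full(k, 1.0 / k)                     # weights held fixed here
means = 3.0 * rng.normal(size=(k, d))             # random initialization
step = 0.1
for _ in range(500):
    # Preconditioning (illustrative): rescale each component's gradient block
    # by the inverse of its weight, so small classes take proportionally
    # larger steps; this costs nothing extra per iteration over vanilla GD.
    means -= step * grad_means(means, weights) / weights[:, None]

print("final sketch loss:", np.mean(np.abs(residual(means, weights)) ** 2))
```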