Conference paper, Year: 2024

Preconditioned Gradient Descent for Sketched Mixture Learning

Abstract

Sketching consists of reducing the dimensionality of data samples by retaining a small number of their moments. In this paper, a Preconditioned Gradient Descent (PGD) algorithm is proposed to estimate the parameters of mixture models (MMs) in arbitrary dimensions by minimizing the non-convex quadratic loss between the sketch and the characteristic function of an MM with varying parameters. Preconditioning is introduced to dynamically adapt the descent direction to the local landscape of the objective function, accelerating convergence with no computational overhead per iteration compared to vanilla GD. An analysis of the linear convergence rate of PGD is conducted, and numerical simulations showcase the method's effectiveness, particularly when the class weights are unbalanced or when a substantial number of data samples is available.
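To make the abstract's pipeline concrete, the following is a minimal Python sketch: the data are compressed into an empirical characteristic function evaluated at random frequencies, and the means of a Gaussian mixture are then recovered by gradient descent on the quadratic sketch-matching loss, with a per-component rescaling playing the role of the preconditioner. The specific modeling choices here (a Gaussian mixture with known weights and variance, Gaussian random frequencies, and a diagonal 1/w_k preconditioner) are illustrative assumptions, not the paper's exact construction.

```python
# A minimal, self-contained illustration of sketched mixture learning with a
# preconditioned gradient step. Everything below (Gaussian mixture with known
# weights and variance, Gaussian random frequencies, the 1/w_k diagonal
# preconditioner) is an illustrative assumption, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)

d, K, n, m = 2, 3, 10_000, 50          # dimension, components, samples, sketch size
w = np.array([0.7, 0.2, 0.1])          # unbalanced mixture weights (assumed known)
sigma = 0.3                            # shared isotropic std (assumed known)
mu_true = rng.normal(scale=3.0, size=(K, d))

# Draw data from the mixture, then sketch it: the empirical characteristic
# function evaluated at m random frequencies omega_l.
labels = rng.choice(K, size=n, p=w)
X = mu_true[labels] + sigma * rng.normal(size=(n, d))
Omega = rng.normal(size=(m, d))                        # frequencies omega_l
z = np.exp(1j * X @ Omega.T).mean(axis=0)              # sketch, shape (m,)

damp = np.exp(-0.5 * sigma**2 * (Omega**2).sum(axis=1))  # Gaussian CF envelope

def loss_and_grad(mu):
    """Quadratic sketch-matching loss and its gradient w.r.t. the means."""
    phases = np.exp(1j * (Omega @ mu.T))               # (m, K)
    Phi = (phases * w[None, :]).sum(axis=1) * damp     # model CF at the frequencies
    r = Phi - z                                        # residual against the sketch
    # d/dmu_k of sum_l |r_l|^2 = 2 w_k Re( sum_l i conj(r_l) damp_l phases_lk omega_l )
    C = 1j * (np.conj(r) * damp)[:, None] * phases     # (m, K)
    G = 2.0 * w[:, None] * np.real(C.T @ Omega)        # (K, d)
    return np.sum(np.abs(r) ** 2), G

# PGD: a vanilla gradient step whose k-th block is rescaled by 1/w_k, which
# equalizes the effective step size across heavy and light components.
mu = mu_true + 0.5 * rng.normal(size=(K, d))           # local method: start near truth
eta = 0.01
for _ in range(2000):
    loss, G = loss_and_grad(mu)
    mu -= eta * G / w[:, None]                         # preconditioned descent step

print(f"final loss: {loss:.2e}")
print("estimated means:\n", np.round(mu, 3))
```

With the 1/w_k rescaling, small-weight components take steps of a size comparable to dominant ones, which mirrors the abstract's observation that preconditioning helps most when the class weights are unbalanced.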
Main file
ISIT_2024.pdf (366.9 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04425748, version 1 (30-01-2024)
hal-04425748, version 2 (30-05-2024)

License

Copyright (All rights reserved)

Identifiers

  • HAL Id: hal-04425748, version 2

Cite

Joseph Gabet, Maxime Ferreira Da Costa. Preconditioned Gradient Descent for Sketched Mixture Learning. International Symposium on Information Theory, IEEE, Jul 2024, Athens, Greece. ⟨hal-04425748v2⟩
184 views
132 downloads
