An Efficient Algorithm for Computing Entropic Measures of Feature Subsets
Conference paper, 2018


Abstract

Entropic measures such as conditional entropy or mutual information have been used numerous times in pattern mining, for instance to characterize valuable itemsets or approximate functional dependencies. Strangely enough, the fundamental problem of designing efficient algorithms to compute the entropy of subsets of features (or the mutual information of feature subsets relative to some target feature) has received little attention compared to the analogous problem of computing the frequency of itemsets. The present article proposes to fill this gap: it introduces a fast and scalable method that computes entropy and mutual information for a large number of feature subsets by adopting the divide-and-conquer strategy used by FP-growth, one of the most efficient frequent itemset mining algorithms. To illustrate its practical interest, the algorithm is then used to solve the recently introduced problem of mining reliable approximate functional dependencies. The article finally provides empirical evidence that, in the context of non-redundant pattern extraction, the proposed method outperforms existing algorithms in both speed and scalability.
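
For readers unfamiliar with the quantities involved, the sketch below is a naive baseline, not the FP-growth-style algorithm described in the paper: it computes the empirical entropy H(X_S) of a feature subset and its mutual information I(X_S; Y) with a target feature by counting joint value combinations. The dataset and column indices are illustrative assumptions.

from collections import Counter
from math import log2

def entropy(rows, features):
    """Empirical entropy H(X_S) of the feature subset given by column indices `features`."""
    counts = Counter(tuple(row[f] for f in features) for row in rows)
    n = sum(counts.values())
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(rows, features, target):
    """I(X_S; Y) = H(X_S) + H(Y) - H(X_S, Y), with Y given by column index `target`."""
    return (entropy(rows, features)
            + entropy(rows, [target])
            - entropy(rows, list(features) + [target]))

# Hypothetical example: three categorical features, target in column 3.
data = [
    ("a", 0, "x", 1),
    ("a", 1, "y", 0),
    ("b", 0, "x", 1),
    ("b", 1, "y", 0),
]
print(entropy(data, [0, 1]))             # H over features 0 and 1
print(mutual_information(data, [0], 3))  # I(feature 0; target)

This brute-force counting must be repeated from scratch for every candidate subset, which is exactly the redundancy the paper's divide-and-conquer approach is designed to avoid.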
Main file
article.pdf (457.98 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01897734, version 1 (17-10-2018)

Identifiers

  • HAL Id: hal-01897734, version 1

Cite

Frédéric Pennerath. An Efficient Algorithm for Computing Entropic Measures of Feature Subsets. ECML-PKDD 2018 - European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, Sep 2018, Dublin, Ireland. ⟨hal-01897734⟩
