Conference Papers, Year: 2007

AdaBoost Parallelization on PC Clusters with Virtual Shared Memory for Fast Feature Selection

Virginie Galtier
Olivier Pietquin
Stéphane Vialle

Abstract

Feature selection is a key issue in many machine learning applications: large numbers of candidate features must be tested, and the computational time required to do so is often huge. In this paper, we introduce a parallel version of the well-known AdaBoost algorithm to speed up and size up feature selection for binary classification tasks using large training datasets and a wide range of elementary features. The parallelization requires no modification to the AdaBoost algorithm itself and is designed for PC clusters using Java and the JavaSpace distributed framework. JavaSpace is a memory-sharing paradigm implemented on top of a virtual shared memory that proves both efficient and easy to use. Results and performance of a face detection system trained with the proposed parallel AdaBoost are presented.
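
The abstract describes the parallelization only at a high level. As a rough illustration of how one boosting round could be mapped onto a JavaSpace, the sketch below uses the standard JavaSpaces API (net.jini.space.JavaSpace) in a master/worker pattern: the master writes one task entry per candidate feature, workers take tasks, evaluate the corresponding weak classifier on the weighted training set, and write their errors back, and the master keeps the best result. The entry classes (FeatureTask, FeatureResult), the per-feature task granularity, and the decision-stump-style weak learner are illustrative assumptions, not the authors' actual design.

import net.jini.core.entry.Entry;
import net.jini.core.lease.Lease;
import net.jini.space.JavaSpace;

// Hypothetical task entry: one candidate feature to evaluate in the current
// boosting round. JavaSpace entries need public fields and a no-arg constructor.
class FeatureTask implements Entry {
    public Integer round;
    public Integer featureId;
    public FeatureTask() {}
    public FeatureTask(Integer round, Integer featureId) {
        this.round = round;
        this.featureId = featureId;
    }
}

// Hypothetical result entry written back by a worker.
class FeatureResult implements Entry {
    public Integer round;
    public Integer featureId;
    public Double weightedError;   // weighted error of the weak classifier for featureId
    public FeatureResult() {}
    public FeatureResult(Integer round) { this.round = round; }
}

class BoostingMaster {
    // One boosting round: farm out every candidate feature to the space,
    // then collect the results and keep the one with the lowest weighted error.
    static FeatureResult selectBestFeature(JavaSpace space, int round, int numFeatures)
            throws Exception {
        for (int f = 0; f < numFeatures; f++) {
            space.write(new FeatureTask(round, f), null, Lease.FOREVER);
        }
        FeatureResult best = null;
        for (int f = 0; f < numFeatures; f++) {
            // Template with only 'round' set: null fields act as wildcards.
            FeatureResult r =
                (FeatureResult) space.take(new FeatureResult(round), null, Long.MAX_VALUE);
            if (best == null || r.weightedError < best.weightedError) {
                best = r;
            }
        }
        return best;
    }
}

class BoostingWorker {
    // Worker loop: take a task, evaluate the corresponding weak classifier on the
    // (locally held) weighted training set, and write the error back to the space.
    static void run(JavaSpace space) throws Exception {
        while (true) {
            FeatureTask t = (FeatureTask) space.take(new FeatureTask(), null, Long.MAX_VALUE);
            FeatureResult r = new FeatureResult(t.round);
            r.featureId = t.featureId;
            r.weightedError = evaluateFeature(t.featureId, t.round);
            space.write(r, null, Lease.FOREVER);
        }
    }

    // Stand-in for training a decision stump on the weighted examples;
    // the real weak learner and data layout are not specified here.
    static double evaluateFeature(int featureId, int round) {
        return 0.5;
    }
}

In such a tuple-space pattern, load balancing is implicit: idle workers simply take the next available task, so heterogeneous cluster nodes stay busy without any explicit scheduling by the master.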
Main file: Supelec334.pdf (309.05 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-00216041, version 1 (25-01-2008)

Identifiers

  • HAL Id: hal-00216041, version 1

Cite

Virginie Galtier, Olivier Pietquin, Stéphane Vialle. AdaBoost Parallelization on PC Clusters with Virtual Shared Memory for Fast Feature Selection. IEEE International Conference on Signal Processing and Communication, Nov 2007, Dubai, United Arab Emirates. pp. 165-168. ⟨hal-00216041⟩
