Learning taxonomy adaptation in large-scale classification

Rohit Babbar, Ioannis Partalas, Eric Gaussier, Massih Reza Amini, Cécile Amblard

Research output: Contribution to journal › Article › peer-review



In this paper, we study flat and hierarchical classification strategies in the context of large-scale taxonomies. Addressing the problem from a learning-theoretic point of view, we first propose a multi-class, hierarchical, data-dependent bound on the generalization error of classifiers deployed in large-scale taxonomies. This bound provides an explanation for several empirical results reported in the literature concerning the performance of flat and hierarchical classifiers. Based on this bound, we also propose a technique for modifying a given taxonomy through pruning that leads to a lower value of the upper bound than the original taxonomy. We then present another method for hierarchy pruning by studying the approximation error of a family of classifiers, and derive from it features used in a meta-classifier to decide which nodes to prune. We finally illustrate the theoretical developments through several experiments conducted on two widely used taxonomies.
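The pruning operation mentioned in the abstract can be illustrated with a minimal sketch (this is not the authors' code; the taxonomy representation and the `prune_node` helper are illustrative assumptions): pruning an internal node reattaches its children to the node's parent, flattening that part of the hierarchy.

```python
# Illustrative sketch of taxonomy pruning (not the paper's implementation).
# A taxonomy is a dict mapping each internal node to its list of children;
# the root is any node that never appears as a child.

def prune_node(children, node):
    """Remove `node` from the taxonomy, promoting its children to its parent."""
    pruned = {p: list(cs) for p, cs in children.items()}  # shallow copy
    promoted = pruned.pop(node, [])                        # children to promote
    for parent, cs in pruned.items():
        if node in cs:
            # Replace the pruned node by its children in the parent's list.
            idx = cs.index(node)
            pruned[parent] = cs[:idx] + promoted + cs[idx + 1:]
    return pruned

# Example: pruning "B" promotes its leaves directly under the root,
# making the classifier at the root "flatter" for that subtree.
taxonomy = {"root": ["A", "B"], "A": ["a1", "a2"], "B": ["b1", "b2"]}
flattened = prune_node(taxonomy, "B")
# flattened == {"root": ["A", "b1", "b2"], "A": ["a1", "a2"]}
```

In the paper's setting, whether such a node should be pruned is decided by a meta-classifier trained on features derived from the generalization and approximation error analyses; the sketch above only shows the structural effect of a pruning decision.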

Original language: English
Pages (from-to): 1-37
Number of pages: 37
Journal: Journal of Machine Learning Research
Publication status: Published - 1 Feb 2016


Keywords

  • Hierarchical classification
  • Large-scale classification
  • Meta-learning
  • Rademacher complexity
  • Taxonomy adaptation

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence


