On flat versus hierarchical classification in large-scale taxonomies

Rohit Babbar, Ioannis Partalas, Eric Gaussier, Massih-Reza Amini

Research output: Contribution to journal › Conference article › peer-review

58 Citations (SciVal)

Abstract

We study in this paper flat and hierarchical classification strategies in the context of large-scale taxonomies. To this end, we first propose a multiclass, hierarchical, data-dependent bound on the generalization error of classifiers deployed in large-scale taxonomies. This bound provides an explanation for several empirical results reported in the literature on the performance of flat and hierarchical classifiers. We then introduce another type of bound targeting the approximation error of a family of classifiers, and derive from it features used in a meta-classifier to decide which nodes to prune (or flatten) in a large-scale taxonomy. We finally illustrate the theoretical developments through several experiments conducted on two widely used taxonomies.
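
To make the pruning idea in the abstract concrete, here is a minimal, illustrative Python sketch (not taken from the paper): it flattens an internal node of a class taxonomy by promoting that node's children to its parent, with a placeholder rule standing in for the meta-classifier's decision. The TaxNode structure, the should_flatten heuristic, and the toy labels are all assumptions for illustration, not the paper's actual features or classifier.

class TaxNode:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

    def is_leaf(self):
        return not self.children


def flatten_node(parent, node):
    """Remove `node` from the hierarchy, attaching its children directly to `parent`."""
    parent.children.remove(node)
    parent.children.extend(node.children)


def should_flatten(node):
    # Placeholder decision rule: in the paper this choice is made by a
    # meta-classifier trained on features derived from an approximation-error bound.
    return 0 < len(node.children) <= 2


def prune_taxonomy(root):
    """Recursively flatten internal nodes flagged by the decision rule."""
    for child in list(root.children):
        prune_taxonomy(child)
        if not child.is_leaf() and should_flatten(child):
            flatten_node(root, child)
    return root


# Toy taxonomy: root -> (A -> a1, a2), (B -> b1, b2, b3)
root = TaxNode("root", [
    TaxNode("A", [TaxNode("a1"), TaxNode("a2")]),
    TaxNode("B", [TaxNode("b1"), TaxNode("b2"), TaxNode("b3")]),
])
prune_taxonomy(root)
print([c.label for c in root.children])  # node A is flattened: ['B', 'a1', 'a2']

Flattening a node in this way trades a deeper hierarchy (more, easier local decisions) for a wider one (fewer, harder decisions); the paper's bounds characterize when each side of that trade-off wins.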

Original language: English
Pages (from-to): 1-9
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Publication status: Published - 5 Dec 2013
Event: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013 - Lake Tahoe, NV, United States
Duration: 5 Dec 2013 – 10 Dec 2013

Bibliographical note

Funding information:
This work was supported in part by the ANR project Class-Y, the Mastodons project Garguantua, the LabEx PERSYVAL-Lab ANR-11-LABX-0025 and the European project BioASQ (grant agreement no. 318652).

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
