Binary Stochastic Representations for Large Multi-class Classification

Thomas Gerald, Nicolas Baskiotis, Ludovic Denoyer

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Classification with a large number of classes is a key problem in machine learning and corresponds to many real-world applications, such as tagging images or textual documents in social networks. While one-vs-all methods usually reach top performance in this context, they suffer from a high inference complexity, linear in the number of categories. Different models based on the notion of binary codes have been proposed to overcome this limitation, achieving sublinear inference complexity. However, these approaches must decide a priori, before learning, which binary code to associate with which category, using more or less complex heuristics. We propose a new end-to-end model that simultaneously learns to associate binary codes with categories and to map inputs to binary codes. This approach, called Deep Stochastic Neural Codes (DSNC), keeps the sublinear inference complexity but does not require any a priori tuning. Experimental results on different datasets show the effectiveness of the approach w.r.t. baseline methods.
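To make the idea of classifying via learned binary codes concrete, the sketch below shows one possible way to set up such a model. It is not the authors' DSNC implementation: the module names, the straight-through estimator for the stochastic binary units, and the soft Hamming-distance scoring are all assumptions made for illustration. Inference is shown as a simple nearest-code scan; a sublinear variant would index the binarized class codes in a Hamming-space structure instead.

```python
# Hypothetical sketch of a binary-code classifier in the spirit of DSNC.
# Names (BinaryCodeClassifier, code_bits, ...) and the straight-through
# estimator are assumptions, not the paper's actual architecture.
import torch
import torch.nn as nn


class StraightThroughBernoulli(torch.autograd.Function):
    """Sample binary bits from probabilities; pass gradients through unchanged."""
    @staticmethod
    def forward(ctx, probs):
        return torch.bernoulli(probs)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output


class BinaryCodeClassifier(nn.Module):
    def __init__(self, input_dim, code_bits, num_classes):
        super().__init__()
        # Encoder maps an input to per-bit probabilities of a stochastic binary code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256),
            nn.ReLU(),
            nn.Linear(256, code_bits),
            nn.Sigmoid(),
        )
        # One learnable (relaxed) binary code per category, trained jointly with the encoder.
        self.class_code_logits = nn.Parameter(torch.randn(num_classes, code_bits))

    def encode(self, x):
        probs = self.encoder(x)
        return StraightThroughBernoulli.apply(probs)  # {0,1} bits with a differentiable surrogate

    def forward(self, x):
        codes = self.encode(x)                                # (batch, code_bits)
        class_codes = torch.sigmoid(self.class_code_logits)   # relaxed class codes in [0, 1]
        # Score each class by negative soft Hamming (L1) distance between codes.
        dist = torch.cdist(codes, class_codes, p=1)           # (batch, num_classes)
        return -dist                                          # higher score = closer code

    @torch.no_grad()
    def predict(self, x):
        # Linear scan over class codes, kept simple here; a Hamming-space index
        # over the binarized class codes would make this step sublinear.
        codes = (self.encoder(x) > 0.5).float()
        class_codes = (torch.sigmoid(self.class_code_logits) > 0.5).float()
        return torch.cdist(codes, class_codes, p=1).argmin(dim=1)


# Usage sketch: cross-entropy over the code-distance scores trains the encoder
# and the class codes end-to-end.
model = BinaryCodeClassifier(input_dim=128, code_bits=32, num_classes=1000)
x, y = torch.randn(16, 128), torch.randint(0, 1000, (16,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
```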
Original language: English (US)
Title of host publication: Lecture Notes in Computer Science
Publisher: Springer Nature
Pages: 155-165
Number of pages: 11
ISBN (Print): 9783319700861
DOIs
State: Published - Oct 24 2017
Externally published: Yes

Bibliographical note

KAUST Repository Item: Exported on 2020-10-01
Acknowledged KAUST grant number(s): OSR-2015-CRG4-2639
Acknowledgements: This publication is based upon work supported by the King Abdullah University of Science and Technology (KAUST) Office of Sponsored Research (OSR) under Award No. OSR-2015-CRG4-2639.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.
