Parallel Algorithm for Incremental Betweenness Centrality on Large Graphs

Fuad Tarek Jamour, Spiros Skiadopoulos, Panos Kalnis

Research output: Contribution to journal › Article › peer-review

46 Scopus citations

Abstract

Betweenness centrality quantifies the importance of nodes in a graph in many applications, including network analysis, community detection and identification of influential users. Typically, graphs in such applications evolve over time. Thus, the computation of betweenness centrality should be performed incrementally. This is challenging because updating even a single edge may trigger the computation of all-pairs shortest paths in the entire graph. Existing approaches cannot scale to large graphs: they either require excessive memory (i.e., quadratic in the size of the input graph) or perform unnecessary computations, rendering them prohibitively slow. We propose iCentral, a novel incremental algorithm for computing betweenness centrality in evolving graphs. We decompose the graph into biconnected components and prove that processing can be localized within the affected components. iCentral is the first algorithm to support incremental betweenness centrality computation within a graph component. This is done efficiently, in linear space; consequently, iCentral scales to large graphs. We demonstrate with real datasets that the serial implementation of iCentral is up to 3.7 times faster than existing serial methods. Our parallel implementation scales to large graphs and is an order of magnitude faster than the state-of-the-art parallel algorithm, while using an order of magnitude less computational resources.
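As background for the shortest-path computation the abstract refers to, the sketch below shows Brandes' classic (non-incremental) betweenness algorithm for unweighted graphs, which runs in O(VE) time and linear space; iCentral's contribution is to restrict such recomputation to the affected biconnected component after an edge update. The adjacency-dict input format and function name here are illustrative assumptions, not the paper's actual implementation.

```python
from collections import deque, defaultdict

def brandes_betweenness(adj):
    """Exact betweenness centrality via Brandes' algorithm.

    adj: dict mapping each node to a list of neighbors (undirected graph,
    so each value for a node pair sums contributions from both directions).
    Returns a dict of unnormalized betweenness scores.
    """
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:
        # Forward phase: BFS from s, counting shortest paths.
        sigma = dict.fromkeys(adj, 0)    # number of shortest s-v paths
        dist = dict.fromkeys(adj, -1)    # BFS distance from s
        sigma[s], dist[s] = 1, 0
        preds = defaultdict(list)        # shortest-path predecessors
        order = []                       # nodes in non-decreasing distance
        q = deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if dist[w] < 0:          # first visit to w
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # Backward phase: accumulate pair dependencies in reverse order.
        delta = dict.fromkeys(adj, 0.0)
        while order:
            w = order.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc
```

On an undirected graph each node pair is counted once per direction, so scores come out doubled relative to the unordered-pair convention; halve them if that convention is needed. An incremental scheme in the spirit of the paper would rerun only the sources inside the biconnected component containing the updated edge, rather than every source in the graph.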
Original language: English (US)
Pages (from-to): 659-672
Number of pages: 14
Journal: IEEE Transactions on Parallel and Distributed Systems
Volume: 29
Issue number: 3
DOIs
State: Published - Oct 17 2017

Bibliographical note

KAUST Repository Item: Exported on 2020-10-01

