Cloud Native Applications Profiling using a Graph Neural Networks Approach

Amine Boukhtouta, Taous Madi, Makan Pourzandi, Hyame A. Alameddine

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



The convergence of telecommunication and industry operational networks towards cloud native applications has motivated the integration of protection layers to harden the security posture and management of cloud native deployments. In this paper, we propose a data-driven approach to support the detection of anomalies in cloud native applications based on a graph neural network. The essence of the profiling lies in capturing interactions between different perspectives in cloud native applications through a network dependency graph and transforming it into a computational graph neural network. The latter is used to profile deployed assets such as micro-service types, workloads' namespaces, worker machines, management and orchestration machines, as well as clusters. As a first phase of the profiling, we consider fine-grained profiling of micro-service types with an emphasis on network traffic indicators, collected on distributed Kubernetes (K8S) deployment premises. Experimental results show a good trade-off between accuracy and recall for micro-service type profiling (around 96%). In addition, we use the entropy of prediction scores to infer anomalies in testing data. These entropy scores allow us to segregate benign from anomalous graphs, identifying 19 out of 23 anomalies. Moreover, entropy scores enable a root cause analysis to infer problematic micro-services.
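The abstract does not give the exact scoring formula, but the idea of turning classifier predictions into anomaly scores can be illustrated with a minimal sketch. Assuming the GNN outputs per-node class probabilities (e.g. softmax over micro-service types), one plausible reading is Shannon entropy per node, averaged per graph, with high-entropy graphs flagged as anomalous; the function names and the threshold are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def prediction_entropy(probs, eps=1e-12):
    """Shannon entropy (bits) of each row of class probabilities.

    probs: array of shape (num_nodes, num_classes), rows sum to 1.
    """
    p = np.clip(probs, eps, 1.0)  # avoid log(0)
    return -np.sum(p * np.log2(p), axis=-1)

def flag_anomalous_graphs(graph_node_probs, threshold):
    """Score each graph by its mean per-node prediction entropy.

    graph_node_probs: list of (num_nodes, num_classes) arrays, one per graph.
    threshold: illustrative cut-off (bits); graphs above it are flagged.
    Returns (scores, flags).
    """
    scores = np.array([prediction_entropy(p).mean() for p in graph_node_probs])
    return scores, scores > threshold
```

Confident predictions (probability mass on one micro-service type) yield near-zero entropy, while a uniform distribution over `k` classes yields the maximum `log2(k)` bits; per-node entropies could likewise point at the most uncertain, hence problematic, micro-services for root cause analysis.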
Original language: English (US)
Title of host publication: 2022 IEEE Future Networks World Forum (FNWF)
State: Published - Mar 8, 2023

Bibliographical note

KAUST Repository Item: Exported on 2023-03-13
