Towards Private Learning on Decentralized Graphs With Local Differential Privacy

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) · Publication in refereed journal · Peer-reviewed

13 Scopus Citations


Original language: English
Pages (from-to): 2936-2946
Journal / Publication: IEEE Transactions on Information Forensics and Security
Online published: 11 Aug 2022
Publication status: Published - 2022


Many real-world networks are inherently decentralized. For example, in social networks, each user maintains a local view of the social graph, such as a list of friends and her profile. It is common to collect these local views and conduct graph learning tasks over them. However, learning over graphs can raise privacy concerns, as these local views often contain sensitive information. In this paper, we seek to ensure private graph learning on a decentralized network graph. Towards this objective, we propose Solitude, a new privacy-preserving learning framework based on graph neural networks (GNNs), with formal privacy guarantees based on edge local differential privacy. The crux of Solitude is a set of new, carefully designed mechanisms that calibrate the noise introduced into the decentralized graph collected from users. The principle behind the calibration is the intrinsic properties shared by many real-world graphs, such as sparsity. Unlike existing work on locally private GNNs, our new framework can simultaneously protect node feature privacy and edge privacy, and can be seamlessly integrated with any GNN, with privacy-utility guarantees. Extensive experiments on benchmark datasets show that Solitude retains the generalization capability of the learned GNN while preserving users' data privacy under given privacy budgets.
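To make the edge-LDP setting concrete: a standard local mechanism for edge privacy is to have each user apply randomized response to the bits of her local adjacency list before it leaves her device. The sketch below is illustrative only, not the paper's actual Solitude mechanism; the function names and the simple degree-debiasing step are assumptions. It also shows why calibration matters on sparse graphs, where flipped zero-bits dominate the raw perturbed counts.

```python
import math
import random

def perturb_adjacency_row(row, epsilon):
    """Apply randomized response to each adjacency bit (illustrative sketch).

    Each bit is reported truthfully with probability p = e^eps / (e^eps + 1)
    and flipped otherwise, which satisfies epsilon-edge-LDP per bit.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return [b if random.random() < p else 1 - b for b in row]

def debiased_degree(perturbed_row, epsilon):
    """Unbiased estimate of the true degree from a perturbed row.

    E[sum] = d*p + (n - d)*(1 - p), so d = (sum - n*(1 - p)) / (2p - 1).
    On a sparse graph, most of the raw sum comes from flipped zeros,
    which is why post-hoc calibration of the noisy graph is essential.
    """
    n = len(perturbed_row)
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return (sum(perturbed_row) - n * (1.0 - p)) / (2.0 * p - 1.0)
```

At small epsilon the reported row is close to uniform noise; the debiasing step recovers aggregate statistics (such as degrees) in expectation, while frameworks like Solitude additionally exploit structural priors such as sparsity to denoise the graph itself.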

Research Area(s)

  • Differential privacy, Privacy, Social networking (online), Task analysis, Calibration, Training, Privacy-preserving graph learning, graph neural networks, differential privacy, decentralized network graph