Trustworthy AI for Robust and Reliable Predictive Models in Sensor Networks

Project: Research

Project Details

Description

Graph Neural Networks (GNNs) have demonstrated their exceptional ability to model graph-structured data across diverse domains, including healthcare, autonomous systems, and sensor networks. However, their application in high-stakes environments is hindered by the lack of reliable uncertainty quantification methods. Conformal Prediction (CP) offers a promising solution by providing assumption-free prediction intervals with rigorous probabilistic guarantees. Yet, existing CP methods often fail to account for the heteroscedasticity and complex topology inherent in graph data, leading to overly conservative prediction intervals.

This project proposes Residual Reweighted Conformal Prediction for Graph Neural Networks (RR-GNN), a novel framework designed to enhance uncertainty quantification in graph-based tasks. By combining graph-structured conformal prediction, residual reweighting, and a cross-training protocol, RR-GNN aims to provide efficient, localized, and adaptive prediction intervals for sensor networks. The project will focus on applying this framework to sensor networks, where reliable performance is critical for real-time decision making.

Building on our prior work on graph-based Conformal Prediction and localized prediction intervals, this research will advance the field of Trustworthy AI by addressing key challenges in uncertainty quantification. The proposed work will benefit safety-critical applications and significantly enhance the reliability of GNNs in sensor networks.
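To make the residual-reweighting idea concrete, the following is a minimal NumPy sketch of the generic residual-weighted split conformal recipe that the description builds on; it is not the RR-GNN method itself. The synthetic data, the `split_conformal_residual_weighted` helper, and the stand-in "point model" and "residual model" are illustrative assumptions: in the project setting, the point predictions would come from a GNN and the residual scale from a second model trained on held-out residuals.

```python
import numpy as np

def split_conformal_residual_weighted(y_cal, mu_cal, sigma_cal,
                                      mu_test, sigma_test, alpha=0.1):
    """Residual-weighted split conformal prediction intervals.

    y_cal      : true targets on the calibration split
    mu_cal     : point predictions (e.g. from a GNN) on the calibration split
    sigma_cal  : predicted residual scale on the calibration split (> 0)
    mu_test    : point predictions on the test nodes
    sigma_test : predicted residual scale on the test nodes (> 0)
    alpha      : miscoverage level (0.1 -> 90% marginal coverage)
    """
    n = len(y_cal)
    # Normalized nonconformity scores: absolute residual divided by the
    # predicted residual scale, so intervals widen where errors tend to be large.
    scores = np.abs(y_cal - mu_cal) / sigma_cal
    # Finite-sample corrected quantile (standard split conformal).
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, level, method="higher")
    return mu_test - q * sigma_test, mu_test + q * sigma_test

# Toy usage with synthetic heteroscedastic data.
rng = np.random.default_rng(0)
x = rng.uniform(0, 4, size=400)
y = np.sin(x) + rng.normal(scale=0.1 + 0.2 * x, size=400)  # noise grows with x
mu = np.sin(x)                                              # stand-in point model
sigma = 0.1 + 0.2 * x                                       # stand-in residual model
cal, test = slice(0, 200), slice(200, 400)
lo, hi = split_conformal_residual_weighted(y[cal], mu[cal], sigma[cal],
                                           mu[test], sigma[test], alpha=0.1)
print(f"Empirical coverage: {np.mean((y[test] >= lo) & (y[test] <= hi)):.2f}")
```

Because the nonconformity scores are scaled by a per-node residual estimate, the resulting intervals adapt to local noise levels rather than being uniformly wide, which is the kind of localized, heteroscedasticity-aware behaviour the project targets on graph data.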
Project number: 7020161
Grant type: REG-Small Scale
Status: Active
Effective start/end date: 1/06/25 → …
