Secure Out-of-Distribution Task Generalization with Energy-Based Models

Research output: Chapters, Conference Papers, Creative and Literary Works; RGC 32 - Refereed conference paper (with host publication); peer-reviewed

2 Scopus Citations

Author(s): Chen, Shengzhuang; Huang, Long-Kai; Schwarz, Jonathan Richard et al.


Detail(s)

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
Editors: A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine
Number of pages: 14
Publication status: Published - Dec 2023

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Title: 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Location: New Orleans Ernest N. Morial Convention Center
Place: United States
City: New Orleans
Period: 10 - 16 December 2023

Abstract

The success of meta-learning on out-of-distribution (OOD) tasks in the wild has proved to be hit-and-miss. Safeguarding the generalization capability of the meta-learned prior knowledge to OOD tasks, particularly in safety-critical applications, necessitates detecting an OOD task and then adapting it towards the prior. Nonetheless, the reliability of the uncertainty that existing Bayesian meta-learning methods estimate on OOD tasks is limited by incomplete coverage of the feature distribution shift and insufficient expressiveness of the meta-learned prior. These methods also struggle to adapt an OOD task, running parallel to the line of cross-domain task adaptation solutions, which are vulnerable to overfitting. To this end, we build a single coherent framework that supports both detection and adaptation of OOD tasks while remaining compatible with off-the-shelf meta-learning backbones. The proposed Energy-Based Meta-Learning (EBML) framework learns to characterize any arbitrary meta-training task distribution with the composition of two expressive neural-network-based energy functions. We deploy the sum of the two energy functions, which is proportional to the joint distribution of a task, as a reliable score for detecting OOD tasks; during meta-testing, we adapt the OOD task towards in-distribution tasks by energy minimization. Experiments on four regression and classification datasets demonstrate the effectiveness of our proposal. © 2023 Neural Information Processing Systems Foundation.
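As a rough illustration of the mechanism described in the abstract (not the authors' implementation), the sketch below shows how the sum of two neural-network energy functions can serve as an OOD score for a task, and how an OOD task can be adapted by gradient-based energy minimization. All names (EnergyNet, task_energy, adapt_task), the feature dimensions, and the detection threshold are illustrative assumptions.

```python
# Minimal sketch of energy-based OOD task detection and adaptation.
# Assumptions: tasks are summarized as fixed-size feature/label representations;
# the architecture, dimensions, and threshold below are placeholders.
import torch
import torch.nn as nn


class EnergyNet(nn.Module):
    """Small MLP mapping a pooled task representation to a scalar energy."""

    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, task_repr: torch.Tensor) -> torch.Tensor:
        return self.net(task_repr).squeeze(-1)


def task_energy(e_x: EnergyNet, e_y: EnergyNet,
                feat_repr: torch.Tensor, label_repr: torch.Tensor) -> torch.Tensor:
    # The sum of the two energies acts as an unnormalized score for the
    # task's joint distribution and is used for OOD detection.
    return e_x(feat_repr) + e_y(label_repr)


def is_ood(score: torch.Tensor, threshold: float) -> bool:
    # Higher energy = lower density under the meta-training task distribution.
    return bool(score.item() > threshold)


def adapt_task(e_x: EnergyNet, e_y: EnergyNet,
               feat_repr: torch.Tensor, label_repr: torch.Tensor,
               steps: int = 20, lr: float = 0.1) -> torch.Tensor:
    # Energy minimization: move the task's feature representation toward
    # regions the energy model considers in-distribution.
    z = feat_repr.clone().requires_grad_(True)
    opt = torch.optim.SGD([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        task_energy(e_x, e_y, z, label_repr).sum().backward()
        opt.step()
    return z.detach()


if __name__ == "__main__":
    dim = 32
    e_x, e_y = EnergyNet(dim), EnergyNet(dim)
    feat, lab = torch.randn(1, dim), torch.randn(1, dim)
    score = task_energy(e_x, e_y, feat, lab)
    if is_ood(score, threshold=0.0):          # threshold is a placeholder
        feat = adapt_task(e_x, e_y, feat, lab)  # adapt before meta-test inference
```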

Citation Format(s)

Chen, Shengzhuang; Huang, Long-Kai; Schwarz, Jonathan Richard et al. Secure Out-of-Distribution Task Generalization with Energy-Based Models. In: Oh, A.; Naumann, T.; Globerson, A.; Saenko, K.; Hardt, M.; Levine, S. (Eds.), Advances in Neural Information Processing Systems 36 (NeurIPS 2023). 2023.
