Nuances in Margin Conditions Determine Gains in Active Learning

Samory Kpotufe, Gan Yuan, Yunfan Zhao

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

2 Citations (Scopus)

Abstract

We consider nonparametric classification with smooth regression functions, where it is well known that notions of margin in E[Y|X] determine fast or slow rates in both active and passive learning. Here we elucidate a striking distinction between the two settings. Namely, we show that some seemingly benign nuances in notions of margin—involving the uniqueness of the Bayes classifier, and which have no apparent effect on rates in passive learning—determine whether or not any active learner can outperform passive learning rates. In particular, for Audibert-Tsybakov's margin condition (allowing general situations with non-unique Bayes classifiers), no active learner can gain over passive learning in commonly studied settings where the marginal on X is near uniform. Our results thus negate the usual intuition from past literature that active rates should generally improve over passive rates in nonparametric settings. Copyright 2022 by the author(s).
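
For context, the Audibert-Tsybakov margin condition mentioned in the abstract is commonly stated as follows (a sketch of the standard formulation from the broader literature, not quoted from this paper). Writing \(\eta(x) = \mathbb{E}[Y \mid X = x]\) for the regression function, the condition requires that for some constants \(C > 0\) and \(\beta \ge 0\),

\[
\mathbb{P}_X\!\left( 0 < \left|\eta(X) - \tfrac{1}{2}\right| \le t \right) \;\le\; C\, t^{\beta} \qquad \text{for all } t > 0.
\]

Excluding the event \(\eta(X) = \tfrac{1}{2}\) is what permits a non-unique Bayes classifier: \(\eta\) may equal \(\tfrac{1}{2}\) on a set of positive probability. The stricter variant without the "\(0 <\)" forces that set to have measure zero when \(\beta > 0\), which is the uniqueness nuance the abstract refers to.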
Original language: English
Title of host publication: Proceedings of Machine Learning Research
Pages: 8112-8126
Number of pages: 15
Volume: 151
Publication status: Published - 2022
Externally published: Yes
Event: 25th International Conference on Artificial Intelligence and Statistics (AISTATS 2022) - Virtual, Valencia, Spain
Duration: 28 Mar 2022 - 30 Mar 2022
https://proceedings.mlr.press/v151/

Conference

Conference: 25th International Conference on Artificial Intelligence and Statistics (AISTATS 2022)
Country/Territory: Spain
City: Valencia
Period: 28/03/22 - 30/03/22
Internet address: https://proceedings.mlr.press/v151/
