Evidential Uncertainty Sets in Deep Classifiers Using Conformal Prediction

Published in Symposium on Conformal and Probabilistic Prediction with Applications (COPA), PMLR, 2024

We propose the Evidential Conformal Prediction (ECP) method for generating conformal prediction sets in image classifiers. Our method is built on a non-conformity score function rooted in Evidential Deep Learning (EDL), a technique for quantifying model (epistemic) uncertainty in DNN classifiers. We use evidence derived from the logit values of target labels to compute the components of our non-conformity score function: the heuristic notion of uncertainty in CP, uncertainty surprisal, and expected utility. Our extensive experimental evaluation demonstrates that ECP outperforms three state-of-the-art methods for generating CP sets in terms of set size and adaptivity, while maintaining coverage of the true labels.
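As a rough illustration of the ingredients involved, the sketch below combines an EDL-style evidence/Dirichlet computation with standard split conformal calibration. The specific score used here (one minus the expected probability of a label, inflated by the Dirichlet vacuity) is a simplified stand-in, not the paper's exact score function, and all data is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5  # number of classes (illustrative)

def evidential_quantities(logits):
    """EDL-style quantities: evidence = ReLU(logits), alpha = evidence + 1."""
    evidence = np.maximum(logits, 0.0)
    alpha = evidence + 1.0                       # Dirichlet parameters
    S = alpha.sum(axis=-1, keepdims=True)        # Dirichlet strength
    probs = alpha / S                            # expected class probabilities
    vacuity = (K / S).squeeze(-1)                # epistemic (vacuity) uncertainty
    return probs, vacuity

def nonconformity(logits, labels):
    """Illustrative score: low expected probability of the given label,
    inflated by vacuity (a stand-in, NOT the paper's exact score)."""
    probs, vacuity = evidential_quantities(logits)
    p_label = probs[np.arange(len(labels)), labels]
    return (1.0 - p_label) + vacuity

# Synthetic calibration set
n_cal = 500
cal_logits = rng.normal(size=(n_cal, K)) * 2.0
cal_labels = rng.integers(0, K, size=n_cal)

# Split-conformal calibration at miscoverage level alpha_level
alpha_level = 0.1
scores = nonconformity(cal_logits, cal_labels)
level = np.ceil((n_cal + 1) * (1 - alpha_level)) / n_cal
q_hat = np.quantile(scores, level, method="higher")

# Prediction set for a new example: every label whose score is <= q_hat
test_logits = rng.normal(size=(1, K)) * 2.0
probs, vacuity = evidential_quantities(test_logits)
test_scores = (1.0 - probs[0]) + vacuity[0]
pred_set = np.where(test_scores <= q_hat)[0]
```

With a well-designed score, the resulting sets retain the 1 − α coverage guarantee of split conformal prediction while staying small and adaptive, which is the property the evaluation above measures.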



Recommended Citation: Karimi, H., & Samavi, R. (2024). “Evidential Uncertainty Sets in Deep Classifiers Using Conformal Prediction.” In Proceedings of the Thirteenth Symposium on Conformal and Probabilistic Prediction with Applications. Proceedings of Machine Learning Research (PMLR), vol. 230, 466–489.