Model-agnostic nonconformity functions for conformal classification
2017 (English). In: Proceedings of the International Joint Conference on Neural Networks, IEEE, 2017, p. 2072-2079. Conference paper, Published paper (Refereed)
Abstract [en]
A conformal predictor outputs prediction regions; for classification, these are label sets. The key property of all conformal predictors is that they are valid, i.e., their error rate on novel data is bounded by a preset significance level. Thus, the key performance metric for evaluating conformal predictors is the size of the output prediction regions, where smaller (more informative) prediction regions are said to be more efficient. All conformal predictors rely on nonconformity functions, measuring the strangeness of an input-output pair, and the efficiency depends critically on the quality of the chosen nonconformity function. In this paper, three model-agnostic nonconformity functions, based on well-known loss functions, are evaluated with regard to how they affect efficiency. In experiments on 21 publicly available multi-class data sets, both single neural networks and ensembles of neural networks are used as underlying models for conformal classifiers. The results show that the choice of nonconformity function has a major impact on the efficiency, but also that different nonconformity functions should be used depending on the exact efficiency metric. For a high fraction of single-label predictions, a margin-based nonconformity function is the best option, while a nonconformity function based on the hinge loss produced the smallest label sets on average.
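The abstract does not spell out the nonconformity functions themselves. As a rough illustration only (the paper's exact definitions may differ), hinge-loss-based and margin-based nonconformity scores are commonly computed from an underlying model's class-probability estimates like this; the function names and the example probabilities are hypothetical:

```python
def hinge_nonconformity(probs, y):
    """Hinge-loss-style nonconformity: 1 minus the model's
    estimated probability for label y. High when the model
    considers y unlikely for this input."""
    return 1.0 - probs[y]

def margin_nonconformity(probs, y):
    """Margin-style nonconformity: the highest probability among
    the other labels minus the probability of label y. High when
    some competing label clearly dominates y."""
    best_other = max(p for c, p in enumerate(probs) if c != y)
    return best_other - probs[y]

# Example: softmax output of an underlying network for one instance.
probs = [0.7, 0.2, 0.1]
print(hinge_nonconformity(probs, 0))   # low strangeness for the favoured label
print(margin_nonconformity(probs, 0))  # negative: label 0 beats all others
```

In a conformal classifier, such scores are computed for a calibration set and for each tentative label of a new instance; a label is kept in the prediction set when its score is not unusually large relative to the calibration scores at the chosen significance level.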
Place, publisher, year, edition, pages
IEEE, 2017. p. 2072-2079
Keywords [en]
Classification, Conformal prediction, Neural networks, Efficiency, Forecasting, Classification labels, Conformal predictions, Conformal predictors, Label predictions, Loss functions, Performance metrics, Significance levels, Single neural, Classification (of information)
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:hj:diva-38112
DOI: 10.1109/IJCNN.2017.7966105
ISI: 000426968702043
Scopus ID: 2-s2.0-85031028048
ISBN: 9781509061815 (print)
OAI: oai:DiVA.org:hj-38112
DiVA, id: diva2:1163950
Conference
2017 International Joint Conference on Neural Networks, IJCNN 2017, 14 May 2017 through 19 May 2017
Available from: 2017-12-08. Created: 2017-12-08. Last updated: 2019-08-22. Bibliographically approved.