Calibrating multi-class models
Jönköping University, School of Engineering, JTH, Department of Computing, Jönköping AI Lab (JAIL). ORCID iD: 0000-0003-0412-6199
Jönköping University, School of Engineering, JTH, Department of Computing, Jönköping AI Lab (JAIL). ORCID iD: 0000-0003-0274-9026
School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, Sweden.
2021 (English). In: Proceedings of the Tenth Symposium on Conformal and Probabilistic Prediction and Applications / [ed] Lars Carlsson, Zhiyuan Luo, Giovanni Cherubin, Khuong An Nguyen, PMLR, 2021, Vol. 152, p. 111-130. Conference paper, Published paper (Refereed)
Abstract [en]

Predictive models communicating algorithmic confidence are very informative, but only if well-calibrated and sharp, i.e., providing accurate probability estimates adjusted for each instance. While almost all machine learning algorithms are able to produce probability estimates, these are often poorly calibrated and thus require external calibration. For multi-class problems, external calibration has typically been done using one-vs-all or all-vs-all schemes, which adds to the computational complexity but also makes it impossible to analyze and inspect the predictive models. In this paper, we suggest a novel approach for calibrating inherently multi-class models. Instead of providing a probability distribution over all labels, the estimation is of the probability that the class label predicted by the underlying model is correct. In an extensive empirical study, it is shown that the suggested approach, when applied to both Platt scaling and Venn-Abers, is able to improve the probability estimates from decision trees, random forests and extreme gradient boosting.
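The core idea in the abstract, calibrating the probability that the predicted label is correct rather than a full distribution over all labels, can be sketched with Platt scaling on a held-out calibration set. This is a minimal illustration only, not the authors' implementation; the synthetic dataset, random forest model and split sizes are all assumptions made for the example.

```python
# Sketch: calibrate P(predicted label is correct) via Platt scaling,
# instead of calibrating a full multi-class distribution.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic 3-class problem (assumed for illustration).
X, y = make_classification(n_samples=2000, n_features=10, n_informative=5,
                           n_classes=3, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5,
                                                random_state=0)

# Underlying (inherently multi-class) model.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# On the calibration set: the model's top-label confidence, and whether
# its predicted label was actually correct (a binary target).
cal_proba = model.predict_proba(X_cal)
cal_conf = cal_proba.max(axis=1)                  # confidence in predicted label
cal_correct = (cal_proba.argmax(axis=1) == y_cal).astype(int)

# Platt scaling: a logistic model mapping raw confidence -> P(correct).
platt = LogisticRegression().fit(cal_conf.reshape(-1, 1), cal_correct)

# On new instances: the calibrated probability that the prediction is correct.
test_conf = model.predict_proba(X_test).max(axis=1)
p_correct = platt.predict_proba(test_conf.reshape(-1, 1))[:, 1]
print(p_correct[:5])
```

Note that the calibrator only ever sees one number per instance (the top-label confidence), so no one-vs-all or all-vs-all decomposition of the underlying model is needed.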

Place, publisher, year, edition, pages
PMLR, 2021. Vol. 152, p. 111-130
Series
Proceedings of Machine Learning Research
National Category
Computer Sciences; Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:hj:diva-54836
OAI: oai:DiVA.org:hj-54836
DiVA, id: diva2:1601445
Conference
Conformal and Probabilistic Prediction and Applications, 8-10 September 2021, Virtual
Available from: 2021-10-08. Created: 2021-10-08. Last updated: 2021-10-08. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Authority records

Johansson, Ulf; Löfström, Tuwe
