Regression conformal prediction with random forests
Högskolan i Borås, Institutionen Handels- och IT-högskolan. ORCID iD: 0000-0003-0412-6199
Högskolan i Borås, Institutionen Handels- och IT-högskolan.
Högskolan i Borås, Institutionen Handels- och IT-högskolan. ORCID iD: 0000-0003-0274-9026
Högskolan i Borås, Institutionen Handels- och IT-högskolan.
2014 (English). In: Machine Learning, ISSN 0885-6125, E-ISSN 1573-0565, Vol. 97, no. 1-2, p. 155-176. Article in journal (Refereed). Published.
Abstract [en]

Regression conformal prediction produces prediction intervals that are valid, i.e., the probability of excluding the correct target value is bounded by a predefined confidence level. The most important criterion when comparing conformal regressors is efficiency; the prediction intervals should be as tight (informative) as possible. In this study, the use of random forests as the underlying model for regression conformal prediction is investigated and compared to existing state-of-the-art techniques, which are based on neural networks and k-nearest neighbors. In addition to their robust predictive performance, random forests allow the size of the prediction intervals to be determined using out-of-bag estimates instead of requiring a separate calibration set. An extensive empirical investigation, using 33 publicly available data sets, was undertaken to compare the use of random forests to existing state-of-the-art conformal predictors. The results show that the suggested approach, at almost all confidence levels and using both standard and normalized nonconformity functions, produced significantly more efficient conformal predictors than the existing alternatives.

Place, publisher, year, edition, pages
Springer, 2014. Vol. 97, no. 1-2, p. 155-176
Keywords [en]
Conformal prediction, Random forests, Regression, Machine learning, Data mining
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
URN: urn:nbn:se:hj:diva-38084
DOI: 10.1007/s10994-014-5453-0
Scopus ID: 2-s2.0-84906946396
Local ID: 0;0;miljJAIL
OAI: oai:DiVA.org:hj-38084
DiVA, id: diva2:1163349
Note

Sponsorship: This work was supported by the Swedish Foundation for Strategic Research through the project High-Performance Data Mining for Drug Effect Detection (IIS11-0053) and the Knowledge Foundation through the project Big Data Analytics by Online Ensemble Learning (20120192).

Available from: 2017-12-06. Created: 2017-12-06. Last updated: 2019-08-23. Bibliographically approved.

Open Access in DiVA

fulltext (217 kB), 295 downloads
File information
File name: FULLTEXT01.pdf
File size: 217 kB
Checksum (SHA-512): 548c08992f51434222d00525da354365d714748ce5b5ad090d097d46d79932cb80c6d1c842e725622ada0faf4ce24485deea72cdbc386fc205d40aef45398846
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Authority records

Johansson, Ulf; Löfström, Tuve

Total: 295 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 410 hits