One Tree to Explain Them All
Johansson, Ulf. Högskolan i Borås, Institutionen Handels- och IT-högskolan. ORCID iD: 0000-0003-0412-6199
Sönströd, Cecilia. Högskolan i Borås, Institutionen Handels- och IT-högskolan.
Löfström, Tuve. Högskolan i Borås, Institutionen Handels- och IT-högskolan. ORCID iD: 0000-0003-0274-9026
2011 (English). Conference paper, published paper (Refereed)
Abstract [en]

Random forest is an often used ensemble technique, renowned for its high predictive performance. Random forest models are, however, inherently opaque due to their sheer complexity, making human interpretation and analysis impossible. This paper presents a method of approximating the random forest with just one decision tree. The approach uses oracle coaching, a recently suggested technique where a weaker but transparent model is generated using combinations of regular training data and test data initially labeled by a strong classifier, called the oracle. In this study, the random forest plays the part of the oracle, while the transparent models are decision trees generated either by the standard tree inducer J48 or by evolving genetic programs. Evaluation on 30 data sets from the UCI repository shows that oracle coaching significantly improves both accuracy and area under the ROC curve, compared to using training data only. In fact, the resulting single-tree models are as accurate as the random forest on the specific test instances. Most importantly, this is not achieved by inducing or evolving huge trees with perfect fidelity; a large majority of the trees are instead rather compact and clearly comprehensible. The experiments also show that evolution outperformed J48 with regard to accuracy, but at the expense of slightly larger trees.
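
The oracle-coaching procedure described in the abstract can be sketched in a few lines. The snippet below is only an illustration, not the authors' implementation: it uses scikit-learn's RandomForestClassifier as the oracle and its CART DecisionTreeClassifier as a stand-in for J48 or the evolved genetic programs, on a synthetic data set, to show the step of inducing a single tree from regular training data combined with test instances labeled by the oracle.

    # Minimal oracle-coaching sketch (illustrative only; scikit-learn's CART tree
    # stands in for the J48 / genetic programming inducers used in the paper).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for one of the UCI data sets.
    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # 1. Train the strong but opaque oracle on the regular training data.
    oracle = RandomForestClassifier(n_estimators=100, random_state=0)
    oracle.fit(X_train, y_train)

    # 2. Let the oracle label the test instances (no true test targets are used).
    oracle_labels = oracle.predict(X_test)

    # 3. Induce a single, transparent tree on training data + oracle-labeled test data.
    X_coach = np.vstack([X_train, X_test])
    y_coach = np.concatenate([y_train, oracle_labels])
    coached_tree = DecisionTreeClassifier(random_state=0).fit(X_coach, y_coach)

    # 4. Baseline: a single tree induced from the training data only.
    plain_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

    print("random forest (oracle):", oracle.score(X_test, y_test))
    print("coached single tree:   ", coached_tree.score(X_test, y_test))
    print("plain single tree:     ", plain_tree.score(X_test, y_test))

On the specific test instances, the coached tree's predictions largely track the oracle's, which is what the abstract refers to as fidelity.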

Place, publisher, year, edition, pages
IEEE, 2011.
Keywords [en]
genetic programming, random forest, oracle coaching, decision trees, machine learning
Keywords [sv]
Data mining
HSV category
Research programme
Handel och IT
Identifiers
URN: urn:nbn:se:hj:diva-45800
Local ID: 0;0;miljJAIL
ISBN: 978-1-4244-7834-7 (print)
OAI: oai:DiVA.org:hj-45800
DiVA, id: diva2:1348946
Conference
IEEE Congress on Evolutionary Computation (CEC)
Note

Sponsorship:

This work was supported by the INFUSIS project www.his.se/infusis at the University of Skövde, Sweden, in partnership with the Swedish Knowledge Foundation under grant 2008/0502.

Available from: 2019-09-06 Created: 2019-09-06 Last updated: 2019-09-06 Bibliographically approved

Open Access in DiVA

fulltext (116 kB), 5 downloads
File information
File: FULLTEXT01.pdf, File size: 116 kB, Checksum: SHA-512
9e110b053bb404bc781ba62a064543de61f921063b10402e7b68a87cd9bdfbb0594fea1e25d87d74a0aaf19027c485c6f96fb8b2b513a3115728c9195d7e3e5a
Type: fulltext, Mimetype: application/pdf
