Hoeffding Trees with nmin adaptation
Blekinge Institute of Technology, Department of Computer Science and Engineering. ORCID iD: 0000-0003-4973-9255
Jönköping University, School of Engineering, JTH, Computer Science and Informatics, JTH, Jönköping AI Lab (JAIL); Blekinge Institute of Technology, Department of Computer Science and Engineering. ORCID iD: 0000-0002-0535-1761
Blekinge Institute of Technology, Department of Computer Science and Engineering. ORCID iD: 0000-0001-9947-1088
Blekinge Institute of Technology, Department of Computer Science and Engineering. ORCID iD: 0000-0002-3118-5058
2018 (English). In: The 5th IEEE International Conference on Data Science and Advanced Analytics (DSAA 2018), IEEE, 2018, p. 70-79. Conference paper, Published paper (Refereed)
Abstract [en]

Machine learning software accounts for a significant amount of the energy consumed in data centers. These algorithms are usually optimized towards predictive performance, i.e. accuracy, and scalability. This is the case for data stream mining algorithms. Although these algorithms are adaptive to the incoming data, they have fixed parameters from the beginning of the execution. We have observed that having fixed parameters leads to unnecessary computations, thus making the algorithm energy inefficient.

In this paper we present the nmin adaptation method for Hoeffding trees. This method adapts the value of the nmin parameter, which significantly affects the energy consumption of the algorithm. The method reduces unnecessary computations and memory accesses, thus reducing the energy, while the accuracy is only marginally affected. We experimentally compared VFDT (Very Fast Decision Tree, the first Hoeffding tree algorithm) and CVFDT (Concept-adapting VFDT) with VFDT-nmin (VFDT with nmin adaptation). The results show that VFDT-nmin consumes up to 27% less energy than the standard VFDT, and up to 92% less energy than CVFDT, trading off a few percent of accuracy on a few datasets.
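The core idea behind nmin adaptation can be sketched from the abstract: a Hoeffding tree only attempts a split at a leaf every nmin examples, and the Hoeffding bound ε = sqrt(R² ln(1/δ) / (2n)) tells us how many examples n are needed before the bound can fall below the observed gain gap ΔG between the two best split attributes. The sketch below is a minimal illustration of that reasoning, not the authors' implementation; the function names and the exact adaptation rule are assumptions for illustration.

```python
import math

def hoeffding_bound(r: float, delta: float, n: int) -> float:
    """Hoeffding bound epsilon for an attribute-evaluation function with
    range r, confidence 1 - delta, after n examples at a leaf."""
    return math.sqrt((r * r * math.log(1.0 / delta)) / (2.0 * n))

def adapted_nmin(r: float, delta: float, delta_g: float) -> int:
    """Estimate how many examples are needed before epsilon can drop below
    the observed gain gap delta_g, so the leaf is not re-evaluated earlier.
    Derived by solving epsilon(n) < delta_g for n:
        n > r^2 * ln(1/delta) / (2 * delta_g^2)
    This is one plausible reading of nmin adaptation, used here only to
    illustrate why a fixed nmin wastes split-attempt computations."""
    return math.ceil((r * r * math.log(1.0 / delta)) / (2.0 * delta_g * delta_g))

# With VFDT-style defaults (r = 1 for information gain on a binary class,
# delta = 1e-7), a small gap between the top two attributes implies many
# more examples are needed before a split decision is even possible.
n = adapted_nmin(1.0, 1e-7, 0.05)
print(n, hoeffding_bound(1.0, 1e-7, n))
```

Waiting the adapted number of examples (instead of a small fixed nmin) skips split attempts that cannot succeed yet, which is consistent with the paper's claim of fewer computations and memory accesses at a marginal accuracy cost.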

Place, publisher, year, edition, pages
IEEE, 2018. p. 70-79
Series
Proceedings of the International Conference on Data Science and Advanced Analytics, ISSN 2472-1573
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:hj:diva-42995. DOI: 10.1109/DSAA.2018.00017. ISI: 000459238600008. Scopus ID: 2-s2.0-85062817868. ISBN: 978-1-5386-5090-5 (print). OAI: oai:DiVA.org:hj-42995. DiVA id: diva2:1288961
Conference
IEEE 5th International Conference on Data Science and Advanced Analytics, 1–4 October 2018, Turin
Funder
Knowledge Foundation, 20140032. Available from: 2019-02-15. Created: 2019-02-15. Last updated: 2019-08-20. Bibliographically approved

Open Access in DiVA

fulltext (898 kB), 105 downloads
File information
File name: FULLTEXT01.pdf. File size: 898 kB. Checksum: SHA-512
2c3be9ddd7ae2116b0d99d5982dd22817118c9d3b05f5a60e29c463ceff49c108301b03b1dc907f5630b99c3bcbcd9757c60576e6bb0544430562b89e0f7a4aa
Type: fulltext. Mimetype: application/pdf

Other links

Publisher's full text; Scopus

Authority records

García Martín, Eva; Lavesson, Niklas; Grahn, Håkan; Casalicchio, Emiliano; Boeva, Veselka
