Casalicchio, Emiliano (ORCID iD: orcid.org/0000-0002-3118-5058)
Publications (3 of 3)
García Martín, E., Lavesson, N., Grahn, H., Casalicchio, E. & Boeva, V. (2019). How to Measure Energy Consumption in Machine Learning Algorithms. In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): ECMLPKDD 2018: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Workshops. Lecture Notes in Computer Science. Springer, Cham. Paper presented at European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2018; Dublin; Ireland; 10 September 2018 through 14 September 2018 (pp. 243-255).
How to Measure Energy Consumption in Machine Learning Algorithms
2019 (English) In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): ECMLPKDD 2018: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Workshops. Lecture Notes in Computer Science. Springer, Cham, 2019, p. 243-255. Conference paper, Published paper (Refereed)
Abstract [en]

Machine learning algorithms are responsible for a significant amount of computations, and these computations are increasing with the advancements in different machine learning fields. For example, fields such as deep learning require algorithms to run for weeks, consuming vast amounts of energy. While there is a trend towards optimizing machine learning algorithms for performance and energy consumption, there is still little knowledge on how to estimate an algorithm’s energy consumption. Currently, a straightforward cross-platform approach to estimating energy consumption for different types of algorithms does not exist. For that reason, well-known researchers in computer architecture have published extensive works on approaches to estimate energy consumption. This study presents a survey of methods to estimate energy consumption and maps them to specific machine learning scenarios. Finally, we illustrate our mapping suggestions with a case study, where we measure energy consumption in a big data stream mining scenario. Our ultimate goal is to bridge the current gap in estimating energy consumption in machine learning scenarios.
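As a concrete illustration of the kind of software-based estimation such a survey covers, the Python sketch below wraps a workload with readings of the Linux powercap (Intel RAPL) energy counters. This is one common estimation approach, not the paper's own measurement tool; the sysfs paths assume an Intel CPU with the powercap driver loaded, and reading the counters may require elevated privileges.

from pathlib import Path
import time

RAPL_DOMAIN = Path("/sys/class/powercap/intel-rapl:0")  # package-0 power domain

def read_energy_uj() -> int:
    # Cumulative energy counter for the CPU package, in microjoules.
    return int((RAPL_DOMAIN / "energy_uj").read_text())

def measure_energy(fn, *args, **kwargs):
    # Run fn and return (result, joules, seconds). The counter wraps around at
    # max_energy_range_uj, so we correct for at most one overflow per run.
    max_uj = int((RAPL_DOMAIN / "max_energy_range_uj").read_text())
    e0, t0 = read_energy_uj(), time.perf_counter()
    result = fn(*args, **kwargs)
    e1, t1 = read_energy_uj(), time.perf_counter()
    delta_uj = e1 - e0 if e1 >= e0 else (max_uj - e0) + e1
    return result, delta_uj / 1e6, t1 - t0

if __name__ == "__main__":
    # Hypothetical workload standing in for a training run.
    _, joules, seconds = measure_energy(sum, range(50_000_000))
    print(f"{joules:.2f} J over {seconds:.2f} s (avg power {joules / seconds:.2f} W)")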

Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 11329
Keywords
Computer architecture, Energy efficiency, Green computing, Machine learning
National Category
Computer Sciences
Identifiers
urn:nbn:se:hj:diva-45614 (URN); 10.1007/978-3-030-13453-2_20 (DOI); 9783030134525 (ISBN)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2018; Dublin; Ireland; 10 September 2018 through 14 September 2018
Funder
Knowledge Foundation, 20140032
Available from: 2019-08-19 Created: 2019-08-19 Last updated: 2019-08-19. Bibliographically approved.
García Martín, E., Lavesson, N., Grahn, H., Casalicchio, E. & Boeva, V. (2018). Hoeffding Trees with nmin adaptation. In: The 5th IEEE International Conference on Data Science and Advanced Analytics (DSAA 2018). Paper presented at IEEE 5th International Conference on Data Science and Advanced Analytics, 1–4 October 2018, Turin (pp. 70-79). IEEE
Hoeffding Trees with nmin adaptation
2018 (English) In: The 5th IEEE International Conference on Data Science and Advanced Analytics (DSAA 2018), IEEE, 2018, p. 70-79. Conference paper, Published paper (Refereed)
Abstract [en]

Machine learning software accounts for a significant amount of the energy consumed in data centers. These algorithms are usually optimized towards predictive performance, i.e. accuracy, and scalability. This is the case for data stream mining algorithms. Although these algorithms are adaptive to the incoming data, they have fixed parameters from the beginning of the execution. We have observed that having fixed parameters leads to unnecessary computations, making the algorithm energy inefficient. In this paper we present the nmin adaptation method for Hoeffding trees. This method adapts the value of the nmin parameter, which significantly affects the energy consumption of the algorithm. The method reduces unnecessary computations and memory accesses, thus reducing energy consumption while only marginally affecting accuracy. We experimentally compared VFDT (Very Fast Decision Tree, the first Hoeffding tree algorithm) and CVFDT (Concept-adapting VFDT) with VFDT-nmin (VFDT with nmin adaptation). The results show that VFDT-nmin consumes up to 27% less energy than the standard VFDT, and up to 92% less energy than CVFDT, trading off a few percent of accuracy on a few datasets.
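To make the adaptation idea concrete, the following is a loose Python sketch of how a leaf could postpone its next split check: after a failed split attempt, it estimates from the Hoeffding bound how many instances would be needed before the bound could drop below the observed gain gap, and skips the costly checks until then. The function names and the exact update rule are illustrative assumptions, not the formula from the paper.

import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    # Hoeffding bound epsilon after observing n instances.
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def next_check_at(gain_gap: float, value_range: float, delta: float,
                  n_seen: int, nmin_default: int = 200) -> int:
    # Estimate the instance count at which the Hoeffding bound could drop below
    # gain_gap (the gap between the best and second-best attribute's information
    # gain at the last failed split attempt), i.e. when a split could first
    # succeed. If there is no useful signal yet, wait a default batch.
    if gain_gap <= 0.0:
        return n_seen + nmin_default
    n_needed = (value_range ** 2) * math.log(1.0 / delta) / (2.0 * gain_gap ** 2)
    return max(int(math.ceil(n_needed)), n_seen + 1)

if __name__ == "__main__":
    # After 200 instances, the two best attributes differ by 0.05 in gain:
    n = next_check_at(gain_gap=0.05, value_range=1.0, delta=1e-7, n_seen=200)
    print(n, hoeffding_bound(1.0, 1e-7, n))  # bound at that point is roughly the gap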

Place, publisher, year, edition, pages
IEEE, 2018
Series
Proceedings of the International Conference on Data Science and Advanced Analytics, ISSN 2472-1573
National Category
Computer Sciences
Identifiers
urn:nbn:se:hj:diva-42995 (URN); 10.1109/DSAA.2018.00017 (DOI); 000459238600008 (ISI); 2-s2.0-85062817868 (Scopus ID); 978-1-5386-5090-5 (ISBN)
Conference
IEEE 5th International Conference on Data Science and Advanced Analytics, 1–4 October 2018, Turin
Funder
Knowledge Foundation, 20140032
Available from: 2019-02-15 Created: 2019-02-15 Last updated: 2019-08-20. Bibliographically approved.
García Martín, E., Lavesson, N., Grahn, H., Casalicchio, E. & Boeva, V. (2018). How to Measure Energy Consumption in Machine Learning Algorithms. In: Green Data Mining, International Workshop on Energy Efficient Data Mining and Knowledge Discovery: ECMLPKDD 2018: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Workshops. Lecture Notes in Computer Science. Springer, Cham. Paper presented at European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, Dublin.
How to Measure Energy Consumption in Machine Learning Algorithms
2018 (English) In: Green Data Mining, International Workshop on Energy Efficient Data Mining and Knowledge Discovery: ECMLPKDD 2018: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Workshops. Lecture Notes in Computer Science. Springer, Cham, 2018. Conference paper, Published paper (Refereed)
Abstract [en]

Machine learning algorithms are responsible for a significant amount of computations, and these computations are increasing with the advancements in different machine learning fields. For example, fields such as deep learning require algorithms to run for weeks, consuming vast amounts of energy. While there is a trend towards optimizing machine learning algorithms for performance and energy consumption, there is still little knowledge on how to estimate an algorithm’s energy consumption. Currently, a straightforward cross-platform approach to estimating energy consumption for different types of algorithms does not exist. For that reason, well-known researchers in computer architecture have published extensive works on approaches to estimate energy consumption. This study presents a survey of methods to estimate energy consumption and maps them to specific machine learning scenarios. Finally, we illustrate our mapping suggestions with a case study, where we measure energy consumption in a big data stream mining scenario. Our ultimate goal is to bridge the current gap in estimating energy consumption in machine learning scenarios.

National Category
Computer Sciences
Identifiers
urn:nbn:se:hj:diva-42996 (URN)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, Dublin
Funder
Knowledge Foundation, 20140032
Available from: 2019-02-15 Created: 2019-02-15 Last updated: 2019-08-20