  • 1.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Högskolan i Jönköping, Tekniska Högskolan, JTH, Datateknik och informatik, JTH, Jönköping AI Lab (JAIL). Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Energy Efficiency Analysis of the Very Fast Decision Tree Algorithm (2017). In: Trends in Social Network Analysis: Information Propagation, User Behavior Modeling, Forecasting, and Vulnerability Assessment / [ed] Rokia Missaoui, Talel Abdessalem, Matthieu Latapy, Cham, Switzerland: Springer, 2017, pp. 229-252. Book chapter, part of an anthology (Refereed)
    Abstract [en]

    Data mining algorithms are usually designed to optimize a trade-off between predictive accuracy and computational efficiency. This paper introduces energy consumption and energy efficiency as important factors to consider during data mining algorithm analysis and evaluation. We conducted an experiment to illustrate how energy consumption and accuracy are affected when varying the parameters of the Very Fast Decision Tree (VFDT) algorithm. These results are compared with a theoretical analysis of the algorithm, indicating that energy consumption is affected by the parameter settings and that it can be reduced significantly while maintaining accuracy.
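    To make the experimental setup concrete, here is a minimal sketch of such a parameter sweep: one learner parameter is varied, the learner is evaluated prequentially over a synthetic stream, and accuracy is logged together with an energy estimate. The learner is a trivial stand-in rather than the VFDT, and energy is approximated as CPU time multiplied by an assumed average package power; both are assumptions for illustration, not the authors' setup.

    import time
    import random
    from collections import Counter

    ASSUMED_AVG_POWER_W = 30.0  # assumption: average CPU package power in watts

    class MajorityClassLearner:
        """Stand-in for the VFDT: predicts the most frequent label seen so far."""
        def __init__(self, grace_period=200):
            self.grace_period = grace_period  # placeholder for an nmin-like parameter
            self.counts = Counter()
        def predict(self, x):
            return self.counts.most_common(1)[0][0] if self.counts else 0
        def learn(self, x, y):
            self.counts[y] += 1

    def synthetic_stream(n=10_000, seed=0):
        rng = random.Random(seed)
        for _ in range(n):
            x = rng.random()
            yield x, int(x > 0.5)

    def run(grace_period):
        learner = MajorityClassLearner(grace_period)
        correct = total = 0
        start = time.process_time()
        for x, y in synthetic_stream():
            correct += int(learner.predict(x) == y)  # test-then-train (prequential)
            learner.learn(x, y)
            total += 1
        cpu_seconds = time.process_time() - start
        energy_joules = cpu_seconds * ASSUMED_AVG_POWER_W  # crude energy estimate
        return correct / total, energy_joules

    for gp in (50, 200, 800):
        acc, joules = run(gp)
        print(f"grace_period={gp}: accuracy={acc:.3f}, estimated energy={joules:.2f} J")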

  • 2.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Högskolan i Jönköping, Tekniska Högskolan, JTH, Datateknik och informatik, JTH, Jönköping AI Lab (JAIL). Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Identification of Energy Hotspots: A Case Study of the Very Fast Decision Tree (2017). In: GPC 2017: Green, Pervasive, and Cloud Computing / [ed] Au M., Castiglione A., Choo KK., Palmieri F., Li KC., Cham, Switzerland: Springer, 2017, pp. 267-281. Conference paper (Refereed)
    Abstract [en]

    Large-scale data centers account for a significant share of the energy consumption in many countries. Machine learning technology involves computationally intensive workloads and thus drives demand for substantial power and cooling capacity in data centers. It is time to explore green machine learning. The aim of this paper is to profile a machine learning algorithm with respect to its energy consumption and to determine the causes behind this consumption. The first scalable machine learning algorithm able to handle large volumes of streaming data is the Very Fast Decision Tree (VFDT), which produces results that are competitive with algorithms that analyze data from static datasets. Our objectives are to: (i) establish a methodology that profiles the energy consumption of decision trees at the function level, (ii) apply this methodology in an experiment to obtain the energy consumption of the VFDT, (iii) conduct a fine-grained analysis of the functions that consume most of the energy, providing an understanding of that consumption, and (iv) analyze how different parameter settings can significantly reduce the energy consumption. The results show that by addressing the most energy-intensive part of the VFDT, the energy consumption can be reduced by up to 74.3%.
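    As a rough illustration of what function-level energy profiling can look like (under the simplifying assumption that energy is proportional to CPU time at a constant average power, which is not the measurement setup used in the paper), the sketch below accumulates CPU time per function with a decorator and ranks the resulting energy estimates to expose hotspots; the profiled functions are hypothetical stand-ins for VFDT internals.

    import time
    import functools
    from collections import defaultdict

    ASSUMED_AVG_POWER_W = 30.0                  # assumption: constant average power
    cpu_seconds_per_function = defaultdict(float)

    def profiled(fn):
        """Accumulate CPU time spent inside the decorated function."""
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.process_time()
            try:
                return fn(*args, **kwargs)
            finally:
                cpu_seconds_per_function[fn.__name__] += time.process_time() - start
        return wrapper

    @profiled
    def update_statistics(xs):                  # hypothetical VFDT-like inner function
        return sum(x * x for x in xs)

    @profiled
    def check_split(xs):                        # hypothetical split-evaluation function
        return max(xs) - min(xs) > 0.9

    data = [i / 1000 for i in range(1000)]
    for _ in range(200):
        update_statistics(data)
        check_split(data)

    # Rank functions by estimated energy to expose the hotspots.
    for name, secs in sorted(cpu_seconds_per_function.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {secs * ASSUMED_AVG_POWER_W:.4f} J (estimated)")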

  • 3.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Casalicchio, Emiliano
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Boeva, Veselka
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Hoeffding Trees with nmin adaptation. Manuscript (preprint) (Other academic)
    Abstract [en]

    Machine learning software accounts for a significant amount of the energy consumed in data centers. These algorithms are usually optimized towards predictive performance, i.e. accuracy, and scalability. This is the case for data stream mining algorithms. Although these algorithms are adaptive to the incoming data, they have fixed parameters from the beginning of the execution, which lead to energy hotspots. We present dynamic parameter adaptation for data stream mining algorithms to trade off energy efficiency against accuracy during runtime. To validate this approach, we introduce the nmin adaptation method to improve parameter adaptation in Hoeffding trees. This method dynamically adapts the number of instances needed to make a split (nmin) and thereby reduces the overall energy consumption. We created an experiment to compare the Very Fast Decision Tree algorithm (VFDT, the original Hoeffding tree algorithm) with nmin adaptation and the standard VFDT. The results show that VFDT with nmin adaptation consumes up to 89% less energy than the standard VFDT, trading off a few percent of accuracy. Our approach can be used to trade off energy consumption against predictive and computational performance in the pursuit of resource-aware machine learning.

  • 4.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Högskolan i Jönköping, Tekniska Högskolan, JTH, Datateknik och informatik, JTH, Jönköping AI Lab (JAIL). Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Casalicchio, Emiliano
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Boeva, Veselka
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Hoeffding Trees with nmin adaptation (2018). In: The 5th IEEE International Conference on Data Science and Advanced Analytics (DSAA 2018), IEEE, 2018, pp. 70-79. Conference paper (Refereed)
    Abstract [en]

    Machine learning software accounts for a significant amount of the energy consumed in data centers. These algorithms are usually optimized towards predictive performance, i.e. accuracy, and scalability. This is the case for data stream mining algorithms. Although these algorithms are adaptive to the incoming data, they have fixed parameters from the beginning of the execution. We have observed that having fixed parameters leads to unnecessary computations, making the algorithm energy inefficient. In this paper we present the nmin adaptation method for Hoeffding trees. This method adapts the value of the nmin parameter, which significantly affects the energy consumption of the algorithm. The method reduces unnecessary computations and memory accesses, thus reducing the energy consumption, while accuracy is only marginally affected. We experimentally compared VFDT (Very Fast Decision Tree, the first Hoeffding tree algorithm) and CVFDT (Concept-adapting VFDT) with VFDT-nmin (VFDT with nmin adaptation). The results show that VFDT-nmin consumes up to 27% less energy than the standard VFDT, and up to 92% less energy than CVFDT, trading off a few percent of accuracy on a few datasets.
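    The sketch below illustrates the general idea of nmin adaptation in a Hoeffding tree leaf: split evaluations are only run every nmin instances, and nmin is grown when the observed gap between the two best attributes is still far from the Hoeffding bound, so that fewer unnecessary split checks (and their computations and memory accesses) are performed. The adaptation rule shown is a simplified guess for illustration; it is not the published VFDT-nmin algorithm.

    import math

    def hoeffding_bound(value_range, delta, n):
        """Hoeffding bound for a statistic with the given range after n observations."""
        return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

    class AdaptiveLeaf:
        """Simplified leaf: checks for a split every nmin instances and adapts nmin."""
        def __init__(self, nmin=200, delta=1e-7, value_range=1.0):
            self.nmin = nmin
            self.delta = delta
            self.value_range = value_range
            self.seen_since_check = 0
            self.total_seen = 0

        def observe(self, best_gain, second_gain):
            """Called per instance with the top-two attribute gains (in a real tree
            these come from the leaf's sufficient statistics)."""
            self.seen_since_check += 1
            self.total_seen += 1
            if self.seen_since_check < self.nmin:
                return False                  # skip the expensive split evaluation
            self.seen_since_check = 0
            eps = hoeffding_bound(self.value_range, self.delta, self.total_seen)
            gap = best_gain - second_gain
            if gap > eps:
                return True                   # confident enough to split now
            if gap > 0:
                # Estimate how many instances are needed before the bound can drop
                # below the observed gap, and wait at least that long (illustrative rule).
                needed = (self.value_range ** 2) * math.log(1.0 / self.delta) / (2.0 * gap ** 2)
                self.nmin = max(self.nmin, int(needed) - self.total_seen)
            return False

    leaf = AdaptiveLeaf()
    for i in range(1, 5001):
        if leaf.observe(best_gain=0.12, second_gain=0.05):
            print(f"split after {i} instances, final nmin={leaf.nmin}")
            break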

  • 5.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Högskolan i Jönköping, Tekniska Högskolan, JTH, Datateknik och informatik, JTH, Jönköping AI Lab (JAIL). Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Casalicchio, Emiliano
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Boeva, Veselka
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    How to Measure Energy Consumption in Machine Learning Algorithms (2019). In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): ECMLPKDD 2018: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Workshops, Springer, Cham, 2019, pp. 243-255. Conference paper (Refereed)
    Abstract [en]

    Machine learning algorithms are responsible for a significant amount of computation, and this amount is increasing with the advances in different machine learning fields. For example, fields such as deep learning require algorithms to run for weeks, consuming vast amounts of energy. While there is a trend towards optimizing machine learning algorithms for performance and energy consumption, there is still little knowledge on how to estimate an algorithm’s energy consumption. Currently, a straightforward cross-platform approach to estimate energy consumption for different types of algorithms does not exist. For that reason, well-known researchers in computer architecture have published extensive work on approaches to estimate energy consumption. This study presents a survey of methods to estimate energy consumption and maps them to specific machine learning scenarios. Finally, we illustrate our mapping suggestions with a case study, in which we measure energy consumption in a big data stream mining scenario. Our ultimate goal is to bridge the current gap in estimating energy consumption in machine learning scenarios.
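    One concrete way to obtain such a measurement on a Linux machine with Intel RAPL support is to read the powercap energy counter before and after the code region of interest, as in the sketch below. The sysfs path is the usual location for the package domain but may be absent or require elevated privileges on some systems, and the counter covers the whole package rather than a single process; this is a generic measurement pattern, not necessarily the tooling used in the study.

    from pathlib import Path

    RAPL_ENERGY = Path("/sys/class/powercap/intel-rapl:0/energy_uj")
    RAPL_MAX = Path("/sys/class/powercap/intel-rapl:0/max_energy_range_uj")

    def read_uj(path):
        return int(path.read_text())

    def measure(fn, *args, **kwargs):
        """Return (result, energy in joules) for one call of fn on the CPU package.
        Note: background activity on the package is included in the reading."""
        before = read_uj(RAPL_ENERGY)
        result = fn(*args, **kwargs)
        after = read_uj(RAPL_ENERGY)
        delta = after - before
        if delta < 0:                          # the counter wrapped around
            delta += read_uj(RAPL_MAX)
        return result, delta / 1e6             # microjoules -> joules

    def workload():
        return sum(i * i for i in range(2_000_000))

    if RAPL_ENERGY.exists():
        _, joules = measure(workload)
        print(f"package energy for the workload: {joules:.3f} J")
    else:
        print("RAPL energy counter not available on this machine")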

  • 6.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Högskolan i Jönköping, Tekniska Högskolan, JTH, Datateknik och informatik, JTH, Jönköping AI Lab (JAIL). Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Casalicchio, Emiliano
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Boeva, Veselka
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    How to Measure Energy Consumption in Machine Learning Algorithms (2018). In: Green Data Mining, International Workshop on Energy Efficient Data Mining and Knowledge Discovery: ECMLPKDD 2018: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Workshops, Lecture Notes in Computer Science, Springer, Cham, 2018. Conference paper (Refereed)
    Abstract [en]

    Machine learning algorithms are responsible for a significant amount of computation, and this amount is increasing with the advances in different machine learning fields. For example, fields such as deep learning require algorithms to run for weeks, consuming vast amounts of energy. While there is a trend towards optimizing machine learning algorithms for performance and energy consumption, there is still little knowledge on how to estimate an algorithm’s energy consumption. Currently, a straightforward cross-platform approach to estimate energy consumption for different types of algorithms does not exist. For that reason, well-known researchers in computer architecture have published extensive work on approaches to estimate energy consumption. This study presents a survey of methods to estimate energy consumption and maps them to specific machine learning scenarios. Finally, we illustrate our mapping suggestions with a case study, in which we measure energy consumption in a big data stream mining scenario. Our ultimate goal is to bridge the current gap in estimating energy consumption in machine learning scenarios.

  • 7.
    García-Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Lavesson, Niklas
    Högskolan i Jönköping, Tekniska Högskolan, JTH, Datateknik och informatik, JTH, Jönköping AI Lab (JAIL). Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik.
    Is it ethical to avoid error analysis? (2017). Conference paper (Refereed)
    Abstract [en]

    Machine learning algorithms tend to create more accurate models with the availability of large datasets. In some cases, highly accurate models can hide the presence of bias in the data. Several published studies tackle the development of discrimination-aware machine learning algorithms. We focus on the further evaluation of machine learning models through error analysis, to understand under which conditions a model is not working as expected. We examine the ethical implications of avoiding error analysis, from the perspectives of falsification of results and discrimination. Finally, we show different ways to approach error analysis in non-interpretable machine learning algorithms such as deep learning.
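    A toy example of the kind of error analysis argued for here: overall accuracy can look acceptable while nearly all errors fall on one subgroup. The data and group names below are fabricated purely for illustration.

    from collections import defaultdict

    # (true label, predicted label, group) triples -- fabricated for the example
    records = (
        [(1, 1, "group_a")] * 90 + [(1, 0, "group_a")] * 2 +
        [(1, 1, "group_b")] * 5 + [(1, 0, "group_b")] * 3
    )

    overall_errors = sum(1 for y, p, _ in records if y != p)
    print(f"overall error rate: {overall_errors / len(records):.1%}")

    per_group = defaultdict(lambda: [0, 0])    # group -> [errors, total]
    for y, p, g in records:
        per_group[g][1] += 1
        if y != p:
            per_group[g][0] += 1

    for g, (errs, n) in per_group.items():
        print(f"{g}: error rate {errs / n:.1%} over {n} examples")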
