An Experimental Performance Evaluation of Autoscalers for Complex Workflows
Umeå University, Faculty of Science and Technology, Department of Computing Science. University of Massachusetts, Amherst.
2018 (English). In: ACM Transactions on Modeling and Performance Evaluation of Computing Systems (TOMPECS), ISSN 2376-3639, Vol. 3, no. 2, article id UNSP 8. Article in journal (Refereed). Published
Abstract [en]

Elasticity is one of the main features of cloud computing, allowing customers to scale their resources based on the workload. Many autoscalers have been proposed in the past decade to decide on behalf of cloud customers when and how to provision resources to a cloud application based on the workload, utilizing cloud elasticity features. However, in prior work, when a new policy is proposed, it is seldom compared to the state-of-the-art, and is often compared only to static provisioning using a predefined quality-of-service target. This reduces the ability of cloud customers and of cloud operators to choose and deploy an autoscaling policy, as there is seldom enough analysis of the performance of the autoscalers in different operating conditions and with different applications. In our work, we conduct an experimental performance evaluation of autoscaling policies, using workflows as the application model, a popular formalism for automating resource management for applications with well-defined yet complex structures. We present a detailed comparative study of general state-of-the-art autoscaling policies, along with two new workflow-specific policies. To understand the performance differences between the seven policies, we conduct various experiments and compare their performance in both pairwise and group comparisons. We report both individual and aggregated metrics. As many workflows have deadline requirements on the tasks, we study the effect of autoscaling on workflow deadlines. Additionally, we look into the effect of autoscaling on the accounted and hourly-based charged costs, and we evaluate the performance variability caused by the autoscaler selection for each group of workflow sizes. Our results highlight the trade-offs between the suggested policies, how they can impact meeting the deadlines, and how they perform in different operating conditions, thus enabling a better understanding of the current state-of-the-art.
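
To make the autoscaling loop described in the abstract concrete, below is a minimal sketch of a reactive, threshold-based autoscaler in Python. It is purely illustrative: the Cluster class, the utilization thresholds, and the scale_out/scale_in helpers are hypothetical, and the rule shown is not one of the seven policies evaluated in the article.

# Minimal illustrative sketch of a reactive, threshold-based autoscaler.
# All names and thresholds here are hypothetical; this is NOT one of the
# policies evaluated in the article.
from dataclasses import dataclass

@dataclass
class Cluster:
    """Toy stand-in for a pool of rented VMs."""
    vms: int
    min_vms: int = 1
    max_vms: int = 20

    def scale_out(self, n: int) -> None:
        self.vms = min(self.vms + n, self.max_vms)

    def scale_in(self, n: int) -> None:
        self.vms = max(self.vms - n, self.min_vms)

def autoscale_step(cluster: Cluster, pending_tasks: int,
                   tasks_per_vm: int = 4,
                   high: float = 0.8, low: float = 0.3) -> None:
    """One control-loop iteration: compare observed load to capacity
    and add or remove a VM when utilization crosses a threshold."""
    capacity = cluster.vms * tasks_per_vm
    utilization = pending_tasks / capacity if capacity else 1.0
    if utilization > high:
        cluster.scale_out(1)   # queue is building up: rent one more VM
    elif utilization < low:
        cluster.scale_in(1)    # mostly idle: release one VM to cut cost

if __name__ == "__main__":
    cluster = Cluster(vms=2)
    for queue in [3, 10, 25, 25, 12, 4, 1]:  # simulated queue lengths per interval
        autoscale_step(cluster, queue)
        print(f"pending={queue:3d} -> vms={cluster.vms}")

A realistic policy would also have to account for VM startup latency, task deadlines, and the hourly billing granularity mentioned in the abstract, for example by keeping an idle VM until the end of its already charged hour rather than releasing it immediately.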

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2018. Vol. 3, no. 2, article id UNSP 8
Keywords [en]
Autoscaling, elasticity, scientific workflows, benchmarking, metrics
National subject category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-138598
DOI: 10.1145/3164537
ISI: 000430350200004
OAI: oai:DiVA.org:umu-138598
DiVA, id: diva2:2234
Research funder
Vetenskapsrådet; eSSENCE - An eScience Collaboration; Deutsche Forschungsgemeinschaft (DFG)
Available from: 2018-06-14 Created: 2018-06-14 Last updated: 2018-06-14 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Search further in DiVA

By the author/editor
Ali-Eldin, Ahmed
By the organisation
Department of Computing Science
Computer Sciences
