Proceedings of the 31st Annual ACM Symposium on Applied Computing, 2016
Association for Computing Machinery
In this paper, we present a novel and realistic model of the CPU usage of virtual machines in a cloud. The model treats the CPU consumption of a virtual machine as generated by a Hidden Markov Model (HMM). It assumes that the hidden layer (a Markov chain) is inhomogeneous, depending on the time of day, and that the observations follow an autoregressive process. These deviations from standard HMMs are motivated by the properties of real CPU consumption data. The model replicates the properties of real CPU consumption data very closely and outperforms both AR(1) and AR(2) models in predicting future CPU consumption.
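To illustrate the kind of model the abstract describes, the following is a minimal simulation sketch of an inhomogeneous HMM with state-conditional AR(1) observations. All structure and parameter values here (two states, working-hours transition probabilities, the `mu`/`phi`/`sigma` values) are illustrative assumptions for exposition, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states = 2

def transition_matrix(hour):
    """Time-of-day-dependent transition matrix (assumed form):
    the 'busy' state is more likely during working hours."""
    busy = 0.8 if 9 <= hour < 18 else 0.2
    return np.array([[1 - busy, busy],
                     [1 - busy, busy]])

# State-conditional AR(1) observation process (assumed parameters):
#   y_t = mu[s] + phi[s] * (y_{t-1} - mu[s]) + noise
mu    = np.array([10.0, 70.0])  # mean CPU % per state (idle, busy)
phi   = np.array([0.5, 0.7])    # AR(1) coefficients per state
sigma = np.array([2.0, 5.0])    # noise std per state

def simulate(n_hours=48):
    """Draw a hidden state path and an AR(1) CPU-usage trace."""
    states = np.empty(n_hours, dtype=int)
    cpu = np.empty(n_hours)
    states[0], cpu[0] = 0, mu[0]
    for t in range(1, n_hours):
        P = transition_matrix(t % 24)          # inhomogeneous chain
        states[t] = rng.choice(n_states, p=P[states[t - 1]])
        s = states[t]
        cpu[t] = mu[s] + phi[s] * (cpu[t - 1] - mu[s]) + rng.normal(0, sigma[s])
    return states, np.clip(cpu, 0.0, 100.0)   # CPU usage bounded in [0, 100]

states, cpu = simulate()
```

The two departures from a standard HMM are visible directly in the loop: the transition matrix is recomputed from the hour of day rather than held fixed, and each observation depends on the previous observation through the AR(1) recursion rather than being drawn independently given the state.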