Jacob Ozubu | AllAfrica
IT virtualisation refers to the various techniques and approaches for creating a virtual version of a hardware platform, operating system, storage device or network resource.
It is the abstraction of physical network, server and storage resources to greatly increase the ability to utilise and scale compute power. Indeed, virtualisation has become the technology engine behind cloud computing.
Cloud computing is the use of computing resources, such as hardware and software, that are available in a remote location and accessible over a network. Users can buy these computing resources, including storage and computing power, as a utility, on demand. The name comes from the common use of a cloud-shaped symbol in system diagrams as an abstraction for the complex infrastructure behind it. Cloud computing entrusts remote services with a user's data, software and computation.
IT virtualisation has four notable effects, or attributes: the rise of high density, changes to power usage effectiveness (PUE), dynamic IT loads and lower redundancy requirements.
High Density Rise
While virtualisation may reduce overall power consumption in the room, virtualised servers tend to be installed and grouped in ways that create localised high-density areas, which can lead to "hot spots". This cooling challenge may come as a surprise given the dramatic decrease in power consumption made possible by high, realistically achievable physical server consolidation ratios of 10:1, 20:1 or even higher. As a physical host is loaded with more and more virtual machines, its CPU utilisation increases. Although the relationship is far from linear, the power draw of that physical host rises as utilisation rises. A typical non-virtualised server's CPU utilisation is around 5 to 10 per cent.
A virtualised server's, however, could be 50 per cent or higher. The difference in power draw between 5 per cent and 50 per cent CPU utilisation would be about 20 per cent, depending on the specific machine. Additionally, virtualised machines often require more processor and memory resources, which can further raise power consumption above what a non-virtualised machine would draw. Grouping or clustering these bulked-up, virtualised servers can result in significantly higher power densities that could then cause cooling problems. Not only are densities increasing, but virtualisation also allows workloads to be dynamically moved, started and stopped; the result can be physical loads that change both over time and in physical location.
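The effect described above can be sketched with a simple back-of-the-envelope model. The wattages, the linear power curve and the 10:1 consolidation ratio below are illustrative assumptions for this sketch, not measured figures; real server power curves are far from linear:

```python
# Illustrative sketch: room power before and after server consolidation.
# All wattages, utilisation levels and ratios are hypothetical assumptions.

def server_power_watts(cpu_util, idle_w=120.0, peak_w=300.0):
    """Rough linear model: draw grows from idle to peak with CPU utilisation."""
    return idle_w + (peak_w - idle_w) * cpu_util

# Before: 20 physical servers idling at about 5% CPU utilisation.
before_total = 20 * server_power_watts(0.05)

# After: a 10:1 consolidation ratio leaves 2 hosts at about 50% utilisation.
after_total = 2 * server_power_watts(0.50)

print(f"Before consolidation: {before_total:.0f} W across 20 servers")
print(f"After consolidation:  {after_total:.0f} W across 2 hosts")
print(f"Per-host draw rises from {server_power_watts(0.05):.0f} W to "
      f"{server_power_watts(0.50):.0f} W")
```

Even in this crude model, total room power falls sharply while the draw of each remaining host rises, and that heavier load is now concentrated in fewer racks: exactly the localised high-density, hot-spot pattern the section describes.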