According to Gartner, worldwide virtualisation software revenue is expected to increase by 43 percent from $1.9 billion in 2008 to $2.7 billion in 2009. Global virtualisation penetration is on pace to reach 20 percent in 2009 from 12 percent in 2008. Its adoption within the IT organisation is driven by the need to reduce the total cost of ownership (TCO), enhance the agility and speed of deployment of IT solutions and minimise the carbon footprint.
These predictions from Gartner highlight the magnitude of this market; however, with this kind of expansion, the challenges associated with virtual environments are also on the rise. IT heads are thus left scouting for the right management tools and skill sets to handle virtual environments.
Factors leading to proliferation of virtualised environments
Ashok Swaminathan, senior consultant, CA, says, "What we should remember from a CIO perspective is that virtualisation is implemented for a business objective. If the CIO wants to know what is there in the virtualised environment, then he/she needs management solutions. A CIO has to know how many servers have been virtualised in order to go in for dynamic virtualisation based on time or load factor. For instance, it’s like saying that today there are ten systems and the enterprise wants to bring in the power of one more system just to handle high weekend or month-end loads. In such a situation, a CIO would want to use virtualisation to power up a particular system so that the load can be distributed. In environments where virtualisation is a preferred option, it is advisable to have control over what has been done, and this control can come only with the knowledge of what is being done and how it is being done. That is how management comes into the picture, and that is what managing virtual environments is all about."
While load distribution is one side of the coin, immediate availability and feasibility of virtualisation platforms is the other side; both these factors have fostered the growth of virtual environments.
According to Professor Harrick Vin, VP – R&D and chief scientist – Systems Research Lab, TCS, over the last year hypervisors, or virtualisation platforms, have become almost free and, as a result, commoditised. The name of the game has therefore shifted to using this additional capability to manage data centres better. Several variants of virtualisation technology are available, but the difficulty lies in using them effectively and deploying them successfully in virtual environments. The future focus is thus going to be on the management tools that go with virtualisation: the trend is to give more thought to tools that allow people to use virtualisation technology effectively, rather than dwelling on the technology itself.
As many would agree, virtualisation is definitely a tempting proposition especially during current times when budgets are under tight control and there is an increased pressure on CIOs to cut costs and optimise the current infrastructure installed. Though it all looks good from the outside and sounds like a great concept, it also comes with its share of challenges and effective management seems to be the only way out.
"People don’t manage the virtual environment, which is the first challenge. What they often do is randomly bring up virtual machines, ending up with multiple uncontrolled copies, a situation called VM sprawl. The initial idea behind a virtual environment is to consolidate servers and so reduce space and power consumption; however, when you have multiple VMs running and no control or visibility into where they are running, you lose track of whether so many actually need to run, and you may overlook the fact that the purpose of virtualisation may already be met with the existing VMs. Managing these means having control, and it means policy-driven virtualisation. It also means that nobody is authorised to create a new virtual environment outside of actual policies," explains Swaminathan.
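The kind of control Swaminathan describes starts with an inventory audit: comparing what is actually running against what was approved and what is still in use. The sketch below illustrates the idea in Python; the record fields and the 30-day idle window are illustrative assumptions, not taken from any specific product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical inventory record; field names are assumptions for this sketch.
@dataclass
class VMRecord:
    name: str
    owner: str
    approved: bool          # was this VM created through an approval process?
    last_active: datetime   # last observed activity on the VM

def audit_sprawl(inventory, idle_after=timedelta(days=30), now=None):
    """Flag VMs that were never approved, or that have sat idle past the
    policy window and may no longer be needed for consolidation goals."""
    now = now or datetime.now()
    unapproved = [vm.name for vm in inventory if not vm.approved]
    idle = [vm.name for vm in inventory
            if vm.approved and now - vm.last_active > idle_after]
    return {"unapproved": unapproved, "idle": idle}
```

Run periodically against whatever inventory source the organisation has, such a report gives visibility into which VMs exist without authorisation and which approved VMs could be retired, which is exactly the visibility whose absence produces sprawl.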
Security is a mounting concern with CIOs when it comes to virtualisation. Keeping track of all the activities in the virtual environment and handling communication among machines can be a daunting task.
Professor Vin feels that one problem that does occur when many virtualisation platforms are used is that a virtual machine hosted on a physical machine can talk to the latter without sending any information over the wire. Traditional intrusion detection or prevention systems that sit on the network and watch all the traffic therefore cannot capture exchanges between a virtual machine and its physical host, which means traditional solutions won’t quite work. Security is thus another major area where new products will start coming in, to ensure that intrusion prevention and detection and network security work effectively even in the presence of virtual machines.
Professor Vin adds further, "With virtualisation there is a lot of dynamism that is brought into the environment, which increases complexity, so you have to make sure that you have the right expertise available to manage a virtualised environment. So, in that sense there is no free lunch. You have to make sure that you are able to manage the complexity well, for if you can’t then the cost will actually go up, not in terms of hardware but in terms of labour. So the operational expense could go up as a result of increased complexity and you could end up spending more time and effort to manage your environment."
Guidelines to better manage virtualised environments
Now that virtualisation is here to stay and is a respite for many, Biztech2 offers a few expert tips below to make the most of virtualised environments.
Policy-driven virtualisation brings with it the concept of change management. Citing an example, Swaminathan says, "If there is a business request to distribute the load and create a new virtual environment, it has to go through a proper change request and policies; there should be an approval process, and then there should be a software component for powering the virtual environment manually. Once the job is done, there has to be a process via which the virtual environment is shut down. During this entire cycle, when new systems are being powered up, it is also ensured that applications can run effectively on those systems." Questions that arise here pertain to compliance and licensing, which would have to be taken into consideration.
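The lifecycle Swaminathan outlines, request, approval, power-on, and eventual shutdown, can be modelled as a small state machine so that a VM can never be powered on without an approval step. This is a minimal sketch; the state names and transitions are assumptions for illustration, not a specific product's workflow.

```python
# Allowed transitions in the change-request lifecycle (assumed states).
ALLOWED = {
    "requested":   {"approved", "rejected"},
    "approved":    {"powered_on"},
    "powered_on":  {"powered_off"},
    "powered_off": set(),
    "rejected":    set(),
}

class ChangeRequest:
    """Tracks one request to power up a virtual environment for extra load."""

    def __init__(self, vm_name):
        self.vm_name = vm_name
        self.state = "requested"
        self.history = ["requested"]  # audit trail for compliance review

    def transition(self, new_state):
        # Enforce the policy: e.g. no power-on without prior approval.
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)
```

For example, a month-end load request would pass through `transition("approved")`, `transition("powered_on")` and, once the job is done, `transition("powered_off")`; attempting to power on a merely requested VM raises an error, and the `history` list provides the audit trail that compliance questions would demand.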
In Vin’s words, everything that works in the traditional environment (such as asset management, change management, incident management, and capacity and performance management) has to be looked at from a fresh perspective, as all these functions work well only in an environment that is relatively static. "When you move to a more dynamic environment, where the virtual machines can move around, those traditional solutions don’t work. Anyone who is trying to use virtualisation has to look at the entire roadmap: you cannot just look at server virtualisation and the cost you are going to save; you have to also look at the impact of adding this level of dynamism to your environment, and then invest in tools and technologies, then expertise, followed by process improvement, so that these technologies reap the right benefits for the organisation," he concludes.
Updated Date: Apr 13, 2011 15:12:08 IST