Expert advice for companies hesitant to implement virtualization

For enterprise IT, evaluating different virtualization options can be a challenge, especially for organizations that don’t have any experience with it. That’s no reason to shy away from the technology, though, as virtualization is a great way to manage resources and reduce costs. To get some expert insight on the matter, we caught up with Matthew Portnoy, who literally wrote the book on virtualization.

Published on Aug. 29, 2016, the second edition of Virtualization Essentials contains an in-depth explanation of virtualization, as well as chapters on hypervisors, VMs and availability. The book also covers a broad range of processes, such as managing CPUs, memory, storage and networking for VMs. Here, Matthew Portnoy answers questions about virtualization basics and gives advice to those looking to implement virtualization.

What are the advantages of virtualization? Are there disadvantages?
Matthew Portnoy: The advantages of virtualization are still the same as they were at the beginning — more efficient and reliable use of resources at lower costs. With virtualization moving into other areas of the data center, such as storage and networking, and the addition of virtual appliances — load balancers, VPNs [virtual private networks], firewalls — costs continue to be driven down and efficiencies are still improving.

If there’s a disadvantage to virtualization, I would say it’s also still the same as the initial challenges: a poorly planned deployment will perform poorly. That might be said about physical deployments as well, though virtual infrastructures often mask issues that would be more obvious in a physical deployment. One common example is not providing enough I/O for storage requirements; users now generally allocate enough memory and CPU resources, which was not always the case.
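One way to catch the under-provisioning Portnoy describes is to sanity-check aggregate VM demand against what the cluster can actually deliver before deployment. The sketch below is a minimal, hypothetical illustration of that check; the workload figures, capacity numbers and overcommit ratios are placeholders, not vendor guidance.

```python
# Minimal pre-deployment sanity check: compare aggregate VM demand
# (vCPU, memory, storage IOPS) with cluster capacity.
# All numbers are hypothetical placeholders for illustration.

vms = [
    # name, vCPUs, memory_gb, iops
    ("db-01",  8, 64, 5000),
    ("app-01", 4, 16,  800),
    ("app-02", 4, 16,  800),
    ("web-01", 2,  8,  300),
]

cluster = {"vcpus": 64, "memory_gb": 512, "iops": 6000}
overcommit = {"vcpus": 3.0, "memory_gb": 1.0, "iops": 1.0}  # CPU overcommit is common; storage I/O is not

demand = {
    "vcpus": sum(v[1] for v in vms),
    "memory_gb": sum(v[2] for v in vms),
    "iops": sum(v[3] for v in vms),
}

for resource, needed in demand.items():
    available = cluster[resource] * overcommit[resource]
    status = "OK" if needed <= available else "UNDER-PROVISIONED"
    print(f"{resource:10s} demand={needed:6.0f} capacity={available:6.0f} -> {status}")
```

With these made-up inputs, CPU and memory pass easily while storage IOPS comes up short — exactly the kind of gap that a virtual infrastructure can quietly mask until it shows up as poor application performance.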

What are the biggest misconceptions regarding virtualization?
Portnoy: Virtualization is not a panacea for bad practices. You have to pay as much attention to resources in the virtual world as you do in the physical. You can architect and deploy efficient, performant, highly available and secure virtual environments, but it doesn’t just happen. You have to understand the product you are working with, know what its limitations and benefits are and act accordingly.

The mindsets that used to limit which applications could be virtualized are mostly gone now thanks to a combination of maturing administrators, more capable hardware and experience. There are things you can take advantage of in the virtual environment that can’t be duplicated in the physical world — page sharing, for example, or new security models in virtual networking — that still make people hesitant or skeptical. But a well-deployed virtual environment is still usually less expensive per workload, more available, more manageable and more secure than its physical counterpart.
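Page sharing, mentioned above, is a hypervisor technique that stores identical memory pages from co-resident VMs only once. The toy calculation below merely illustrates the idea; the VM count, memory sizes and shared fraction are assumptions, not measurements of any real product.

```python
# Toy estimate of memory saved by page sharing when many similar VMs
# run the same guest OS. All figures are illustrative assumptions.

vm_count = 20
mem_per_vm_gb = 8
shared_fraction = 0.25  # assume ~25% of each VM's pages are identical OS/library pages

total_without_sharing = vm_count * mem_per_vm_gb
# Shared pages are stored once instead of vm_count times.
shared_gb_per_vm = mem_per_vm_gb * shared_fraction
total_with_sharing = total_without_sharing - shared_gb_per_vm * (vm_count - 1)

print(f"Without sharing: {total_without_sharing} GB")
print(f"With sharing:    {total_with_sharing:.0f} GB "
      f"({total_without_sharing - total_with_sharing:.0f} GB saved)")
```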

What types of organizations are more likely to implement virtualization?
Portnoy: At this time, there are no organizations that couldn’t implement virtualization if they choose to. In certain industries, there are use cases that might drive organizations to virtualize sooner — healthcare providers and virtual desktops come to mind. Today, even smaller companies probably have some virtualization in their IT departments whether they know it or not.

One early driver was that VMware could provide High Availability to workloads on a cluster without any additional hardware or software. Organizations could get better uptime merely by being on the platform, whereas in the physical environment, those unprotected workloads would be prone to both planned and unplanned downtime.
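Cluster HA doesn’t prevent a host failure; it automatically restarts the affected VMs on surviving hosts, so the gain is in how long each outage lasts. The sketch below compares yearly downtime for a hypothetical workload with and without that automated restart; every input is a placeholder assumption, not a vendor SLA figure.

```python
# Compare expected yearly downtime with and without cluster HA restart.
# All inputs are hypothetical assumptions for illustration.

failures_per_year = 2          # unplanned host failures hitting this workload
manual_recovery_hours = 4.0    # page an admin, find hardware, restore, restart
ha_restart_minutes = 5.0       # automated restart on a surviving cluster host

hours_per_year = 24 * 365
downtime_without_ha = failures_per_year * manual_recovery_hours
downtime_with_ha = failures_per_year * ha_restart_minutes / 60

for label, downtime in [("without HA", downtime_without_ha), ("with HA", downtime_with_ha)]:
    availability = 1 - downtime / hours_per_year
    print(f"{label:12s} downtime={downtime:5.2f} h/yr  availability={availability:.4%}")
```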

What steps should companies take when evaluating virtualization products?
Portnoy: Like with other products, the best questions to ask are: What problem am I trying to solve? What is the cost of trying to solve this? What will it cost me if I do nothing? What are the transformational effects and costs on the organization of deploying this particular platform?

I often heard that VMware [was] expensive, but it solved certain issues for companies and returned enough value in a short enough time to make it an easy choice for many people. If Hyper-V solves the issue at a lower price and the feature differences are not relevant, well, that’s a good answer. Nothing beats kicking the tires, so if you have the time, download the products and either compare them using your use cases or have a trusted partner assist your efforts.

How can IT best partner with the C-suite and other departments to evaluate virtualization products?
Portnoy: It comes back to the value IT is providing to the business. Initially, virtualization drove huge hardware costs out of the data center while increasing availability and decreasing deployment times. This allowed companies to provide a more stable platform for their applications and allowed them to roll out new applications much faster, decreasing time to market for certain corporate initiatives.

Virtualization provides some really interesting disaster recovery products, again, at lower deployment costs and lower operational costs than traditional models. In areas prone to natural disasters — hurricanes, tornadoes, snowstorms and so on — virtual platforms offered business continuance in the event of [a] disaster at a lower barrier to entry. New technologies like long-distance vMotion, [which offers] the capability to migrate a running VM across continental distances without interruption, offer other possibilities. Virtual networking can significantly improve environment security, which is an important topic for any public-facing company today.

What are the top challenges companies face when trying to implement virtualization?
Portnoy: The technology itself is fairly mature today, so much of the challenge is on the people side of the equation. Change means unknown, and unknown is uncomfortable, and uncomfortable and unknown are two words that application owners try to stay far away from. Educating executives and application owners is probably the single best way to smooth that transition. Once they understand the value that virtualization provides, they are usually proponents of the implementation.

For application owners, the compelling arguments are how virtualization can help them mitigate risk with higher availability, improve testing and quality assurance through VM cloning, and improve performance with dynamic server upgrades — all without incurring downtime.
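As an illustration of the cloning point, on a Linux/KVM host a throwaway test copy of a guest can be made by wrapping the virt-clone utility. This is a minimal sketch under that assumption (virt-clone installed, the source guest powered off); the VM names are hypothetical, and it is not a recipe for any particular vendor’s platform.

```python
# Minimal sketch: clone a KVM guest for a test/QA run with virt-clone.
# Assumes a Linux host with virt-clone installed and a powered-off source
# guest; the names below are hypothetical placeholders.
import subprocess

SOURCE_VM = "app-01"       # existing guest to copy (hypothetical name)
CLONE_NAME = "app-01-qa"   # name for the test/QA copy

subprocess.run(
    [
        "virt-clone",
        "--original", SOURCE_VM,   # guest to clone (must be shut down)
        "--name", CLONE_NAME,      # name of the new guest
        "--auto-clone",            # let virt-clone pick new disk paths automatically
    ],
    check=True,
)
print(f"Created clone '{CLONE_NAME}' from '{SOURCE_VM}'")
```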

Executives are looking to be more agile, more cost effective and to protect the interests of their company. Virtualization provides these benefits as well.

How do you recommend balancing costs with the desire to have the latest technology?
Portnoy: If everyone had unlimited resources … But, seriously, cost is the practicality that drives most projects. If the latest technology provides a solution to an acute problem that needs to be solved, then, usually, cost is no longer the gating factor. But that is not typical.

It still comes back to the value a platform can provide, and whether the problem it solves is worth the money being spent. In most organizations, project dollars are contested, and unless there is a strong case for deploying a particular product, those funds can quickly find another home.
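One simple way to frame "is it worth the money" is a payback calculation: estimated yearly benefit set against the upfront and running costs of the platform. The figures in the sketch below are made up purely to show the shape of the comparison.

```python
# Rough payback estimate for a virtualization project.
# All monetary figures are hypothetical placeholders.

upfront_cost = 150_000    # licences, hosts, migration effort
yearly_savings = 60_000   # retired hardware, power, faster deployments
yearly_run_cost = 10_000  # support and ongoing licensing

net_yearly_benefit = yearly_savings - yearly_run_cost
payback_years = upfront_cost / net_yearly_benefit

print(f"Net benefit per year: {net_yearly_benefit:,}")
print(f"Payback period: {payback_years:.1f} years")
```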

Are you interested in virtualization of your IT environment? Contact Quant ICT Group, www.quant-ict.nl, info@quant-ict.nl, tel: +31880882500

Source: TechTarget, Ryann Burnett