Virtual Economics


The current economy places substantial burdens on every business. Meeting these challenges requires stretching every resource to its fullest potential and allowing nothing to go to waste. When it comes to the technology that drives business, virtualization provides the first and best answer to maximizing investment returns.

Applications are the face of technology, and in the interconnected world of business, most of them rely on a central brain located somewhere else. Whether it’s an e-mail program, a system for storing and retrieving customer data, or a financial assets tracker, that application needs a server system. Even though the server component may communicate perfectly with its application, it’s unlikely to play nicely with the server components that support other applications. Purchasing and maintaining server-class hardware to house individual server software components can quickly drain budgets. Virtualizing those loads allows application servers to be consolidated onto substantially less hardware while retaining the necessary isolation.

Advances in hardware have outpaced the demands of software to the point that most business applications rarely tax even modest equipment; many hardware components spend most of their time idling. Through the consolidation that virtualization provides, not only is the quantity of equipment reduced, but the hardware that remains is more fully utilized.

Of course, businesses don’t acquire all their applications at the same time. Each major software deployment is typically accompanied by server-class hardware deployment. When hardware isn’t deployed on a matched cycle, datacenters become a mix of vendors and technology generations, all of which must be managed and tracked. Virtualization decouples the hardware cycle from the software cycle, allowing a deployment schedule – and budget – that is more predictable and controllable.

By reducing costs and breaking software’s one-to-one dependence on hardware, virtualization has another positive side effect: server applications can be protected by redundancy easily and cost-effectively. Hardware failures result in the loss of seconds instead of hours, and budget constraints no longer force IT managers to choose which applications receive rapid-recovery protection.

There are many virtualization solutions (called “hypervisors”) available today, and the market continues to expand with new offerings. Unfortunately, most of them drastically reduce the cost-saving benefits of virtualization with high price tags for software licensing, consultants, and training. Microsoft’s Hyper-V R2 stands apart by providing all of its features without licensing cost and by leveraging the same Windows Server technology that most IT departments already know well. Hyper-V R2 offers both the expected features of a modern hypervisor and several unique technologies such as:

  • High availability: two or more physical servers can be joined together in a “cluster”. If administrators suspect an imminent hardware failure, they can move virtual machines to functioning hosts without any service interruption; if a failure occurs without warning, the cluster automatically restarts the affected virtual machines on surviving hosts within moments.
  • Tight integration with Windows Active Directory infrastructure, including Group Policy
  • Redundant, load-balanced pathways to network storage devices, including technologies such as multi-path I/O at no additional cost
  • Compatibility with any server hardware for which the manufacturer has provided a Windows Server 2008 R2 driver. This allows Microsoft, as a software company, to focus on delivering software while leaving hardware interface development to the manufacturers who know it best.

With all of its benefits, virtualization does not eliminate the need to back up critical data. Adding redundancy to a virtual system addresses hardware component failure, one of the leading causes of data loss, but that is not the only threat to vital information. Data protection strategies must also plan for human error or malice, natural disasters, computer viruses, and many other risks.

A virtual environment presents unique and valuable opportunities for backup. In a non-virtualized environment, system administrators face two unpleasant choices. They can employ complicated “bare metal restore” operations that rely on particular software vendors’ technologies, careful documentation, and the ability to access both in the event of a catastrophe. Alternatively, they can use simpler software and procedures to back up only the protected data; in the event of a failure, administrators must first spend time building a complete replacement system, then more time restoring the saved data “over the top” of it. Both approaches require a substantial time investment and are highly error-prone.

In contrast, virtualization-aware backup software can keep copies of entire virtual machines and track the changes made to them. These copies can be brought online in the time it takes to place them on virtualization-ready hardware. They can even be activated in an isolated mode that lets them run side by side with the original virtual machines without causing conflicts, which opens up possibilities well beyond simple backup, such as easy “what if” testing.
