Virtualization has become one of the most used terms in the IT industry over the last few years. Yet many IT professionals, administrators, and managers still prefer to avoid the term and keep it out of their daily lives.
According to a survey of IT departments in different organizations by Zenoss (http://www.zenoss.com/), a company that provides virtualization and cloud services solutions, only around 30% of deployments exist as virtual.
Why don't people trust virtual environments? Most of the misconceptions about adding a new layer to your infrastructure through virtualization technologies are:
It represents a new layer of complexity, an increase in risks, and variables that could lead to unmanageable platforms
Increased hardware and licensing costs
How many times have you heard the expression "My company won't spend any more money on server licenses or new hardware"? I've heard that a lot.
Of course, the managers of a company are not obligated to understand the technical benefits and licensing options of servers, but when those concepts come up, it is our job as IT administrators, managers, or even just IT geeks to start discussing the benefits of virtualization.
In this chapter, we'll see that working with virtual environments requires no more than a simple strategy or implementation plan, with which you can start reviewing whether it fits your needs. With virtual environments we rapidly start gaining the agility, scalability, cost savings, and security that almost any business today requires.
Infrastructure as a Service (IaaS): Supplying computer infrastructure as a service. Instead of companies thinking about buying new hardware and the maintenance costs that implies, the infrastructure is provided (typically as virtual machines) as they need it.
If you need to recover your platform from a disaster, have you considered the real cost of having your infrastructure down for several hours while you restore the latest backup?
How much time and resources do you spend deploying/providing operating systems to users? And then how much do you spend later, troubleshooting application installations?
If those numbers don't seem right to you (and I think they never do), maybe it is time to start thinking about SaaS, PaaS, and/or IaaS.
Fortunately, today's demand for virtualization is incredibly high, and the possibilities and offerings are growing even faster. We can virtualize servers, appliances, desktops, and applications, and achieve presentation and profile virtualization; you name it, and there's probably already a bunch of products and technologies you can use to virtualize it.
Application virtualization is still an emerging platform, but it is growing rapidly in the IT world, with more and more dynamic ways of isolating and scaling application deployments being implemented. Microsoft's App-V represents one of the strongest technologies we can rely on.
The reason for any of the misconceptions mentioned above can be reduced to a simple statement: fear of the unknown. Most decision makers in organizations don't see IT as an investment; they see it as an unavoidable cost, so a simple thought surrounds it: "If it's working now, why should we spend more money? Why change it and take more risks?"
Agile IT, Dynamic IT, Green IT. Have you ever heard of these terms? If you have, and wondered what exactly they mean and how to achieve them, then we are on the right track. Most of the benefits of virtualization can be grouped into four different areas.
Handling server or desktop deployments is always painful, requiring hours of deployment, tuning, and troubleshooting; all of these aspects are inherent in any operating system lifecycle. Using virtual machines as baselines can reduce OS deployment from several hours to a few minutes.
The desktop virtualization concept provides the end user with the same environment as a local desktop computer while working with remote computing resources. Adopting this strategy enhances the provisioning of desktop environments: more resources can be added on demand, and deployment no longer depends on specific hardware.
Building ready-to-go virtual machine templates, and self-service portals to provision virtual machines for our power users whenever they need a virtual environment to test an application: these are some of the other features a virtualization platform can include.
Regarding cost savings, there are basically two significant points to mention:
Lower power consumption: Large datacenters also mean large electricity consumption; removing the physical layer from your servers translates into a noticeably smaller electricity bill every year. This is no small matter; most capacity and cost planning for a virtualization implementation also includes the "power consumption" variable. It won't be long before "Green Datacenters" and "Green IT" are a requirement for every mid-size and large business.
Hardware cost savings: Before you start worrying about the expensive servers you will probably need to host your entire infrastructure, let me ask you this: did you know that average hardware resource usage is around 5% to 7%? That means we are currently wasting around 90% of the money invested in that hardware. Virtualization will optimize and protect your investment; we can guarantee that the consolidation of your servers will be not only effective but also efficient.
Regarding that second observation, there's an interesting point about today's technology. Acquiring a server with two quad-core processors and 32 GB of memory is not such a crazy idea; the cost of that kind of hardware is now affordable for most mid-size and even some small companies. And if we are not thinking about virtualization, what can we do with such a server? Promote a domain controller or use it as a file server? Yes, I know, it doesn't sound right to me either.
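The wasted-capacity point above is easy to quantify with rough numbers. Here is a minimal back-of-the-envelope sketch; the 6% average utilization, the 60% target host load, and the per-server cost are illustrative assumptions, not measured data:

```python
# Back-of-the-envelope consolidation estimate. All figures (utilization,
# target host load, server cost) are illustrative assumptions.

def consolidation_ratio(avg_util_pct, target_util_pct=60):
    """How many lightly loaded servers can share one virtualization host
    while keeping that host at a comfortable target utilization."""
    return target_util_pct // avg_util_pct

def hardware_savings(num_servers, cost_per_server, avg_util_pct):
    """Cost of the physical servers we no longer need to buy."""
    ratio = consolidation_ratio(avg_util_pct)
    hosts_needed = -(-num_servers // ratio)  # ceiling division
    return (num_servers - hosts_needed) * cost_per_server

# Ten servers idling at ~6% utilization fit on a single host kept at ~60%.
print(consolidation_ratio(6))          # -> 10
print(hardware_savings(10, 5000, 6))   # -> 45000
```

At a hypothetical $5,000 per server, nine of the ten boxes become unnecessary; the numbers change with your environment, but the shape of the saving does not.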
There's a common scenario in several organizations where some users depend only on, for example, the Office suite for their work, but the cost of applying a different hardware baseline that fits their needs exactly is extremely high. That is why efficiency is also an important variable on the desktop: using desktop virtualization you can be certain that you are not over- or under-resourcing any end-user workstation, and you can easily provide a user with all the necessary resources for as long as they need them.
Taking a snapshot of an operating system, a new contingency layer for every machine, is a concept that did not exist before virtualization. Whenever you introduce a change in your platform (like a new service pack release), there's always a risk that things won't be just as fine as they were before. Having a quick, immediate, and safe restore point for a server/desktop can represent a cost-saving solution.
Virtual machines and snapshots will give you the features necessary to manage and easily maintain your labs for testing updates or environment changes, as well as the ability to add/remove memory, CPUs, hard drives, and other devices on a machine in just a few seconds.
Virtual environments will let you redesign your disaster recovery plan (if you ever had one) and minimize any disruption to the services you provide. The possibilities around virtual machine hot backups and straightforward recoveries will give you the chance to arrange and define different service level agreements (SLAs) with your customers and company.
Have you ever had to improvise a recovery because of a hardware failure in one of your servers? Then you probably found that you didn't have the right replacement hardware and had to quickly buy new parts, or make do with whatever hardware was available at that moment. The virtualization model offers you the possibility of removing the hardware dependencies of your roles, services, and applications; a hardware failure becomes only a minor issue in the continuity of your business, handled simply by moving the virtual machines to different physical servers without major disruption.
Inserting a virtualized environment into our application deployments will reduce the time invested in maintaining and troubleshooting operating system and application incompatibilities. Allowing applications to run in a virtualized, isolated environment every time they are deployed removes possible conflicts with other applications.
It is also common for organizations to face incompatibility issues with their business applications whenever there's a change: a new operating system, new hardware, or even problems in the development of the application that start generating issues in particular environments. You can say goodbye to those problems by facilitating real-time and secure deployments of applications that are decoupled from tons of requirements.
Application deployment, as with operating systems, always represents a significant load on every IT area: repeatedly deploying, maintaining, and troubleshooting the same applications (and the same problems) over and over again. The Application Virtualization model introduces a new way to understand application deployments.
Just as virtual machines abstract the hardware layer from physical servers, application virtualization abstracts the application and its dependencies from the operating system, effectively isolating the application from the OS and from other applications.
Application Virtualization, in general terms, represents a set of components and tools that remove the complexity of deploying and maintaining applications for desktop users, while leaving only a small footprint on the operating system.
Getting more specific, Application Virtualization is a process for packaging (or virtualizing) an application and the environment in which the application works, and distributing this package to end users. The use of this package (which can contain more than one application) is completely decoupled from the common requirements (like the installation and uninstallation processes) attached to applications.
The Technical Overview of Application Virtualization offered by Microsoft represents a fine graphic explanation about how normal applications interact with the operating system and their components; and how virtualized applications do the same. Take a look at http://www.microsoft.com/systemcenter/appv/techoverview.mspx.
In standard OS environments, applications install their settings directly onto the host operating system, altering the system to fit that application's needs. Other applications' settings can be overwritten in the process, possibly causing them to malfunction or break.
Here's a common example: two applications co-exist in the same operating system, and if they share some registry values, the usability of one application (or even of the operating system) could be compromised.
With Application Virtualization, each application brings down its own set of configurations on demand, and executes in such a way that it sees only its own settings.
Each virtual application is able to read and write information in its own application profile, and can access operating system settings in the registry or DLLs, but cannot change them.
Each App-V-enabled application brings down its own set of configurations and can run side by side without its settings conflicting with other applications or the host operating system. Despite this separation, inter-application communication with other App-V applications and those installed locally is preserved, allowing cut and paste, OLE, and all other standard operations. Here's a simple example of how App-V applications can work interconnected; this feature is called Dynamic Suite Composition (DSC).
Here are some facts about Application Virtualization:
The applications are not installed on clients; they are published.
With Application Virtualization we can achieve the co-existence of incompatible applications like Microsoft Office 2007 and Microsoft Office 2010.
Applications are installed only once on a reference computer, where the package is captured and prepared.
You can capture a set of interconnected applications into a single package.
The capturing process is, in most cases, transparent; it identifies the environment that the application requires to work, such as files and registry keys.
Application Virtualization offers you the possibility of centralized management: a single point from which we handle virtualized applications and their distribution behavior in our environment.
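The "transparent" capturing mentioned above is, at its core, a before/after comparison: the reference machine's state is recorded, the installer runs, and the difference between the two states is what goes into the package. The toy Python sketch below illustrates only that diff step, with plain dictionaries standing in for the filesystem and registry (the paths and values are made up for the example):

```python
def capture_changes(before, after):
    """Diff two state snapshots (path -> content) taken around an install.

    Returns only the entries the installer added or modified, i.e. the
    environment that would go into the virtual application package.
    """
    return {path: content
            for path, content in after.items()
            if before.get(path) != content}

before = {
    r"C:\Windows\system32\kernel32.dll": "os-file",
    r"HKLM\Software\Vendor": "old-setting",
}
after = dict(before)
after[r"C:\Program Files\App\app.exe"] = "binary"  # added by the installer
after[r"HKLM\Software\Vendor"] = "new-setting"     # modified by the installer

package = capture_changes(before, after)
print(sorted(package))  # only the two entries the installer touched
```

Untouched operating system files stay out of the package, which is why the resulting bundle carries only what the application actually needs.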
Even though you can create a package of almost any software, not all applications can be virtualized. Some can be quite tricky to actually pack into one bundle; applications that require deep operating system integration can generate some known issues.
Whenever there's a discussion concerning this question, I usually present common scenarios that most companies are familiar with.
You are using a company's application that requires:
Special configurations every time it is deployed: customizing files or setting special values within the application's configuration environment.
It is also interconnected with other applications (for example, Java Runtime Environment, a local database engine, or some other particular requirement).
It demands several hours every week to support end-user deployments and troubleshoot configurations.
Application Virtualization offers us the possibility of guaranteeing that end users always have the same configuration deployed, no matter when or where, as you only need to configure it once and then wrap the entire set of applications into one package.
You have a dynamic base operating system image that changes constantly, and the scenario looks like this:
There are several types of base images according to user profiles (HR users usually have a different base operating system than the one provided for a developer).
Every change requires manual and local intervention on every client for installing or removing applications (some prefer accepting the risks, and provide end users with local administrator privileges to achieve some kind of auto-management).
Again, this translates into several hours of supporting, deploying, and maintaining those different types of images. It also carries significant risk as the organization grows, since the hours invested in these matters will increase exponentially.
Implementing Application Virtualization with a clean base image will help us minimize the impact every time there's a change. With centralized management of the applications, you can make all the necessary changes and implement them instantly, on demand. Also, by adding granularity to the types of images, special applications can be distributed only to a selected group of users, keeping a small footprint for every operating system.
Another aspect of Application Virtualization relates to a significant matter in many organizations: application licensing. Application Virtualization can also maintain a central point for software licenses, allowing you to keep track of the current licensing situation of all your applications.
As mentioned before, there are many virtualization solutions available to consider. The demand for Application Virtualization is increasing significantly; some of the most important offerings available in the market are:
Some of the differences among these technologies are:
ThinApp and App-V 4.6 are the only ones that support 64-bit OS deployments (earlier versions of App-V do not support this feature).
Microsoft App-V 4.6 is the only one that supports 64-bit applications.
App-V, SVS, and XenApp use a set of kernel mode drivers and supporting services to manage the virtualization process. ThinApp includes the entire virtual environment directly into the application package.
App-V, SVS, and XenApp include options for reporting the virtualized applications usage.
VMware ThinApp and InstallFree Bridge do not include a central point for application license management.
VMware ThinApp and App-V, in a 32-bit environment, are the only platforms that support 16-bit applications. 64-bit operating systems do not support 16-bit applications.
Some other options available in the application virtualization market are:
The IO model also presents a path to reaching the maximum level of maturity; by following these best practices we can guarantee operational efficiency in IT activities. Fortunately, Microsoft provides complete guidelines, tools, checklists, and other assets for understanding which maturity level we are at, what we should do to level up, and which technologies are involved.
You can find all of this at http://www.microsoft.com/infrastructure/. App-V's benefits give administrators the necessary level of control over application deployments, as well as securing users' profiles and providing them with the proper work environment. The Dynamic level of Microsoft IO also requires dynamic application access and recovery for desktop applications, both of which are also gained with App-V.
Some of the most important topics we can find in all of these resources are:
An online assessment tool to achieve Dynamic IT: Once we run this wizard-like tool, we receive a complete report on how to optimize our infrastructure in areas like Identity and Access, Desktop, Device and Server Management, Security and Networking, Data Protection, and IT Process. You can access the online tool at http://www.microsoft.com/infrastructure/about/assessment-start.aspx.
IT Compliance Management Series: Guidelines oriented to IT governance, risk, and compliance requirements. Download the series from the Microsoft Download Center at http://www.microsoft.com/downloads/en/default.aspx.
Windows Optimized Desktop Scenarios Solution Accelerator: Guideline to achieving a proper plan and designing applications and operating systems in your organization. This accelerator will be useful when we start thinking in App-V. More information is available at http://www.microsoft.com/infrastructure/resources/desktop-accelerators.aspx.
Infrastructure Planning and Design Guides for Virtualization: Complete references for designing a virtualization strategy; you will find specialist guides for App-V, Remote Desktop Services (formerly known as Terminal Services), System Center Virtual Machine Manager, Windows Server Virtualization, Desktop Virtualization, and so on. More information is available at http://technet.microsoft.com/en-us/solutionaccelerators/ee395429.aspx.
App-V is the most robust as well as the most flexible of the application virtualization options available, and we can implement it in any environment. It doesn't matter whether we want to use it for just a few mobile users or for the entire organization; App-V fits all.
Let's take a quick look at the implementation models available:
Standalone: No infrastructure is needed for this one; you just need a reference computer that packages the application (the sequencer machine) and the App-V client that receives the application. You can even use it for offline users.
Streaming Mode: This is the same model as standalone, except that you also use a streaming server to distribute the applications. You can set up streaming servers for low-bandwidth links, such as branch offices.
Full Infrastructure: The full model introduces a Management Server, where you can centrally administer the entire infrastructure (packages, permissions, licenses, and reporting); a Streaming Server (which can stream down applications using the default App-V protocol, HTTP/S, SMB, or in combination with SCCM 2007 R2); and the App-V client. Note that the Management Server can provide applications to users as well as stream them.
To understand a little bit more about application virtualization, let's make a note of the milestones that represent the lifecycle of an App-V package:
Application Sequencing: As the first step in the life of a virtualized application, sequencing represents capturing the environment (files, shortcuts, and registry keys) in which it works, and packaging it into one bundle.
Application Publishing and Deployment: After the creation of the App-V package, the administrator publishes it on the server, defining different sets of permissions and licenses if applicable, and it is deployed to all the App-V clients within the scope.
Application Update: The application update process refers to applying the necessary changes and revisions to the application. Clients consume this new version of the package the next time the application is launched.
Application Termination: This step consists only of removing or disabling the application on our server. As App-V clients never install the application, the termination process does not modify anything in their operating system, and the application smoothly disappears from the environment without a trace.
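The four milestones above form a simple linear lifecycle, which can be sketched as a small state machine. The state and action names below are this sketch's own shorthand, not an App-V API:

```python
# Illustrative App-V package lifecycle as a state machine.
# State/action names are this sketch's own, not an App-V API.
TRANSITIONS = {
    ("new", "sequence"): "sequenced",       # captured on the sequencer machine
    ("sequenced", "publish"): "published",  # made available to clients in scope
    ("published", "update"): "published",   # new revision; clients pick it up on next launch
    ("published", "terminate"): "retired",  # removed server-side; nothing to clean on clients
}

def advance(state, action):
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"cannot {action!r} a package in state {state!r}")

state = "new"
for action in ("sequence", "publish", "update", "terminate"):
    state = advance(state, action)
print(state)  # -> retired
```

Note how "update" loops back to the published state (clients simply receive the new version on next launch), while "terminate" needs no client-side step at all, since nothing was ever installed there.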
We've previously covered some basic differences that we can find between the application virtualization platforms. Now, if you haven't made up your mind just yet, it's time for some quick facts about Microsoft App-V:
Microsoft App-V 4.6 supports 64-bit clients as well as 64-bit applications.
In the process of capturing (sequencing) an application, we can immediately create MSI files to be deployed to offline clients and other particular environments.
Highly integrated with System Center Configuration Manager 2007 R2: if you already have SCCM 2007 R2 implemented, publishing applications by combining the platforms should not present difficulties.
Reporting: The use of virtualized applications can be easily monitored and reported with Microsoft App-V (only in the Full Infrastructure model). This can be an important feature in the application lifecycle mentioned earlier, giving you all the necessary information about usage and, if needed, about terminating applications that clients no longer use.
Variety of implementation models: If there are not enough resources for a complete implementation, the standalone mode lets us take advantage of application virtualization without requiring particular servers.
In this chapter we've covered:
The basics of virtualization
Common misconceptions about the technologies and their uses
The reasons why virtualization is no longer just an optional strategy, but represents the next focus any IT platform should have
And exactly what application virtualization is and the benefits it includes
I once heard: "What is virtualization but just removing physical characteristics to recreate them in an emulated environment? Why even bother?" It is exactly that, but that abstraction from hardware or software dependencies is what makes the difference; you just need to see it.
Even though most of the technology and IT fans working in a company do not decide on the investments the organization makes, it is important for us to understand that the agility we gain by administering a virtualized platform easily translates into reduced costs in daily IT activities, creating the opportunity for us to focus on other valuable activities.
There are several technologies and products we can use for application virtualization. Microsoft App-V offers the most suitable strategy (64-bit applications and clients supported); it is scalable (different models available that can be used simultaneously in our environment, plus centralized management of permissions and licenses) and dynamic (interconnecting different sets of applications, plus strong reporting features that help us monitor the application lifecycle).
In the next chapter, we'll take a deeper look at the App-V architecture and implementation models, roles, and components, along with the necessary guidelines to choose, plan, and design the right implementation model.