With this post, dedicated to the series on newer IT infrastructure models and options available to CIOs (previous posts available here), we have reached the milestone of Virtual Desktop Infrastructure (VDI). It is a key component of the new computing paradigm and probably the most challenging option facing IT management today.
Virtualization came as a consequence of the third wave of IT acquisitions. First came the golden age of IT Expansion (a lot of new dedicated hardware, software, servers, platforms and business applications to be implemented for the modern enterprise); then the Constriction period brought by the crisis (which left many organizations without upgrades, software subscriptions and services); and now we have entered the wave of Optimized Development, where everything acquired must add considerable value to the business or be discarded.
In this line, server virtualization has been a tremendous success: quantifiable consolidation rates, proven optimization of space, energy consumption and management costs, and a solid operating model. Security has also fit in naturally, as a logical layer deeply integrated into the foundation of the entire edifice.
Desktop virtualization was expected to follow the same path, and all the big market research firms welcomed the trend (for example, Gartner predicted that hosted desktops would reach 40% of the worldwide professional PC market by the end of 2013 - a 2009 prediction, reconsidered here - while Research and Markets forecast that the global VDI market would grow at a CAGR of 65.7% over the period 2012-2016). However, things started to cool, and the expected “year of VDI” has yet to arrive…
Perhaps it’s not an issue of cost, convenience, under-delivery or anything of the sort; what might be responsible is the wide and sometimes confusing range of options that appear to be at hand. When dealing with desktop virtualization, apart from the classic virtual machines we all know from the beginnings of virtualization, we are confronted with notions like VDI, Remote Desktop Services/Terminal Services, application virtualization, Desktop as a Service (DaaS) and server-based computing.
All of them are valid options, not subtle variations of the same technology. No wonder many buyers are confused and don’t know where to start and what’s most suitable for them. So let’s explore the options.
We will start with the already classic Virtual Desktop Hosted on Servers: a virtual machine hosted on a server in a data center (company-owned or outsourced) that is accessed through a remoting protocol such as Microsoft’s RDP, the protocol behind Remote Desktop Services/Terminal Services. The benefits are that:
- the processing is done at server level
- the management of everything related to the virtual desktop is centralized
- the data can be consolidated and secured in a unified way
Users can benefit from a dedicated virtual machine that is either persistent (changes and personalization are saved, and the user retrieves his or her work at the next session) or non-persistent (the machine reverts to a standard image after disconnection).
This model simplifies security and disaster recovery: security is concentrated at the central point, while some attention must still be dedicated to authentication and connection control.
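The persistent vs. non-persistent distinction can be sketched in a few lines of Python; the golden image and its fields below are hypothetical, purely to illustrate the revert-on-disconnect behavior:

```python
from dataclasses import dataclass, field

# Hypothetical "golden" (standard) image that non-persistent desktops revert to.
GOLDEN_IMAGE = {"wallpaper": "corporate", "apps": ["office", "browser"]}

@dataclass
class VirtualDesktop:
    user: str
    persistent: bool
    state: dict = field(default_factory=lambda: dict(GOLDEN_IMAGE))

    def customize(self, key, value):
        # User personalization made during a session.
        self.state[key] = value

    def disconnect(self):
        # Persistent desktops keep user changes; non-persistent
        # ones are reverted to the standard image.
        if not self.persistent:
            self.state = dict(GOLDEN_IMAGE)

alice = VirtualDesktop("alice", persistent=True)
bob = VirtualDesktop("bob", persistent=False)
for vm in (alice, bob):
    vm.customize("wallpaper", "custom")
    vm.disconnect()

print(alice.state["wallpaper"])  # custom
print(bob.state["wallpaper"])    # corporate
```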
The mirror alternative is Virtual Desktops used on Physical Desktops. The user starts a virtual machine (stored in a data center) using a local hypervisor. In this case all the processing is local, but data is synchronized and stored in the data center. The user can even save a local image to, for example, work from home, and sync data at the next session; this image preserves his or her preferences, personalization and applications.
The benefit is that the processing power is local while the data is synced and stored centrally, making it easier to secure. From the security perspective, though, this model is more complicated because of the local processing and the possibility of compromising data in shared collaborative environments.
Another model is the Shared Virtual Desktop, used to give large numbers of users access to server-based computing resources. From what we have seen, it comes in two options, according to its usage:
For application delivery – a model that has been used for a while; it allows access only to one specific application, keeping access to the full desktop restricted. It can benefit call centers, technical support teams, market research companies, financial institutions etc. The security level provided is impressive; the remaining concern is securing the access and communication channel.
For desktop delivery – essentially the sharing of server computing resources to deliver the experience of a full desktop over a remote services delivery protocol. The benefits reside in the system control and data consolidation of a centralized processing model. The security level is similar to the previous option.
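To make the sharing of server resources concrete, here is a minimal sketch (the function names, host names and the capacity figure are assumptions, not a real broker API) of placing user sessions on shared session hosts, least-loaded first, as a session broker in server-based computing might:

```python
def assign_sessions(users, hosts, capacity=3):
    """Place each user session on the least-loaded host.

    Raises when every host is at capacity, mirroring a session
    broker refusing new connections.
    """
    placement = {h: [] for h in hosts}
    for user in users:
        # Pick the host currently serving the fewest sessions.
        host = min(placement, key=lambda h: len(placement[h]))
        if len(placement[host]) >= capacity:
            raise RuntimeError(f"no capacity left for {user}")
        placement[host].append(user)
    return placement

print(assign_sessions(["u1", "u2", "u3", "u4"], ["hostA", "hostB"]))
# {'hostA': ['u1', 'u3'], 'hostB': ['u2', 'u4']}
```

Real brokers weigh CPU, memory and affinity as well, but the principle is the same: many users, few machines, centrally controlled placement.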
Probably the trendiest option is Desktop as a Service (DaaS): a virtualization delivery model that allows a company to rent virtual desktops from a service provider, sized to its specific needs and the provider’s capabilities, for a monthly fee.
It is the most flexible model and probably brings the most appealing benefits: a “pay-as-you-go” model, all expenses falling under OpEx, easy to scale, to test and eventually to switch away from. The externalization of security to the provider’s team of specialists and the near-100% availability also make this model an ideal option.
However, Microsoft licensing for DaaS can be a real challenge. Security is high but, being externalized, the risks are transferred to the provider.
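The OpEx appeal is easy to illustrate with back-of-the-envelope arithmetic; the per-desktop fee, CapEx and amortization figures below are invented for illustration, not market quotes:

```python
# Hypothetical figures: DaaS cost scales linearly per seat (pure OpEx),
# while an on-premises VDI is dominated by fixed, amortized CapEx.
def daas_monthly_cost(users, fee_per_desktop=35.0):
    return users * fee_per_desktop

def onprem_monthly_cost(users, capex=60000.0, amortization_months=36,
                        opex_per_month=800.0):
    # Note: largely independent of seat count, which is exactly why
    # small or fluctuating deployments favor the rented model.
    return capex / amortization_months + opex_per_month

print(daas_monthly_cost(50))               # 1750.0
print(round(onprem_monthly_cost(50), 2))   # 2466.67
```

With these assumed numbers, 50 seats favor DaaS; the balance tips the other way as the seat count grows, which is the comparison a planner should actually run.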
Finally, we have Application Virtualization, which is essentially the ability to deliver separate containers for specific applications. It enables several interesting uses, such as sharing development platforms, streaming, or layered access to specific centralized applications (for example, graphic design suites). The benefits come from containerization and from control over users and licensing. The security challenges here are few, but the dependency on a good connection increases.
We have presented the options we have encountered, with a brief outline of their benefits and security challenges; they all lead to a set of topics you have to consider when planning a VDI:
You need to seriously assess not only the needs and business requirements, but also user behaviors and preferences - especially for management team members, who work with crucially important data on a wide variety of devices and are less tolerant of latency and interruptions.
A good approach is to group users by departments, application usage/access rights, behaviors and even work schedules.
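The grouping step can be as simple as bucketing a user inventory by one attribute at a time; the field names and sample users below are hypothetical:

```python
from collections import defaultdict

# Hypothetical user inventory; fields and values are illustrative only.
users = [
    {"name": "ana", "dept": "finance",   "apps": ("erp",),       "schedule": "9-5"},
    {"name": "dan", "dept": "support",   "apps": ("crm",),       "schedule": "shift"},
    {"name": "mia", "dept": "finance",   "apps": ("erp", "bi"),  "schedule": "9-5"},
    {"name": "leo", "dept": "marketing", "apps": ("crm",),       "schedule": "9-5"},
]

def group_users(users, key):
    """Bucket users by a single attribute (dept, apps, schedule, ...)."""
    groups = defaultdict(list)
    for u in users:
        groups[u[key]].append(u["name"])
    return dict(groups)

by_dept = group_users(users, "dept")
print(by_dept)
# {'finance': ['ana', 'mia'], 'support': ['dan'], 'marketing': ['leo']}
```

Each resulting bucket can then be matched to a delivery model: shift workers to non-persistent shared desktops, power users to persistent dedicated machines, and so on.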
Test before making large scale deployments. Try not to ignore the fact that user experience is crucial for the success of your project.
Expect criticism and frustration; they will come anyway.
When considering your best option from an organizational standpoint, there are a number of security implications to weigh; investigate them early in your strategic planning.