
When, where, and why do I need different endpoint security when going virtual?

By Shaun Donaldson on Jul 16, 2014 | 0 Comments

In the endpoint security world, antimalware (or antivirus, depending on your definition, but more on that in another post) vendors are offering different features and architectures to address performance in virtualized datacenters. A simple question that organizations have is, “When, where, and why do I need this stuff?” Of course, as a vendor, the tempting answer is, “Always, everywhere, and just because”. However, reality is always more nuanced than the average PowerPoint presentation.

First, the problem. Before virtualization, every endpoint (server or end-user) was an island that communicated with other islands. By that I mean the hardware of each endpoint was dedicated to a single operating system instance. Perhaps the hardware was stressed, perhaps it was mostly idle. Antimalware ran within each endpoint, enjoying dedicated hardware. When instances were stressed, antimalware added to the problem; when instances were idle, antimalware ran unnoticed.

Virtualization migrated these operating system islands to a single, larger landmass. The heftier underlying hardware is shared amongst a number of operating systems. When a stressed system needs more hardware resources, it can use resources that idling instances aren’t using. This is consolidation: a number of virtual machines running on a single, large host. Economists in the European Union are familiar with the concept.

Antimalware agents running within these virtualized endpoints operated exactly as they did before. When updates, upgrades, full-system scans, and other resource-intensive tasks ran, the antimalware assumed it could use as much of the hardware as it needed. In a shared-hardware environment, this means systems not running intensive tasks were impacted. Worse, many of those tasks were traditionally scheduled to run at the same time. In a virtualized world, this quickly exhausts the underlying hardware.
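The exhaustion problem above can be sketched with back-of-the-envelope arithmetic. All figures here (host core count, per-scan CPU cost) are illustrative assumptions, not measurements of any product:

```python
# Sketch of a "scan storm": N agents on one host, each scheduled to
# run a full scan in the same window. The numbers below are
# illustrative assumptions, not vendor measurements.

HOST_CPU_CORES = 16          # assumed host capacity
SCAN_CPU_PER_AGENT = 0.5     # assumed cores consumed by one full scan

def scan_storm_cpu_demand(num_vms: int) -> float:
    """Total CPU cores demanded if every agent scans at once."""
    return num_vms * SCAN_CPU_PER_AGENT

def host_oversubscribed(num_vms: int) -> bool:
    """True when simultaneous scans alone exceed the host's cores."""
    return scan_storm_cpu_demand(num_vms) > HOST_CPU_CORES

# Two VMs: 1 core of scan demand on a 16-core host - negligible.
print(host_oversubscribed(2))    # False

# 100 VMs: 50 cores of scan demand - far beyond the host.
print(host_oversubscribed(100))  # True
```

The point of the sketch is that the demand scales linearly with the number of agents, while the host's capacity is fixed.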

Today, endpoint security vendors have created mechanisms, features, or entirely new products to resolve this resource exhaustion problem. In other words, they have re-engineered products to behave well under, or fully embrace, virtualization.

But under what circumstances is such a solution necessary, as opposed to keeping the same antimalware used for non-virtualized endpoints?


The answer is qualitative. When user experience (the ultimate measure of performance) begins to deteriorate, re-examining your endpoint antimalware solution is wise. User experience will deteriorate, for the reasons described above, when the consolidation ratio (the number of virtual machines on a given host) reaches a tipping point.

To illustrate, consider two extremes: databases and end-user systems (desktops, referring to the Windows desktop, not a literal desktop computer). It’s unlikely that more than one or two resource-hungry databases will reside on the same host. If each runs a traditional antimalware agent, the duplication of antimalware effort is two-fold (two engines, two antimalware databases, and so on). That doubled effort is negligible compared to the resources consumed by the databases.

With VDI (Virtual Desktop Infrastructure), there can be hundreds of traditional antimalware instances running on a single piece of hardware, because the resource consumption of each desktop is, on average, fairly low. The consolidation ratio (virtual machines per host) is very high, so the duplication of traditional antimalware is also very high, and the antimalware ends up consuming a huge share of the host’s resources.
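The duplication in these two extremes can be put in rough numbers. A minimal sketch, assuming a hypothetical 300 MB resident footprint per agent (engine plus signature database); the real figure varies by product:

```python
# Sketch of how duplicated antimalware overhead scales with the
# consolidation ratio. The per-agent footprint is an illustrative
# assumption, not a measured value for any product.

AGENT_MEMORY_MB = 300  # assumed resident footprint of one agent
                       # (engine + signature database)

def duplicated_overhead_mb(consolidation_ratio: int) -> int:
    """Total host memory consumed by N identical agent copies."""
    return consolidation_ratio * AGENT_MEMORY_MB

# Low consolidation (two database VMs): ~600 MB - lost in the noise.
# High consolidation (150 VDI desktops): ~45,000 MB of duplicated
# engines and signatures on a single host.
for ratio in (2, 150):
    print(ratio, duplicated_overhead_mb(ratio))
```

The overhead itself is identical per agent; only the consolidation ratio changes, which is why the same product that is invisible on a database host becomes a dominant consumer on a VDI host.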

Following that illustration, traditional antimalware will operate without inducing performance problems in low-consolidation environments. In high-consolidation environments (VDI being the extreme), traditional antimalware products will, unavoidably, create performance problems.

Most organizations fall between those extremes. Typically, servers are virtualized before end-user systems. Rarely are all servers virtualized at once. Organizations tend to overspend on hardware for a virtualization pilot project, anticipating that more servers will be virtualized on the same hardware as the effort evolves. As the level of virtualization increases within a datacenter, so does the consolidation ratio. Eventually, a tipping point (or brick wall, depending on your perspective) is found at which traditional endpoint antimalware starts creating performance problems.

To conclude, the tipping point between traditional and virtualization-centric antimalware isn’t the same for every organization. It is always wise to anticipate problems before they happen, so when pressed, I’ll give the following advice: you need to re-examine your endpoint antimalware when you are:

  •  Running more than 50% of your servers as virtual machines
  •  Anticipating a VDI project
  •  Considering using public cloud

Note that public cloud made an appearance; more on that in another post. Also note that this post is all about performance. It goes without saying that if your endpoint security is not doing a good job of securing your environment, it’s a good time to re-examine your endpoint security!



Author: Shaun Donaldson

Shaun Donaldson is Editor-at-large at Bitdefender Enterprise. Shaun is also responsible for supporting relationships with strategic alliance partners and large enterprise customers, and for analyst relations. Before joining Bitdefender, Mr. Donaldson held various technology alliance, enterprise sales, and marketing positions within the IT security industry, including at Trend Micro, Entrust, Bell Security Solutions and Third Brigade.