WannaCry is still fresh in our memory, reminding organizations of how destructive an unpatched vulnerability can be, especially when weaponized as a wormable threat that delivers ransomware. BlueKeep has been estimated to have the same disruptive potential as EternalBlue (the exploit responsible for WannaCry) if it develops worm-like behavior, especially since RDP is a commonly used service in organizations, allowing IT and security teams to remotely dial into machines.
Cryptographic keys and digital certificates used to uniquely identify machines or applications are vital to businesses that want to guarantee the integrity of in-transit data. However, even businesses with mature DevOps practices sometimes fail to follow practices designed to secure the use and storage of cryptographic keys and digital certificates.
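One such practice is keeping key material out of world-readable storage. The sketch below is a minimal, hypothetical illustration (not any specific product's method) of two common hygiene steps: creating a key file with owner-only permissions, and refusing to overwrite an existing key file. The key, path, and file name are invented for the example.

```python
import os
import secrets
import stat
import tempfile

# Hypothetical 256-bit symmetric key; in practice this might come from
# an HSM or a managed key service rather than local generation.
key = secrets.token_bytes(32)

key_path = os.path.join(tempfile.mkdtemp(), "service.key")

# O_CREAT | O_EXCL fails if the file already exists, so a deployed key
# is never silently overwritten; mode 0o600 restricts access to the owner.
fd = os.open(key_path, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
with os.fdopen(fd, "wb") as f:
    f.write(key)

mode = stat.S_IMODE(os.stat(key_path).st_mode)
print(oct(mode))  # owner read/write only on POSIX systems
```

The same principle applies to TLS private keys and API credentials: generation and storage should be explicit, auditable steps in the pipeline rather than afterthoughts.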
After several years on the fringes of the RSA Conference program as a niche topic, DevOps has broken through to the limelight this week. The show has featured a number of talks and panels discussing the security implications of DevOps and the corresponding increased dependence on cloud platforms and containerization in delivering IT services.
There is ongoing discussion around how DevOps impacts the role of enterprise security.
When the term DevOps first appeared on the horizon of enterprise IT in 2009, many questioned whether enterprise IT security teams could keep up with such environments.
In this latest installment of our series on security issues in a variety of industries, we look at the utilities and energy sectors. These companies represent a prime market for managed services providers (MSPs) and value-added resellers (VARs), because for any country, protecting the energy grid must be a high priority.
A chilling and widely reported bit of news surfaced recently when the director of the U.S. National Security Agency (NSA) warned that Chinese cyber attacks could shut down the U.S. infrastructure, including the power grid.
As reported by Reuters, Admiral Mike Rogers, director of the NSA, testified to the U.S. House of Representatives Intelligence Committee on cyber threats that China and "probably one or two" other countries have the ability to invade and possibly shut down computer systems of U.S. power utilities, aviation networks and financial companies.
One of the most serious security challenges for enterprises today is the ease with which users can sidestep IT for the apps and information services they need. The danger is especially high when these employees are also creating and accessing confidential or regulated information. It means this data is sprawling out to apps and clouds that may not have the necessary controls to keep all of this data safe.
What makes this condition worse is that many companies don't believe it is happening within their organizations until they are forced to see it firsthand. For instance, just a few weeks ago I was sitting in on a live demo of a network monitoring application at a local company. The CIO there was positive that no "unsanctioned" cloud apps were running on their network. I told him I found that hard to believe, but would be impressed if it were so.
Creating software is a perpetual journey. Just like relationships, technologies start young and reach maturity over time as they evolve through several phases of completion. Some of them don’t reach adulthood because they’re ahead of their time or simply not practical, while others refuse to go quietly due to their massive popularity in the business world.
Regardless of industry or field of activity, truly ground-breaking technologies are designed with a sole intention: to transform the customer experience in ways that no one has done before. For most businesses, however, change doesn't come naturally, just as habits (good or bad) die hard in a long-term relationship.
DevOps and continuous integration and deployment efforts boost productivity and agility, but it’s crucial that security moves along with the journey.
DevOps and continuous integration and continuous deployment methodologies are taking hold in enterprises everywhere, and those that adopt them are clearly more effective and efficient. If you're not convinced, have a look at Puppet Labs' State of DevOps surveys for this year and last, which found that DevOps organizations deploy code 30 times more frequently, with half as many failures, as non-DevOps enterprises.
Those DevOps outcomes, driven by a focus on steady improvement through continuous collaboration and rapid iteration, are exactly what organizations are hoping to achieve. From them, they reap a more agile and competitive enterprise.
The driver behind server virtualization is clearly cost savings, though agility and flexibility also have value. This well-known return on investment is achievable because servers have fairly predictable workloads and tend to be static in their roles (an Exchange server tends to stay an Exchange server).
Also, the number of servers that can be run on each CPU across a datacenter tends to be low because, generally speaking, they need more horsepower than an end-user system.
Virtualized desktops are quite different. The number of desktops per CPU across a Virtual Desktop Infrastructure (VDI) is much higher than with servers. The environments tend to be highly dynamic, with instances being instantiated and destroyed at a high rate.
Naturally, trying to lead with cost savings as a primary goal of a VDI deployment is problematic. Instead, agility and flexibility are key.