Security, at its most basic, is about understanding exposure, then mitigating the risks and fixing any issues that are uncovered or arise. It is straightforward enough if you know what the challenges are and how to deal with them, but the fact is many organisations do not truly appreciate their levels of exposure or the security functionality at their disposal. These unknowns are the security time bombs that keep CIOs up at night.
With 50% of server workloads virtualized (and this number set to climb), security within virtualized environments becomes a key concern for all organisations. The security threats within a virtual environment are no more dangerous than those in traditional infrastructure; it is the way that the enterprise deals with these concerns that is different.
One of the great benefits of virtualization is the speed and flexibility with which individuals can set up virtual machines. This strength becomes a security weakness when the speed of deployment outpaces the ability to track and manage new workloads. As more workloads are virtualized, as workloads of different trust levels are combined and as virtualized workloads become more mobile, the security issues associated with virtualization become more critical to address.
One large international company we spoke to recently discovered that thousands of previously undetected virtual machines existed inside their IT architecture. This scenario would have been unthinkable, or rather impossible, in a non-virtualised environment. There will be CIOs reading about the discovery of thousands of previously undetected virtual machines who wince, partly out of empathy for the plight of their fellow IT professionals, but also because they can’t be 100% sure the same thing isn’t going on within their own technology infrastructure. The spread of unmanaged virtualization is endemic within large-scale enterprises. Developers looking to take advantage of virtualization and Open Hybrid Cloud models can create environments and bring them online rapidly; they can abandon them just as rapidly, leaving those environments lying dormant and forgotten.
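In practice, finding those forgotten machines starts with reconciling what the hypervisors actually report against what the asset register (CMDB) says should exist. A minimal sketch in Python, with hypothetical VM names and inventories standing in for the real platform APIs and asset database:

```python
# Reconcile the VMs the hypervisors actually report against the asset
# register (CMDB). All names and inventories here are hypothetical;
# in practice they would come from the virtualization platform APIs
# and the configuration management database.

def find_untracked_vms(hypervisor_vms, cmdb_vms):
    """Return VMs running on hypervisors but absent from the CMDB."""
    return sorted(set(hypervisor_vms) - set(cmdb_vms))

hypervisor_vms = ["web-01", "web-02", "dev-scratch-17", "test-old-3"]
cmdb_vms = ["web-01", "web-02"]

untracked = find_untracked_vms(hypervisor_vms, cmdb_vms)
print(untracked)  # the dormant, forgotten machines
```

Trivial as it looks, running this reconciliation regularly is often the first step towards getting unmanaged virtualization back under control.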
System administration is often vilified for holding back business processes, and these days self-service has seen an increase in the number of people within an enterprise ordering IT services directly. For the business this can appear freeing, as it enables rapid response; however, the focus is not necessarily on creating secure environments, in the erroneous belief that IT or system admins will keep things safe.
These great unknowns are disconcerting for IT security professionals, since you can’t manage what you don’t know about. This is not simply an issue in the open source world, but also within proprietary environments. One of the most effective approaches is to employ a virtualization “manager of managers” so that security and compliance rules can be checked across all provisioned virtualization platforms.
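The core idea of a manager of managers is that one set of compliance rules is applied uniformly, regardless of which platform a VM runs on. A sketch of that pattern, assuming hypothetical rule names and VM record fields:

```python
# Apply one set of compliance rules to VMs aggregated from several
# virtualization platforms. The rule names and record fields below
# are illustrative assumptions, not any particular product's schema.

RULES = {
    "owner_assigned": lambda vm: bool(vm.get("owner")),
    "patch_policy_set": lambda vm: vm.get("patch_policy") in ("auto", "scheduled"),
    "no_public_internal_mix": lambda vm: not (vm.get("public") and vm.get("trust") == "internal"),
}

def check_compliance(vms):
    """Return {vm_name: [failed rule names]} for non-compliant VMs."""
    failures = {}
    for vm in vms:
        failed = [name for name, rule in RULES.items() if not rule(vm)]
        if failed:
            failures[vm["name"]] = failed
    return failures

inventory = [
    {"name": "web-01", "platform": "RHV", "owner": "ops",
     "patch_policy": "auto", "public": True, "trust": "dmz"},
    {"name": "dev-scratch-17", "platform": "vSphere", "owner": "",
     "patch_policy": None, "public": False, "trust": "internal"},
]

print(check_compliance(inventory))
```

Because the checks run against an aggregated inventory rather than each platform's own console, a VM cannot escape scrutiny simply by living on the "other" hypervisor.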
Proprietary vendors will tell you that open source is hard to control because so many different unknown parties can influence the code. But the benefit of using open source for virtualization is its total transparency: with more people looking at the code, it is harder for errors to make it into the production environment. Development takes place in these specialist communities and this wisdom feeds back into the entire ecosystem. With thousands of eyes examining the code, bugs and errors are rapidly identified and fixed.
Ideally, security is built into IT architecture from the beginning, with a defined process for compliance. Yet few organisations have the right processes in place to ensure exposures are highlighted. Building security in from the ground up is challenging because greenfield developments are uncommon. The majority of business IT architectures have evolved over time, and so the challenge comes when a business takes a strategic decision to embrace virtualization and migrate its processes. That’s when the fun starts.
In brownfield deployments complex legacy systems exist, so CIOs need to consider a variety of security matters when deploying virtualized resources. First, the host/hypervisor should be the primary area of focus, since it is often a single point of failure for guests and data. Second, resources and services can become difficult to track and maintain; with rapid deployment of virtualized systems comes an increased need for management of resources, including sufficient patching, monitoring and maintenance.
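Patching is where rapid deployment most obviously outruns maintenance: a machine spun up in minutes can sit unpatched for months. A simple sketch of flagging stale guests, assuming a hypothetical fleet inventory that records each VM's last patch date:

```python
# Flag VMs whose last recorded patch date is older than a threshold.
# The fleet data and dates are hypothetical examples; a real check
# would pull them from patch management or monitoring tooling.
from datetime import date, timedelta

def overdue_for_patching(vms, today, max_age_days=30):
    """Return names of VMs not patched within max_age_days."""
    cutoff = today - timedelta(days=max_age_days)
    return [vm["name"] for vm in vms if vm["last_patched"] < cutoff]

fleet = [
    {"name": "web-01", "last_patched": date(2013, 5, 20)},
    {"name": "test-old-3", "last_patched": date(2013, 1, 4)},
]

print(overdue_for_patching(fleet, today=date(2013, 6, 1)))
```

The threshold itself matters less than the habit: if nothing reports on patch age, a forgotten guest stays forgotten.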
Virtualization does not remove any of the traditional security risks present in an IT environment; the entire solution stack, not just the virtualization layer, must be secured. There may also be a lack of knowledge among your workforce, gaps in skill sets, and minimal experience among technical staff. These are often a gateway to vulnerabilities, especially when resources such as storage are spread across, and dependent upon, several machines. The result can be overly complex environments and poorly managed, poorly maintained systems.
To deepen your virtualization journey and then embark on a multi-hypervisor, hybrid cloud world, two considerations are key from a security perspective. First, wherever possible, use a base platform in your deployments that has been designed for security from the ground up. Open source platforms with technology like SELinux offer great out-of-the-box value by locking down the hypervisor and isolating virtual machines from one another. Second, you must regain oversight of your virtual deployments and understand which risks you are actually taking. A cloud operations management technology like Red Hat CloudForms can help you apply your security processes and policies across VMware and OpenStack and regain insight into your environments.
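The SELinux-based isolation mentioned above (known as sVirt) works by giving each guest's process a unique pair of MCS categories, so a compromised guest cannot touch another guest's resources. A small sketch of sanity-checking that uniqueness, using illustrative labels rather than output from a live host:

```python
# sVirt isolates guests by assigning each one a unique pair of SELinux
# MCS categories in its label (e.g. svirt_t:s0:c57,c384). If two
# guests share the same category pair, the isolation between them is
# weakened. The labels below are illustrative, not from a real host.

def duplicate_mcs_labels(vm_labels):
    """Return MCS category sets that more than one VM is using."""
    seen = {}
    for vm, label in vm_labels.items():
        # A label looks like "system_u:system_r:svirt_t:s0:c57,c384";
        # the categories are everything after the final colon.
        categories = frozenset(label.rsplit(":", 1)[-1].split(","))
        seen.setdefault(categories, []).append(vm)
    return {cats: vms for cats, vms in seen.items() if len(vms) > 1}

labels = {
    "web-01": "system_u:system_r:svirt_t:s0:c57,c384",
    "web-02": "system_u:system_r:svirt_t:s0:c12,c908",
    "clone-of-web-01": "system_u:system_r:svirt_t:s0:c57,c384",
}

print(duplicate_mcs_labels(labels))
```

A shared label, as in the cloned guest above, is exactly the kind of quiet misconfiguration that a centralized policy engine should surface rather than leaving it for an auditor to find.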
In security, the more you know, the more secure you are. Having a complete view of, and control over, your IT infrastructure reduces the unknowns and helps CIOs sleep more easily at night.