As demands for access to data have increased, so has the need for security. And although some security approaches still apply to modern enterprises and hybrid cloud architectures, many traditional approaches fail in the new ‘third platform’ generation of services. In essence, the cloud has created a security breach waiting to happen.
IT administrators and directors are now advised to not only look at obvious points of entry, but also to review their employees’ data access habits. Installing a cloud-based storage client on desktops can create a direct conduit between primary data repositories within the enterprise and open, admittedly insecure data repositories outside it. For this reason, data curation and encryption of data at rest now need to be part of customers’ overall repertoire of security programs.
A recent CIO Insight article – Ten Tips to Secure the Data Center – recommended verifying and isolating open ports throughout the environment. But companies should also watch for ports intended for one purpose being co-opted for another. For example, many data exchange systems and cloud software-as-a-service (SaaS) systems use the HTTP port, which is open on most firewalls. As a result, data can move into and out of the enterprise without giving administrators any visibility into the transfers end users are making.
Data center managers need to understand what these traffic patterns look like, so they can detect and separate them from standard web traffic. This requires surveilling and evaluating network traffic to isolate the telltale signs of cloud-based data traffic—a detection system that will become a standard component of all enterprise networks from now on.
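As a minimal sketch of that kind of detection, the snippet below flags HTTP requests whose Host header matches a known cloud-storage service, separating them from ordinary web traffic. The domain list, log format, and function names are illustrative assumptions, not any particular vendor’s product:

```python
# Assumed log format: (source_ip, http_host, bytes_transferred) tuples
# pulled from a web proxy or firewall log. The domain list below is a
# small illustrative sample, not an exhaustive catalog.

CLOUD_STORAGE_DOMAINS = {
    "dropbox.com",
    "drive.google.com",
    "onedrive.live.com",
    "box.com",
}

def is_cloud_storage(host: str) -> bool:
    """Return True if the Host header belongs to a known cloud-storage service."""
    host = host.lower().rstrip(".")
    return any(host == d or host.endswith("." + d) for d in CLOUD_STORAGE_DOMAINS)

def flag_cloud_traffic(http_log):
    """Split log records into cloud-storage traffic and everything else."""
    cloud, other = [], []
    for record in http_log:
        (cloud if is_cloud_storage(record[1]) else other).append(record)
    return cloud, other
```

In practice, a production system would also weigh traffic volume and timing (a 500 MB upload to a storage domain looks very different from a page view), but even a simple host-based split gives managers the visibility the paragraph above calls for.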
In addition to cloud leaks, backdoor access to data in many of the modes described in the CIO article still represents a clear and present vulnerability. Many companies have been reluctant to encrypt data at rest due to concerns about performance, usability, and risk management. In the past, these arguments were compelling enough to dissuade IT organizations from implementing data encryption within their storage arrays.
However, in light of new and very public data breaches, it’s clear that strong encryption of application data and customer information is no longer optional for data storage systems. That means businesses need to start retrofitting their existing systems to provide greater data protection, and implementing new practices and policies for managing and exchanging data so that the fidelity of the encrypted data sets is maintained.
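The basic pattern is straightforward: encrypt before data touches the storage layer, decrypt only after it is read back. A minimal sketch, assuming the third-party `cryptography` package and its Fernet construction (any vetted authenticated cipher would serve equally well):

```python
# Sketch of encrypting data at rest: only ciphertext ever reaches disk.
# Assumes the third-party `cryptography` package; key management (where
# the key lives, who can read it) is the hard part and is out of scope here.

from cryptography.fernet import Fernet

def write_encrypted(path: str, plaintext: bytes, key: bytes) -> None:
    """Encrypt plaintext with the given key and persist only the ciphertext."""
    token = Fernet(key).encrypt(plaintext)
    with open(path, "wb") as f:
        f.write(token)

def read_encrypted(path: str, key: bytes) -> bytes:
    """Read ciphertext from storage and decrypt it back to plaintext."""
    with open(path, "rb") as f:
        token = f.read()
    return Fernet(key).decrypt(token)
```

Retrofitting an existing system usually means inserting exactly this kind of wrapper at the storage boundary, so that applications above it are unaware the data on disk is ciphertext.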
Lastly, virtualized environments create their own security and control challenges for the modern IT manager. Specifically, they require fluid environments that allow security policies to follow virtual services as they migrate from one physical platform to another. (In some cases, the services may even move from one facility to another.) Maintaining parity in the security of data across multiple physical and geographical locations requires us to look at security in a much broader sense. It also makes carrying security policies with the data a new goal for IT security management.
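One way to picture “policies that travel with the workload” is to treat the policy as metadata attached to the virtual machine itself, checked at migration time rather than configured per host. The sketch below is purely illustrative; the class and field names are assumptions, not any hypervisor’s actual API:

```python
# Illustrative sketch: a security policy carried with the VM as metadata,
# so the same rules apply no matter which physical host or facility the
# workload lands on. All names here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class SecurityPolicy:
    encrypt_at_rest: bool = True
    # Regions where this workload's data is allowed to reside.
    allowed_regions: set = field(default_factory=lambda: {"us-east", "eu-west"})

@dataclass
class VirtualMachine:
    name: str
    policy: SecurityPolicy

def can_migrate(vm: VirtualMachine, target_region: str) -> bool:
    """Permit migration only if the VM's own policy allows the target region."""
    return target_region in vm.policy.allowed_regions
```

Because the policy object moves with the VM, a host in a new facility enforces the same constraints as the original one, which is exactly the parity the paragraph above describes.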