Securing the cloud and securing data in the cloud are not the same

This article looks at five of the most impactful vulnerabilities of storing data in the cloud, and explains why the most important question for CSOs and security decision makers is how they will effectively secure the data itself.

A December 2010 Cisco study revealed that 52 percent of the IT officials surveyed use or plan to use cloud computing, even though they cite security as the biggest hurdle to bringing their data to the cloud. That figure is further evidence that, despite growing concerns and security breaches, cloud adoption will continue to rise because of its benefits in cost savings, collaboration, efficiency and mobility.

Rather than jumping into the cloud unprepared, enterprises should be taking a close look at the benefits and risks of this IT environment, and creating a strategy to sufficiently protect their data.

There are many different vulnerabilities that come with doing business in the cloud, all of which lead to one rule of thumb that must be followed in order to ensure that sensitive data stored in the cloud is truly secure: The data itself must be protected.

The top five vulnerabilities of storing data in the cloud are:

1. Abuse and nefarious use of cloud computing
Infrastructure as a service (IaaS) providers offer their customers the illusion of unlimited computing, network and storage capacity. This, coupled with an often painless registration process, means that anyone with a credit card can get on the cloud.

If the provider goes as far as offering free limited trial periods, doors are opened for spammers, malicious code authors, and cyber criminals in general, who can now operate within the cloud with absolute anonymity and relative impunity.

It used to be that only platform as a service (PaaS) providers were targeted for attacks, but the world is changing and IaaS providers are no longer safe. Organizations now need to worry about threats such as key cracking, distributed denial of service (DDoS), hosting of malicious data and rainbow tables.

2. Insecure interfaces and APIs
Getting into the cloud is imperative for almost all organizations. Unfortunately, service providers are so eager to offer their services in the cloud that some of them don't think about the vulnerabilities their APIs and interfaces introduce.

Typically, management, orchestration, provisioning and monitoring are all performed through these interfaces, yet the security and availability of general cloud services depend on the security inherent in these very basic APIs.

Everything from authentication and access control to encryption and activity monitoring needs to be designed into these APIs, and often it is not. This means that not only can innocent accidents happen, but successful malicious activity also becomes increasingly likely.
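
As a rough illustration of what "designed in" can mean, the sketch below shows one common pattern, HMAC request signing, that a provider might build into a management API so that every call is authenticated and tamper-evident. The endpoint, header names and credentials here are hypothetical, not any particular provider's API.

```python
import hashlib
import hmac
import time

# Hypothetical credentials issued by a cloud provider's management API.
API_KEY_ID = "AKID-EXAMPLE"
API_SECRET = b"not-a-real-secret"

def sign_request(method: str, path: str, body: bytes) -> dict:
    """Build authentication headers for an API call using HMAC-SHA256.

    The signature covers the method, path, timestamp and body hash, so a
    captured request cannot be silently altered or replayed much later.
    """
    timestamp = str(int(time.time()))
    message = "\n".join(
        [method, path, timestamp, hashlib.sha256(body).hexdigest()]
    ).encode()
    signature = hmac.new(API_SECRET, message, hashlib.sha256).hexdigest()
    return {
        "X-Api-Key": API_KEY_ID,
        "X-Timestamp": timestamp,
        "X-Signature": signature,
    }

# Example: headers for a hypothetical provisioning call.
headers = sign_request("POST", "/v1/instances", b'{"size": "small"}')
```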

Often both organizations and third parties take shortcuts when building on top of these existing APIs, which in turn adds another layer of complexity that you will need to peel back in order to focus on the data.

3. Malicious insiders
Malicious insiders are not going away, and most organizations realize that the potential for a malicious employee to cause a data breach is very real. Unfortunately, it does not get any easier in the cloud; in fact, the threat of an internal data breach is only amplified there.

The reality is that IT services and customers converge under a single management domain. This, combined with a lack of transparency into provider processes and procedures, can lead to big vulnerabilities.

For example, a cloud provider may not reveal how it grants employees access to physical or virtual data, how it monitors those employees, or how it analyzes and reports on compliance. Why would it? The provider is offering you a quick and often cost-effective way to grow your business; it is not going to offer visibility into how it hires the employees who work within the cloud.

This makes the cloud even more attractive to hobbyist hackers, corporate spies, organized crime, and even nation-state sponsored intruders.

4. Shared technology issues
IaaS vendors build out their infrastructure so it can scale easily. This is usually done by sharing infrastructure. The challenge with this way of architecting a solution is that the underlying components often are not designed to offer strong or even significant isolation for a multi-tenant architecture.

The quick way to address this gap is a virtualization hypervisor that mediates access between guest operating systems and the physical computing resources. But this is not enough, as hypervisors, too, have flaws that can enable a guest operating system to gain inappropriate levels of control over, or influence on, the underlying platform.

The bottom line is that customers should not have access to any other customer’s data. The cloud cannot make that guarantee just yet.

5. Unknown risk profiles
One of the tenets of cloud computing is the reduction of hardware and software ownership and maintenance, which frees up end users to focus on their core business strengths.

This is very powerful, but the financial and operational benefits need to be weighed against all of the security concerns. The cloud will no doubt make your life easier, but it could also lead you to neglect practices you lived by when you owned the hardware and software yourself.

Questions about software updates, security practices, vulnerability profiles, intrusion attempts and security design all need to be raised during your due diligence on a cloud provider.

Protecting the data

For the aforementioned reasons, it’s clear that the data itself needs to be protected to guarantee that it is secure in the cloud. You can’t depend on your cloud provider to make the environment 100 percent secure, so you have to take it upon yourself to ensure that the data is protected once it enters the cloud. So what is the best solution for securing data in the cloud?

While there is no silver bullet solution, I view tokenization as a key requirement for securing data in the cloud. For those who don’t know what it is, tokenization is the process of protecting sensitive data by replacing it with alias values or tokens that are meaningless to someone who gains unauthorized access to the data.

It can be used to protect sensitive information such as credit card numbers and personally identifiable information (PII), including healthcare-related data, emails, passwords and logins.
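
As a deliberately simplified illustration (not a description of any particular product), the snippet below swaps a card number for a randomly generated token and keeps the real value in a protected mapping; only a caller with access to that mapping can recover the original.

```python
import secrets

# Simplified illustration only: the "vault" here is an in-process dict.
# A real deployment would keep this mapping in a hardened, access-controlled store.
_token_vault = {}

def tokenize(card_number: str) -> str:
    """Replace a card number with a random token of the same length.

    Keeping the last four digits is a common convenience so the token
    remains usable for display and matching without exposing the full number.
    """
    token = "".join(
        secrets.choice("0123456789") for _ in range(len(card_number) - 4)
    ) + card_number[-4:]
    _token_vault[token] = card_number
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only callers with vault access can do this."""
    return _token_vault[token]

token = tokenize("4111111111111111")
print(token)              # e.g. 8274919306551111, meaningless on its own
print(detokenize(token))  # the original number, available only via the vault
```

Because the token keeps the length and format of the original, downstream systems that expect a 16-digit value keep working without ever touching the real number.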

However, it’s important to note that not all methods of tokenization are created equal. Several different types of tokenization have emerged in recent years, and they vary greatly. Let me explain.

Dynamic and static tokenization
Dynamic tokenization is characterized by large lookup tables that assign token values to the original encrypted sensitive data. These tables grow as they accept new, not-yet-tokenized sensitive values, so their footprint keeps increasing.

Static tokenization is characterized by token lookup tables that are pre-populated with anticipated combinations of the original sensitive data, in an attempt to shorten the tokenization process.
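
To make the distinction concrete, here is a deliberately simplified sketch of both styles of lookup table; real products differ in detail, but the footprint problem discussed next follows from the table-based design itself.

```python
import secrets

def _random_token(length: int) -> str:
    # Real implementations also guarantee token uniqueness; omitted here.
    return "".join(secrets.choice("0123456789") for _ in range(length))

# Dynamic tokenization (simplified): the lookup table starts empty and
# grows by one row for every new sensitive value it encounters.
dynamic_table = {}

def tokenize_dynamic(value: str) -> str:
    if value not in dynamic_table:
        dynamic_table[value] = _random_token(len(value))
    return dynamic_table[value]

# Static tokenization (simplified): the table is built up front for every
# anticipated input. Even this toy domain of all four-digit values needs
# 10,000 rows; a realistic domain such as 16-digit card numbers would be
# astronomically larger, which is where the footprint problem comes from.
static_table = {f"{i:04d}": _random_token(4) for i in range(10_000)}

def tokenize_static(value: str) -> str:
    return static_table[value]
```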

Both approaches carry large footprints, which introduce a lot of challenges. Large token tables are not agile and lead to latency, poor performance and poor scalability. And should an organization need to tokenize more than one data type – credit cards, email addresses, healthcare information and so on – the size of the dynamic or static token tables would soon make either approach impractical for the cloud or any other environment.

In-memory tokenization
In-memory tokenization is the next step in the evolution of this emerging technology and an ideal solution for companies moving more and more of their business data to the cloud.

It is characterized by a very small system footprint that compresses and normalizes random data, can be fully distributed and does not require any kind of data replication or synchronization.

Compared with dynamic and static tokenization, it significantly reduces the performance and scalability issues evident in those approaches, because token servers with small footprints allow token operations to run in parallel and closer to the data.

This approach also enables different data categories to be tokenized without the impact that traditional tokenization methods impose on system performance.
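
The mechanics are not spelled out here, but one way to picture a small-footprint, fully distributable scheme is a token derived deterministically from the value and a shared secret, so that any token server can compute the same result independently, with no central vault to grow, replicate or synchronize. The sketch below is purely illustrative under that assumption; it is one-way for simplicity, whereas real systems use reversible constructions so that authorized services can detokenize.

```python
import hashlib
import hmac

# Illustrative assumption: every token server is provisioned with the same
# small secret, so tokens can be computed anywhere, in parallel, with no
# growing lookup table to replicate or synchronize between nodes.
TOKEN_SECRET = b"shared-secret-provisioned-to-each-token-server"

def tokenize_distributed(value: str, keep_last: int = 4) -> str:
    """Derive a numeric token of the same length from the value itself.

    Because the mapping is a deterministic function of the value and the
    secret, two nodes tokenizing the same card number produce the same
    token without ever consulting each other or a central vault. This
    sketch is one-way; production schemes are reversible for authorized use.
    """
    digest = hmac.new(TOKEN_SECRET, value.encode(), hashlib.sha256).digest()
    digits = "".join(str(byte % 10) for byte in digest)
    return digits[: len(value) - keep_last] + value[-keep_last:]

# The same input yields the same token on any node, with no synchronization.
print(tokenize_distributed("4111111111111111"))
```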

Conclusion

Securing the cloud and securing the data in the cloud are not the same thing. By considering the five most impactful vulnerabilities of moving your data to the cloud, you should now have a better understanding of why the data itself must be protected.

In-memory tokenization helps businesses protect sensitive data in the cloud in an efficient and scalable manner, while lowering the costs associated with compliance. Although no technology offers a perfect solution, tokenization is a critical requirement for any enterprise looking to confidently bring its data to the cloud.
