In an extremely short amount of time, the average person’s familiarity with and use of the cloud has skyrocketed, both intentionally and behind the scenes; you can even sip a cup of coffee brewed via cloud integration. Amazon Web Services (AWS), the cloud service provider that controls more than 40% of the cloud market, features its clients in TV commercials. News organizations consider cloud adoption one of the most significant technology trends of the decade.
What is alarming about cloud-based models is how often cloud-hosted sensitive data falls into the wrong hands. Uber's AWS storage was hacked, compromising the data of 57 million customers and drivers. Capital One's cloud storage was improperly accessed by a former cloud-provider employee with insider knowledge, exposing the personal data of more than 100 million customers, including Social Security numbers and credit card information. More than 68 million Dropbox email addresses and passwords were found for sale on the dark web. Billions of records in unsecured cloud databases are discovered on a regular basis. The danger of increasing cloud usage without improving data security grows with each passing day. It is no longer sufficient to issue an apologetic press release and rotate your leadership team.
Though some security experts would warn companies to keep their data out of the cloud until safety mechanisms improve, there are feasible ways to benefit from the conveniences of the cloud while still employing the highest levels of data security.
Though it will understandably take longer for these protections to mature in cloud environments, there are several steps companies can take in the short term to make the transition to the cloud safer. Classifying data by the level of protection it actually needs, rather than simply treating everything as sensitive, allows a less intrusive path toward data security. For data that is encrypted and stored in the cloud, using cryptographic hardware such as Hardware Security Modules (HSMs) or Trusted Platform Modules (TPMs) to manage the cryptographic keys is essential to ensure that only authorized users can access decrypted data. The final hurdle is eliminating shared-secret authentication schemes such as passwords and one-time passcodes, and implementing authentication mechanisms that qualify for the highest level of assurance from the National Institute of Standards and Technology (NIST).
None of the precautions require nonexistent technologies, massive reconfiguration, or even significant financial expenditure. What is required, however, is a shift in how organizations think about data security overall. Breaches continue to happen despite the use of security technologies that have been employed for decades. The more breaches continue to happen, the clearer it becomes that username/password authentication schemes and firewalls are just not enough to protect our sensitive data in the cloud anymore.
Secure Only What Needs to Be Secured
Many organizations admit to avoiding strong data protection technologies and strategies out of fear that they will interrupt and slow down day-to-day operations. This is a valid fear, as many popular security strategies add time-consuming steps and programs specifically to deal with sensitive data. Unfortunately, such tools wind up being nothing more than “security theater”: impressive plans on paper that actual users disregard whenever doing so saves them a few minutes or clicks.
One of the biggest traps companies fall into when outlining security strategies is going overboard. If day-to-day operations involve processing any sensitive data, their policies and workflows end up treating all data as if it were sensitive. Unnecessarily encrypting and safeguarding non-sensitive data is a considerable waste of resources and time, waste that is often a direct result of unsophisticated application design. Rather than spending a little time modifying applications to implement focused security mechanisms, many companies try to force all their dataflows through existing inflexible ones, akin to using a sledgehammer to swat flies.
The problem can be solved by defining and applying tiers of data sensitivity, while ensuring only sensitive data is channeled through focused security mechanisms. This type of hybrid-cloud architecture has been gaining traction over the last few years, and represents a more optimized approach to data security than simply encrypting everything. By devoting resources and time only to the protection of sensitive data, the vast majority of a company’s information can be handled without wasteful precautions. Various types of data can also be treated in this manner if identifying details have been removed or encrypted. To make use of this architecture in cloud environments, however, proper use of cryptographic technology is required.
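Tiered sensitivity can be sketched as a simple router that splits each record so that only sensitive fields pass through the focused security pipeline. The tier names, field mapping, and record layout below are hypothetical, chosen purely for illustration; a real deployment would derive them from the organization's data-governance policy.

```python
from enum import Enum

# Hypothetical sensitivity tiers (illustrative only)
class Tier(Enum):
    PUBLIC = 0
    INTERNAL = 1
    SENSITIVE = 2

# Example field-to-tier mapping for a customer record (illustrative only)
FIELD_TIERS = {
    "display_name": Tier.PUBLIC,
    "email": Tier.INTERNAL,
    "ssn": Tier.SENSITIVE,
    "card_number": Tier.SENSITIVE,
}

def route(record: dict) -> tuple:
    """Split a record: sensitive fields go through the focused security
    pipeline (e.g. encryption); everything else bypasses it."""
    sensitive, ordinary = {}, {}
    for field, value in record.items():
        # Unclassified fields default to the safest tier
        tier = FIELD_TIERS.get(field, Tier.SENSITIVE)
        (sensitive if tier is Tier.SENSITIVE else ordinary)[field] = value
    return sensitive, ordinary

sensitive, ordinary = route(
    {"display_name": "A. Customer", "email": "a@example.com", "ssn": "000-00-0000"}
)
```

Note the defensive default: a field that was never classified is treated as sensitive, so a forgotten mapping errs on the side of protection rather than exposure.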
Use Encryption Hardware to Protect the Most Sensitive Data
As cloud service providers have matured, the number of services they offer has grown significantly. As part of efforts to enable more and more business processes to live in the cloud, cloud service providers have even started offering cloud security technologies. For unregulated industries, many of these offerings may well be sufficient for data protection. But for business functions governed by standards that specifically require data encryption, such as the Payment Card Industry Data Security Standard (PCI DSS), cloud-based security options are inadequate. This can be understood by drawing a parallel to physical security: most recognize that it is prudent for a new homeowner to have a locksmith change all the locks and provide a new set of keys, ensuring that no one who previously had access to the house can use an existing key to get in. Relying on cloud-provider security technologies, however, is equivalent to letting that locksmith keep a copy of the new key, a practice that would largely invalidate every other precaution.
When using cryptographic technologies to protect data, the encryption algorithms and key sizes are only half of the equation—it is just as vital to protect the keys capable of decrypting that data. This should not be surprising to those making use of cloud technologies—AWS, Google Cloud, and other CSPs explain clearly that their responsibility is to ensure “security of the cloud,” but that it is their customers’ responsibility to handle data “security in the cloud.” In other words, if data happens to leak out of the cloud it is your problem, not the service provider’s.
One way to address this is to use cryptographic hardware (Hardware Security Modules/HSMs or Trusted Platform Modules/TPMs) to store and manage cryptographic keys outside the cloud, sending only encrypted data into the cloud. Keeping encryption keys out of the cloud is a vital step toward adequately protecting data. If your keys are in the cloud, you have no real way of knowing if and when your data is being accessed; even a company like Marriott didn’t know that 500 million customers’ data was being steadily siphoned for over four years.
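The workflow can be sketched as follows: the key lives only on local hardware, and nothing but an opaque encrypted blob ever reaches the cloud. To keep the sketch self-contained it simulates the hardware-held key with an in-memory value and uses a toy encrypt-then-MAC construction built from standard-library hashing; a real deployment would keep the key inside an HSM (typically accessed via an interface such as PKCS#11) and use a vetted AEAD cipher such as AES-GCM.

```python
import hashlib
import hmac
import secrets

# Stand-in for a key held in local cryptographic hardware. In practice this
# key would be generated inside an HSM or TPM and never leave the device.
LOCAL_KEY = secrets.token_bytes(32)

def encrypt_for_cloud(plaintext: bytes, key: bytes) -> bytes:
    """Toy encrypt-then-MAC: SHAKE-256 keystream + HMAC-SHA256 tag.
    Illustrative only; use a vetted AEAD cipher in production."""
    nonce = secrets.token_bytes(16)
    keystream = hashlib.shake_256(key + nonce).digest(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    tag = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    return nonce + ciphertext + tag  # only this opaque blob goes to the cloud

def decrypt_from_cloud(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("ciphertext failed integrity check")
    keystream = hashlib.shake_256(key + nonce).digest(len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, keystream))

blob = encrypt_for_cloud(b"card_number=0000", LOCAL_KEY)
```

Because the cloud only ever stores `blob`, a breach of the cloud account yields ciphertext that is useless without the hardware-held key.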
Eliminate “Shared Secret” Security Practices with FIDO
For companies making use of hybrid cloud architecture and cryptographic hardware, the primary remaining vulnerability is the one created by the username/password authentication scheme, perhaps the only computer technology from the early 1960s still in everyday use. Despite the existence of far stronger alternatives, passwords have survived because of consumer preference and online service providers’ reluctance to implement what they consider complex solutions. Where this hesitation does not exist, however, we see vastly improved data protection.
Google eliminated employee account takeovers across its staff of more than 85,000 after mandating the use of hardware security keys in conjunction with a more modern authentication protocol. That protocol was established by the FIDO Alliance, a nonprofit group formed in 2012 and dedicated to eliminating passwords from the internet. Fast Identity Online (FIDO) protocols center on the use of a hardware device, such as a USB security key, smartphone, or biometric scanner, to authenticate to websites. FIDO protocols employ public-key cryptography while requiring nothing more than a few clicks from the end user, a significant differentiation from previous passwordless offerings. A move toward FIDO-based authentication is a significant step toward rendering phishing, brute-force, and man-in-the-middle attacks irrelevant.
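The principle that makes this phishing-resistant is that the server stores only a public key, so there is no shared secret to steal, replay, or guess. The sketch below illustrates that principle with a Schnorr-style challenge-response over deliberately tiny toy parameters; it is not the actual FIDO CTAP/WebAuthn protocol, which uses signatures over vetted elliptic curves such as P-256, and every name and number here is illustrative.

```python
import secrets

# Toy group parameters (INSECURE, illustration only):
# G = 2 generates the order-11 subgroup of the integers mod 23.
P, Q, G = 23, 11, 2

# Authenticator side: the private key never leaves the device.
x = secrets.randbelow(Q - 1) + 1   # private key
y = pow(G, x, P)                   # public key, registered with the server

def authenticator_commit():
    """Pick a fresh random value and send the commitment t = g^r."""
    r = secrets.randbelow(Q - 1) + 1
    return r, pow(G, r, P)

def authenticator_respond(r, challenge):
    """Answer the server's challenge: s = r + c*x mod q."""
    return (r + challenge * x) % Q

def server_verify(t, challenge, s):
    """Verify using only the PUBLIC key: g^s == t * y^c mod p."""
    return pow(G, s, P) == (t * pow(y, challenge, P)) % P

# One authentication round
r, t = authenticator_commit()
c = secrets.randbelow(Q)           # server's fresh random challenge
s = authenticator_respond(r, c)
assert server_verify(t, c, s)
```

Because each challenge is fresh and the response is useless for any other challenge, a phished or intercepted exchange cannot be replayed, which is precisely the property passwords and one-time passcodes lack.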
Although attackers are growing more sophisticated, there are several signs that security technologies are rising to meet the challenge. Passwordless and two-factor authentication (2FA) are already employed at many large companies. The US Department of Defense is implementing the Cybersecurity Maturity Model Certification (CMMC), which will soon require that the entire defense industrial base and all DoD suppliers achieve a level of certification for their cybersecurity policies and technologies if they wish to remain suppliers. The rising popularity of DevSecOps makes security a shared responsibility considered throughout the entire application development lifecycle.
No single strategy outlined above is sufficient to guarantee data security—when employed in conjunction, however, they will provide organizations with the highest levels of data security and defense against the mounting number of threats that specifically target organizations moving towards the cloud.
Improvements to data security will be realized only when companies are able to accept that these antiquated technologies have been proven ineffective, and renewed focus is given to next-generation technologies.
Suhail Noor is the Technical Project Manager at StrongKey, based in Durham, NC. To learn more about Hybrid Cloud Computing or StrongKey, drop us a line.