Ever since the 2013 Snowden revelations, which uncovered the potential scale of US government data gathering, the integrity of the Safe Harbour Agreement has looked seriously weakened. In October, the European Court of Justice (ECJ) ruled it invalid, delivering a coup de grâce that, on the face of it, leaves firms that use US cloud services with a huge headache. So what exactly does it mean for us all in the IT community?
First, a bit of background: Safe Harbour came about because the EU's data protection laws forbid the movement of its citizens' data outside of the EU unless it will be stored and handled with equivalent protection of its privacy. Legal protection in the US is less stringent, so Safe Harbour was a way for companies such as Facebook and Google to self-certify that they would apply EU-level controls to EU data they imported into the US. Edward Snowden's revelations, however, suggest that such self-certification may not be adequate protection from the US government, and the ECJ has thus held that the agreement is no longer valid.
A new Safe Harbour agreement is in negotiation between the EU and the US, but in the meantime any firm hoping to store EU-citizen data in the cloud faces the additional burden of encrypting it, ensuring that it never leaves the EU, or putting in place 'model contract clauses' that bind the data importer to EU-level protections.
Questioning the cloud
The need for Safe Harbour, and its demise, perfectly illustrate that what the cloud promises - an adaptable, amorphous, infinite data resource - can also be a weakness. In reality, there are many business, operational and legislative reasons why you need to know exactly where your data travels and where it is stored. This clearly isn't lost on CIOs: according to Forrester, over 60% of companies find their cloud use restricted because of compliance, transparency or support issues. Meanwhile, Vanson Bourne recently found that 86% of enterprise customers believe it's important for business-critical data to be stored with a UK-based provider.
By now it's becoming abundantly clear that the public cloud as it stands isn't up to the job of supporting firms that process sensitive data. Depending on the sector and regulatory environment, business-critical or sensitive data needs to remain within a computing environment that offers the necessary guarantees. Already we're seeing an increase in 'national' clouds, where the data centres and data remain within a single regulatory area such as the EU or UK: indeed, data giants like Facebook, Apple and Google are already building EU data centres. In early November 2015, Microsoft said that it would spend $2 billion on European cloud infrastructure, while Amazon has announced plans for a UK cloud 'region'.
While this sidesteps the regulatory headaches of exporting EU data to the US, it's likely to drive up cloud costs. Currently, data location and region design in the cloud are mostly a case of fulfilling supply and demand while minimising the costs of data centres and data links. Segmenting the cloud into regions creates additional demands and costs. For me, the change also points to a future, mature 'multi-cloud' model, where data and apps are sourced in multiple regions according to cost, availability, security and sovereignty requirements.
Need help with compliance and the cloud? Call us on 01273 957500, or send us a message