
Enhanced Privacy for Data Analytics

How Confidential Computing Can Enhance Data Protection Compliance for Data Analytics

Keywords:
Privacy, Data Protection, Data Security, Privacy-by-Design, Data Analytics, Privacy-enhancing Computation, Confidential Computing, Trusted Execution Environments

Abstract

Data requires analysis to tap into its value. Processing of personal data, including analysis, is subject to increasingly demanding legal requirements. The trend towards externalization and an evolving cyberthreat landscape are testing the limits of traditional compliance measures. This article therefore assesses Confidential Computing, a technology that provides progressive privacy features to enhance data protection compliance for data analytics.

I. Introduction

At the 2006 Senior Marketer’s Summit of the Association of National Advertisers (ANA), Clive Humby coined the famous phrase: “Data is the new oil!”1 Humby is one of the co-founders of the consumer data science consultancy dunnhumby, which helped the British retail giant Tesco launch the Clubcard. Through the Clubcard, Tesco monitors and analyzes cardholders’ shopping behavior to customize their shopping experience, which gave Tesco a competitive advantage and initiated a period of unprecedented growth.2 In this context, it comes as no surprise that Humby’s quote has become an allegory illustrating the economic value of data as a commodity.

However, there are other interpretations of the quote. Referring to Humby, Michael Palmer of the ANA wrote in his blog: “Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value.”3 Palmer thus interprets Humby’s quote to illustrate the complexity of translating essentially useless raw data into valuable insights.

Today, the increasingly demanding legal and regulatory landscape in the field of data protection further drives said complexity. Since the Council of Europe (CoE) modernized its data protection Convention 108,4 and the EU published the notorious GDPR,5 data protection compliance has been causing headaches for compliance managers globally. Driven by the new data protection standards set by the CoE and first implemented by the EU,6 jurisdictions around the globe started revising their data protection laws or introducing new ones, including Japan, the State of California, China, Canada, the United Arab Emirates and Switzerland, to name a few.7

At the same time, the way organizations process their data has changed fundamentally. Nowadays, it is common to rely on external data processing infrastructure and services, to share data with external experts and consultants, and to collaborate across multiple organizations on combined data. Consequently, the original challenge of ensuring compliant use of personal data inside an organization has been surpassed largely by the challenge of ensuring external compliance by data processors and other third parties.

Considering the global data protection landscape today and the challenges resulting from modern data processing, and reflecting on Humby’s quote, it seems that maintaining an appropriate data protection compliance program for the analysis and commercialization of personal data is almost as complex and fraught with risk as operating an oil-drilling platform in the Gulf of Mexico.8 Organizational measures such as processes, contractual obligations and ex-post audits may no longer be sufficient to effectively protect personal data and ensure compliance. Instead, for high-risk data processing in particular, technical measures that automatically enforce compliance may be necessary. Confidential Computing can serve precisely this function: it is an emerging technology that offers strong data security properties and limits the use of data to specific computations defined in advance, thereby drastically reducing the risk of data misuse in a demonstrable manner. In the world of information and cybersecurity, Confidential Computing is already well known. In the legal and regulatory sphere, however, it appears that it has not yet gained much attention. Therefore, in this article we will look at Confidential Computing from the legal and regulatory angle and assess how it may help improve data protection compliance.

II. Key Compliance Challenges for Data Analytics in Practice

To illustrate how Confidential Computing may help improve data protection compliance, let us imagine the following scenario:9

A number of diverse actors want to engage in a joint data analytics undertaking in the field of health economics. The actors include private and public healthcare providers, public health administration bodies in Switzerland and Europe, and global pharmaceutical companies, including US-based entities.

Healthcare providers hold data, including sensitive personal health data, that – if combined for joint analysis – yields insights that could not be gained from the isolated analysis of the data held by each of them alone. The aim is to gain insights into general patient behavior and treatment efficiency in terms of duration and cost across multiple healthcare providers. The insights are potentially highly valuable for all involved parties.

All actors are facing the following key compliance challenges:

  • Data Security: appropriate security measures must be taken against risks such as accidental or unauthorized access to, destruction, loss, use, modification, or disclosure of personal data.10 Accordingly, the analytics undertaking should not expose personal data to additional security risk, including misuse by solution and infrastructure providers as well as by the other involved parties.

How does Confidential Computing enhance Data Security?

  • Privacy-by-Design: the right to the protection of personal data must be secured through appropriate technical and organizational measures at all stages of the processing (Privacy-by-Design).11 Accordingly, the analytics undertaking should not unduly interfere with data subject rights and should comply with key principles such as purpose limitation, proportionality, and data minimization.

How does Confidential Computing support Privacy-by-Design?

  • Cross-Border Data Transfers: personal data must not be transferred across borders unless an adequate level of data protection is secured in the country of destination. In the absence of legislation that provides adequate protection in the country of destination, adequate data protection must be secured by other effective and enforceable safeguards.12

How does Confidential Computing enable Cross-Border Data Transfers?

This article will focus on addressing these three key questions in Sections IV to VI. Before that, however, we will lay the foundation by clarifying some technical background and providing a brief explanation of Confidential Computing technology in Section III.

III. What Is Confidential Computing?

A. Short Technical Background on Data Security

1. Three States of Data

A data state is a mode in which computer equipment handles data. Computer science distinguishes three data states: data at rest, data in transit and data in use. Data that is physically stored on a persistent storage device, such as a hard drive, is at rest. Data that is being sent over a network or locally between devices is in transit. Data that is being processed by a computer application in volatile storage such as memory (RAM) or CPU registers is in use.13
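
To make the three states concrete, here is a minimal, purely illustrative Python sketch (not drawn from the article’s cited sources) that passes the same record through each state:

```python
# Illustrative only: one record passing through the three data states.
import json
import socket
import tempfile

record = {"patient": "4711", "treatment_cost": 1234.50}

# Data at rest: persisted on a storage device (here, a temporary file).
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(record, f)

# Data in transit: serialized and sent over a (local) network connection.
sender, receiver = socket.socketpair()
sender.sendall(json.dumps(record).encode())
received = json.loads(receiver.recv(4096).decode())

# Data in use: held in volatile memory (RAM/CPU registers) during processing.
total_cost = received["treatment_cost"] * 1.077  # e.g. adding Swiss VAT
```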

2. Encryption

Encryption is the process of transforming plaintext into ciphertext using a cryptographic algorithm and key. Ciphertext is unintelligible without a deciphering mechanism and the decryption key.14 Encryption technology provides data confidentiality (preventing unauthorized viewing) and data integrity (preventing or detecting unauthorized changes).15 It is the current state of the art to encrypt data at rest and in transit. Encrypting data in use, however, is more complex and did not seem to be a priority until more recent attacks started targeting data in this state. It is also in this state that encryption using broadly established technology is most likely to impact computing performance. Consequently, market-leading applications in many areas, including data analytics, still do not use technology providing encryption in use.16
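
As a minimal illustration, encrypting data at rest might look as follows. This sketch assumes the third-party Python package cryptography (it is not part of the article’s sources); note that once the data is loaded into memory for analysis, it is in use and no longer protected by this scheme:

```python
# Illustrative sketch of authenticated symmetric encryption for data at rest,
# using the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # the key must itself be stored securely
cipher = Fernet(key)

plaintext = b"patient=4711;diagnosis=..."
token = cipher.encrypt(plaintext)  # ciphertext: unintelligible without the key

# Confidentiality and integrity: tampering with the token makes decryption
# raise an error instead of returning altered plaintext.
assert cipher.decrypt(token) == plaintext
```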

3. Data in Use and Hardware Vulnerabilities

The widespread use of encryption technology has made it more and more difficult for malicious actors to attack data at rest and in transit. Consequently, on their quest for the point of least resistance, they seem to be shifting their focus to data in use.17 This observation correlates with a broader issue in cybersecurity that concerns the vulnerability not only of data, but of entire systems.

IT systems consist of several stacked layers, with the physical hardware and firmware at the lower end, and the user-facing software at the upper end of the stack. Established approaches to securing IT systems focus on the upper levels of the stack, which forces attackers to hunt for vulnerabilities lower down in the stack.18 Consequently, over the past decade, cybersecurity specialists have been observing an increase in low-level hardware and firmware attacks, in which attackers hijack administrator rights over entire machines. This allows them to bypass encryption and access even encrypted data at rest and in transit. In an essay from 2018, American cybersecurity guru Bruce Schneier goes as far as describing “vulnerabilities in the deepest recesses of the world’s computer hardware” as the “new normal”.19 The rapid evolution of cloud and edge computing, with countless systems built on the same hardware platform, further amplifies the impact of this shift. A vulnerability in one piece of hard- or firmware potentially exposes all systems built on top. In line with those observations, the National Institute of Standards and Technology (NIST) is currently working on an interagency report in which it suggests enhancing existing high-level software security mechanisms with hardware-enabled mechanisms to secure IT systems from the ground up.20

B. Confidential Computing Technology Briefly Explained

Confidential Computing is the protection of data in use by means of hardware-based Trusted Execution Environments, or TEEs.21 TEEs provide the means to secure IT systems from the ground up, in line with the NIST recommendations mentioned above.

The concept of a TEE is not completely new. The technology has been around for some time, and most of us are probably already using it without even being aware of it. In fact, credit card chips are TEEs. Since the release of the iPhone 5s in 2013, the microprocessors of Apple’s smartphones have come with a secure enclave to protect sensitive user information.22 Samsung introduced similar technology with its Galaxy S20 series in 2020.23 Those use cases are still rather simple and limited to the secure storage of sensitive data on mobile devices. However, the performance of TEEs is constantly improving and use cases are expanding.24 US technology consultancy Gartner predicts that, by 2025, 60% of large organizations will use one or more privacy-enhancing computation techniques in analytics, business intelligence or cloud computing.25

There is no universal definition of a TEE.26 The properties of a TEE vary by vendor and model. Commonly, a TEE uses hardware-backed techniques to provide increased security guarantees for the execution of code (code integrity) and the protection of data (data confidentiality and data integrity). Code integrity means that the code inside the TEE cannot be replaced or modified by unauthorized entities, including system administrators with the highest privileges controlling the operating system.27 Data confidentiality means that unauthorized entities cannot view data inside the TEE. Data integrity means that unauthorized entities cannot alter data while it is being processed. Together, these properties provide not only an assurance that the data is kept confidential, but also that the computations performed are indeed the correct and authorized computations, allowing one to trust the results thereof.28 In addition, certain TEEs provide mechanisms to evidence data confidentiality, data integrity, and the authenticity of the code running inside the TEE in the form of hardware-signed attestations vouched for by the relevant vendor.29 Further in this article, we will assess how these properties are beneficial to ensuring compliance with certain specific data protection requirements.
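
To give a rough idea of how such attestations can be checked, the following schematic Python sketch models the flow. All names and the report format are hypothetical simplifications; real vendor attestation protocols use asymmetric signatures and certificate chains and are substantially more involved:

```python
# Schematic model of remote attestation (hypothetical simplification).
import hashlib
import hmac

VENDOR_KEY = b"stand-in for the hardware vendor's attestation key"  # assumption

def measure(code: bytes) -> bytes:
    """The 'measurement': a cryptographic hash of the code loaded in the TEE."""
    return hashlib.sha256(code).digest()

def attest(code: bytes) -> bytes:
    """Stand-in for the hardware-signed attestation vouched for by the vendor."""
    return hmac.new(VENDOR_KEY, measure(code), hashlib.sha256).digest()

def verify(report: bytes, expected_code: bytes) -> bool:
    """A data provider checks the TEE runs exactly the code the parties agreed on."""
    return hmac.compare_digest(report, attest(expected_code))

agreed_code = b"def aggregate(rows): ..."  # analytics code approved by all parties
report = attest(agreed_code)               # produced inside the TEE
assert verify(report, agreed_code)         # code integrity can be evidenced
```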

IV. How Does Confidential Computing Enhance Data Security?

Referring back to the practical example outlined above, the actors involved in the data analytics undertaking must take appropriate security measures against risks such as accidental or unauthorized access to, destruction, loss, use, modification or disclosure of personal data.30 Where they rely on third parties to process data on their behalf, so-called processors, they are legally responsible for the data security provided by the latter.31 This means that they must carefully select, instruct, and supervise their processors. Security measures may include pseudonymization (including by means of encryption), access restrictions and logs, amongst other measures.32 It is further generally held that security measures should take into account the current state of the art of data-security methods and techniques in the field of data processing and should therefore be subject to periodic review and update.33

Data analytics is almost exclusively performed by means of highly specialized software solutions, which are increasingly running in the Cloud.34 While some solutions on the market still support local deployments, giving the data controller a higher degree of immediate control over data security aspects, there is a good chance that Cloud-based data analytics solutions will soon be the only option reasonably available for analytics of a certain complexity.35 For joint analytics undertakings involving multiple parties, as in our practical example, there is hardly a way around a Cloud-based solution already today. From a data protection perspective, the provider of a Cloud-based analytics solution is characterized as a processor. Traditionally, controllers conduct vendor due diligence to comply with their obligation to carefully select processors. They use so-called data processing agreements (DPAs) to impose – amongst other things – the obligation to implement adequate (often specific) security measures, thereby complying with their obligation to carefully instruct their processors. Furthermore, they reserve the right to conduct audits to comply with their obligation to carefully supervise their processors. Yet, despite all these measures, as long as a processor has the possibility to access controller data in the clear, the risk remains. A data breach may occur as a result of the processor’s willful misconduct, negligence, or the sheer insufficiency of the measures against attacks from the outside. In all cases, if anything, controllers are left with monetary claims as compensation for damages suffered.

Consequently, the actors involved in the data analytics undertaking of our practical example need to be able to extend trust to the provider of their analytics solution. Since the data concerned is highly sensitive, the benchmark for measuring the adequacy of the security measures provided must be very high, if not the highest state of the art. As outlined above, traditional safeguards to mitigate the risk of a data breach resulting from the use of a Cloud-based analytics solution are essentially based on trust. From experience, lack of trust in a Cloud solution and the absence of viable alternatives are insurmountable hurdles for many promising data analytics undertakings.

A data analytics solution using Confidential Computing technology could tackle the issue of trust, as it offers effective and technically enforced protection of data against breaches of confidentiality and integrity. Even users with the highest privileges do not have access to the data inside a TEE or the decryption keys. Confidential Computing thereby eliminates the risk of data misuse by both malicious insiders and external attackers hijacking system privileges to access data. The protection is hardware-based, covers data in all states – at rest, in transit and in use – and is designed to resist even new forms of cyber-attacks, in line with the most recent security recommendations of NIST, as described above. Confidential Computing may certainly not (yet) be considered state of the art, and its use – even where reasonably available – is not a legal requirement today. However, looking at current trends in cybersecurity, it is not unlikely that Confidential Computing will become a state-of-the-art data-security method in the field of data processing in the foreseeable future. In that case, controllers engaging in high-risk data analytics like the actors of our practical example would need to consider it at the very least.

V. How Does Confidential Computing Support Privacy-by-Design?

In our practical example outlined above, the actors involved must implement technical and organizational measures which take into account the implications of the right to the protection of personal data.36 To do so, they must first assess the likely effect of processing personal data on the rights and freedoms of the data subjects and then design the processing in such a way as to prevent or minimize the risk of interference with those rights and freedoms from collection to disposal of data.37 The measures taken must duly consider the technological state-of-the-art, the type and the scale of processing.38 This is commonly referred to as Privacy-by-Design.

Besides data security addressed in the previous section and the requirements for cross-border transfers addressed in the following section, key data protection principles for the actors in our practical example are purpose limitation, proportionality and data minimization, and accountability.

Purpose Limitation. The principle of purpose limitation provides that controllers must collect personal data only for explicit, specified and legitimate (primary) purposes – and not process such data in a way incompatible with those purposes.39 Consequently, processing for (secondary) purposes that were not explicit and specified upon collection of the data is permissible only if it is compatible with the primary purposes for which the controller originally collected the data. Purpose limitation is necessary to guarantee data subjects’ right to informational self-determination, one of the primary concerns of data protection law. It also sets apart personal data from non-personal data, as it gives data subjects a certain level of control over the use of their data. Further processing for compatible secondary purposes should not hamper the transparency, legal certainty, predictability, or fairness of the processing and should not be unexpected, inappropriate, or otherwise objectionable from the data subject’s perspective.40 Processing for statistical purposes41 is generally deemed compatible with any other legitimate primary purpose.42 This is provided that (i) other safeguards exist (e.g. anonymization or pseudonymization, except if retention of the identifiable form is necessary; provisions governing restricted access and communication of data; and other technical and organizational data-security measures) and (ii) the operations, in principle, exclude any use of the information obtained for decisions or measures concerning a particular individual (i.e. it can be used for non-personal purposes only).43 In addition to the above conditions resulting from data protection law, it should be noted here that controllers must also be mindful of specific prohibitions and conditions for such processing resulting from other areas of law.44

Purpose limitation and the rules on further processing for compatible secondary purposes are particularly relevant for data analytics involving personal data, as analytics is often a secondary purpose. From experience, cases where the analysis of (personal) data is part of the plan from the outset are the exception rather than the rule. More often, controllers only identify the potential after they have accumulated certain volumes of (personal) data arising from their day-to-day business. Also, analytics of personal data is usually conducted to characterize mass or collective phenomena in a considered population and to produce aggregated results, rather than to enable decisions or measures concerning particular individuals. In those cases, controllers may rely on the permissibility of data analysis as a secondary purpose, subject to the safeguards and limitations outlined above.

Securing compliance is particularly challenging in a multi-party analytics undertaking. Traditionally, all actors involved govern their respective rights and responsibilities in a Joint-Controller Agreement. To preserve data subject privacy, such agreements usually include restrictive confidentiality clauses with “least access”-principles, disclosure limitations and secure data disposal rules, and define the relevant statistical purposes as the sole permitted data processing purpose for all actors involved. Compliance with these contractual obligations is further regularly incentivized with penalties in case of a breach.

The most effective way to address both data subject privacy and purpose limitation is through pseudonymization or even anonymization. De-identified data poses a lower risk to data subject privacy and can hardly be used for non-permitted decisions or measures concerning individuals. However, retention of the identifiable form is often necessary for gaining certain insights. In that case, the risk of a data breach resulting from one of the other actors’ willful misconduct or negligence remains. In our practical example, analysis must be performed on the data in identifiable form: only retracing each individual patient’s itinerary from one healthcare provider to the next allows gaining the desired insights in the form of mass or collective phenomena in general patient behavior and treatment efficiency.45 The healthcare providers need to make sure that the personal data they provide to the other actors is subject to adequate safeguards. Since the data concerned is highly sensitive, the benchmark for measuring the adequacy of such safeguards must be very high, if not the highest state of the art.

Traditional safeguards to mitigate the third-party risk inherent to any multi-party analytics undertaking involving personal data in identifiable form are strongly based on trust, which is often lacking and thus impedes potentially fruitful data analytics collaborations.46 A data analytics solution using Confidential Computing technology could address this challenge, as it offers effective and technically enforced purpose limitation without sacrificing the benefits of performing analysis on personal data in identifiable form: users can only execute specific computations identified in advance, which can be defined so as to generate aggregated statistical results only.47 As the solution executes the agreed-upon computations as trusted processes inside the TEE and provides the results of the analysis, the user never has access to the underlying source data. A data analytics solution using Confidential Computing technology thereby fulfills all the requirements to legitimize processing for statistical purposes. First, even though the data is not technically anonymized – it persists in identifiable form inside the TEE – the protection provided by hardware-based encryption is equivalent. Second, restricted access is technically enforced, as only the trusted processes running inside the TEE have access to the source data, no one else. Third, if the allowed computations are defined to generate aggregated statistical results only, any use for decisions or measures concerning a particular individual is excluded by nature.
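
A simplified sketch of what such technically enforced purpose limitation could look like inside the TEE follows; the query names, the minimum group size and the notion of an allowlist are illustrative assumptions, not features of any specific product:

```python
# Illustrative: only computations agreed in advance are executable, and each
# returns aggregated statistics only (cf. fn. 47 on k-anonymity).
from statistics import mean

def avg_treatment_cost(rows):
    return {"avg_cost": mean(r["cost"] for r in rows), "n": len(rows)}

def avg_treatment_duration(rows):
    return {"avg_days": mean(r["days"] for r in rows), "n": len(rows)}

# Fixed when the TEE is provisioned; remote attestation lets every data
# provider verify that exactly this code (and nothing else) is running.
APPROVED = {"avg_cost": avg_treatment_cost, "avg_days": avg_treatment_duration}
MIN_GROUP_SIZE = 20  # suppress results computed over too-small groups

def run_query(name: str, rows: list) -> dict:
    if name not in APPROVED:           # purpose limitation, enforced in code
        raise PermissionError("computation not approved by the parties")
    if len(rows) < MIN_GROUP_SIZE:     # exclude measures concerning individuals
        raise ValueError("result suppressed: group too small")
    return APPROVED[name](rows)        # callers receive aggregates, never rows
```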

Principles of Proportionality and Data Minimization. The principle of proportionality provides that processing of personal data shall be proportionate in relation to the legitimate purpose pursued and reflect at all stages of the processing a fair balance between all interests concerned.48 Processing should not lead to a disproportionate interference with the interests, rights and freedoms of the data subjects. The principle of proportionality is to be respected at all stages of processing, including at the initial stage, i.e. when deciding whether or not to carry out the processing and how.49 Consequently, controllers must use those means for the processing that best preserve data subjects’ right to privacy while still allowing them to reasonably achieve the purposes of the contemplated processing.50

The principle of proportionality also implies that data undergoing processing shall be adequate, relevant and not excessive in relation to the purposes for which it is processed (data minimization).51 The practical importance and consequences of data minimization are real: purging bulk data sets and removing any data contained in them that is not necessary for the relevant purposes is often burdensome. This issue may, for instance, be tackled by using privacy-enhancing technology to avoid using personal data at all, or at least to reduce the ability to attribute data to a data subject, resulting in a privacy-friendly solution.52

With the principle of proportionality in mind, the healthcare providers in our practical example must ensure that they do not provide personal data to the other actors involved where this is not necessary to achieve the intended purpose. As outlined in the preceding sections, Confidential Computing offers major advantages in terms of data security compared to a traditional data analytics solution and technically enforces purpose limitation. This significantly reduces the interference with the interests, rights and freedoms of the data subjects and positively impacts the balance of interests in favor of the undertaking. Furthermore, a data analytics solution using Confidential Computing technology allows all actors involved to perform analysis on their joint data in identifiable form without the ability to access source data inside the TEE and attribute it to a specific data subject. Consequently, none of the actors actually transfers personal data to, and, conversely, none of the actors actually collects personal data from, the others. Thereby, Confidential Computing effectively facilitates data minimization.

Accountability. The principle of accountability requires that controllers shall be able to demonstrate compliance of their data processing with applicable data protection principles and requirements. The essence of accountability is the controllers’ obligation to put in place measures which would – under normal circumstances – guarantee that data protection rules are adhered to in the context of processing operations; and have documentation ready which demonstrates to data subjects and to supervisory authorities the measures that have been taken to achieve compliance with those data protection rules.53 The principle of accountability thus requires controllers to actively demonstrate compliance and not merely wait for data subjects or supervisory authorities to point out shortcomings.54

There are various instruments to facilitate compliance with this requirement, including maintaining a record of processing activities; designating a data protection officer (DPO) who is involved in all data protection matters; adhering to approved codes of conduct or certification mechanisms; or conducting data protection impact assessments (DPIAs) for processing activities that are likely to result in a high risk to the rights and freedoms of the data subjects.55 All but the latter will help controllers ensure compliant operations in general. Conducting DPIAs will further help controllers ensure compliance for specific data processing activities. However, while generally considered “best practice” to facilitate compliance, none of these traditional measures can guarantee or provide proof of compliance in practice. A data analytics solution using Confidential Computing technology marks a step-change compared to traditional accountability measures. Through remote attestation, it allows controllers to demonstrate that compliance was indeed guaranteed throughout the entire analytics undertaking.

Overall, Confidential Computing strongly supports compliance with various key data protection principles and requirements in a technically enforced and demonstrable way and thereby epitomizes Privacy-by-Design.

VI. How Does Confidential Computing Enable Cross-Border Data Transfers?

Data protection laws generally follow the principle of territoriality.56 Accordingly, different rules may apply depending on the location where processing of personal data takes place. Personal data may only be transferred abroad if adequate (i.e. equivalent) protection of data subjects’ rights is guaranteed in the country of destination. Where the applicable laws in the country of destination do not provide adequate data protection, the data exporter needs to implement additional safeguards.57 Traditionally, data exporters have relied on contractual arrangements with the data importer to achieve adequate protection.

However, in July 2020, in the so-called Schrems II ruling, the Court of Justice of the European Union (CJEU) held that contractual arrangements between a data exporter and a data importer alone do not always provide effective protection.58 Where the laws in the country of destination authorize government agencies to seek access to personal data within its territory and such legal authorization does not satisfy certain essential guarantees under the Charter of Fundamental Rights of the European Union, those laws void the protections provided by any contractual arrangements.59 Thus, where a data exporter has reason to believe that applicable laws in the country of destination might prevent the data importer from complying with its contractual obligations, it may no longer rely solely on contractual safeguards. To legitimize data transfers to such countries, the data exporter must take additional technical and organizational measures. The Schrems II ruling is binding on data exporters in the EU/EEA countries, and the Federal Data Protection and Information Commissioner (FDPIC) adopted the views taken by the CJEU for Switzerland.60

Following the Schrems II ruling, data protection authorities in Switzerland and the EU issued further guidance specifying the requirements for cross-border data transfers as outlined by the CJEU.61 In a nutshell, where local law in the country of destination authorizes government agencies to seek access to personal data within their territory in disregard of the essential European guarantees referred to above, data exporters must ensure that personal data remains encrypted at all times, making it effectively impossible (not just prohibited) for the data importer to hand the data over to government agencies in the country of destination upon their order. This is difficult to achieve for any type of processing other than mere data storage. Meanwhile, the European Data Protection Board has softened its position on the requirements for cross-border data transfers affected by the Schrems II ruling. In its final guidance, it allows data exporters to consider not only the powers of government agencies, but also the likelihood that such agencies will actually make use of those powers to access the data concerned.62 However, the FDPIC in Switzerland and data protection authorities in certain EU countries stick to their restrictive interpretation of the Schrems II ruling. Consequently, legal uncertainty remains regarding the requirements for cross-border transfers to countries that do not provide adequate data protection.

From experience, the legal uncertainty surrounding the conditions for cross-border transfers of personal data often stands in the way of data collaborations between European and non-European countries. The healthcare providers in our practical example might thus see an issue in the participation of global pharmaceutical companies, in particular those based in the US.

The concern about disproportionate powers of government agencies to access data may be addressed by a data analytics solution using Confidential Computing technology. As described above, no one has access to the source data inside the TEE, neither the actors actively involved in the analytics undertaking, nor any other parties passively involved, such as the solution, platform or infrastructure providers. Data inside the TEE is protected through hardware-based encryption and may only be decrypted by trusted processes. If ordered by any government agency to disclose the healthcare providers’ data, none of the parties actively or passively involved would be able to obey such an order, since they do not have access to the data. The same is true for orders to disclose decryption keys, since none of the parties have access to the keys either.63 Consequently, Confidential Computing fulfills the requirements outlined in the Schrems II ruling – even if interpreted restrictively – and enables joint data processing with actors based in the US and other countries that do not provide adequate data protection.
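
Schematically, the key arrangement can be pictured as follows. This sketch (using the third-party cryptography package) is an assumption-laden simplification of vendor-specific key-provisioning protocols, but it illustrates why disclosure orders directed at the parties involved must fail: the enclave’s private key never exists outside the TEE.

```python
# Schematic simplification (illustrative names; real TEE key provisioning is
# vendor-specific): data is encrypted to a key pair whose private half
# exists only inside the TEE.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(shared_secret: bytes) -> bytes:
    """Turn an ECDH shared secret into a 32-byte AES key."""
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"tee-data-key").derive(shared_secret)

# Inside the TEE: the key pair is generated in-enclave; only the public half
# leaves, bound to an attestation report so providers know what they encrypt to.
enclave_private = X25519PrivateKey.generate()
enclave_public = enclave_private.public_key()

# Data provider: encrypt with an ephemeral key, then discard it, so that the
# provider itself can no longer decrypt and cannot obey a disclosure order.
provider_private = X25519PrivateKey.generate()
provider_public = provider_private.public_key()  # shipped with the ciphertext
key = derive_key(provider_private.exchange(enclave_public))
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"sensitive patient rows", None)
del provider_private, key                        # ephemeral material discarded

# Inside the TEE: the same key is re-derived and the data decrypted for the
# approved analysis; the private key never leaves the enclave.
tee_key = derive_key(enclave_private.exchange(provider_public))
assert AESGCM(tee_key).decrypt(nonce, ciphertext, None) == b"sensitive patient rows"
```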

VII. Conclusion

Technological progress in recent years enables sophisticated data analytics. Accordingly, the desire to leverage its benefits is strong among private and public sector players alike. However, accessing the most advanced technology often comes at the price of being reliant on specialized third-party providers and losing immediate control over data processing infrastructure and services. The inherent risk of externalizing the processing of personal data is further aggravated by new forms of cyber-attacks. At the same time, legislators across the globe – most notably in Europe – are trying to keep pace with this rapid evolution by raising the bar on data compliance. In addition, organizations like Maximilian Schrems’ NGO None Of Your Business have made data protection a matter of general concern by dragging malefactors in front of the courts and into the media spotlight. Consequently, processing of personal data bears important regulatory and reputational risks, even more so for sensitive personal data such as health data. To manage those risks, data controllers need a robust compliance strategy and framework. However, traditional (organizational) measures are reaching their limits with increasing reliance on external data processors and the sharing of data with third parties, as they are strongly built on trust. The more people (potentially) have access to data, the more difficult it becomes to implement, monitor, and eventually enforce such measures.

In this context, Confidential Computing provides built-in and automatically enforced measures to ensure compliance, complementing the less reliable measures based on trust. Hardware-based encryption of data in use drastically reduces the number of people with (potential) access to data and strongly improves data security. This also addresses concerns about disproportionate government access to data in jurisdictions that do not provide adequate data protection from a European perspective. Furthermore, the possibility to pre-define authorized computations and the increased guarantees of code integrity mitigate the risk of data misuse. Finally, remote attestation allows data controllers to demonstrate actual compliance and thereby exceed legal accountability requirements. With its hardware-based approach to information security, Confidential Computing incorporates the most recent recommendations of cybersecurity experts, and renowned opinion-makers expect privacy-enhancing computation techniques like Confidential Computing to be widely used by 2025. As data protection laws require state-of-the-art measures to secure data subjects’ rights, even the notoriously technophobic legal profession should keep an eye on relevant technological trends. Confidential Computing is one worth following more closely for data protection officers and compliance managers in any data-driven organization.

* This article represents the author’s personal views on a technical matter of law. The author thanks David Sturzenegger, PhD, of dq technologies Ltd (better known as Decentriq) for inputs on the technological background. Decentriq is a founding member of the Confidential Computing Consortium.
1 Michael Palmer, Data is the New Oil, November 3, 2006 (blog post at https://ana.blogs.com/maestros/2006/11/data_is_the_new.html, January 31, 2022).
2 Will Smale, The couple who helped transform the way we shop, November 24, 2014 (BBC News at https://www.bbc.com/news/business-30095454, January 31, 2022).
3 Palmer, ibid. fn. 1.
4 Modernized Convention for the Protection of Individuals with Regard to the Processing of Personal Data of the Council of Europe (SEV 108+).
5 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data.
6 The SEV 108+ redefined the key principles and set a new benchmark for the data protection laws of the CoE member states. Almost simultaneously, the EU translated those principles into law by enacting the GDPR. Switzerland will follow with a four-year delay when the revised Federal Act on Data Protection (revFADP) enters into force later this year. While the GDPR and local laws in the EU and Switzerland may have higher practical relevance locally, the SEV 108+ is arguably the most important source of data protection law globally. Therefore, in this article, we assess the general data protection principles under the SEV 108+. For ease of reference, the corresponding provisions of the GDPR and the revFADP are included in the relevant footnotes.
7 DLA Piper provides a comprehensive overview of data protection legislation around the globe on its data protection practice website (at https://www.dlapiperdataprotection.com/index.html?t=world-map&c=CA, January 31, 2022).
8 In 2012, the US Department of Justice issued a USD 4.5 billion fine against oil giant BP for its shortcomings leading up to the Deepwater Horizon catastrophe of 2010. That equals around 1.5% of BP’s annual turnover in the year of the catastrophe. In comparison, the GDPR foresees fines of up to 4% of the infringer’s group-wide annual turnover. Note, however, that the potential claims for damages resulting from data protection infringements are not likely to come close to the civil claims successfully brought against BP in connection with Deepwater Horizon. The overall compliance risk including civil claims exposure is difficult to compare.
9 The practical example is designed to link to all the data protection principles and requirements assessed in this article. Therefore, it may seem far-fetched. However, from practice the author is aware of one actual case that is quite similar and of various cases that include individual compliance challenges assessed in this article.
10 Art. 7 SEV 108+; see also art. 32 GDPR, and art. 8 revFADP.
11 Art. 10 para. 3 SEV 108+; see also art. 25 GDPR, and art. 7 revFADP.
12 Art. 14 para. 2 SEV 108+; see also art. 44 et seqq. GDPR, and art. 16 et seqq. revFADP.
13 The Confidential Computing Consortium, Confidential Computing: Hardware-Based Trusted Execution for Applications and Data, July 2020, p. 3 (online publication at https://confidentialcomputing.io/wp-content/uploads/sites/85/2020/06/ConfidentialComputing_OSSNA2020.pdf, January 31, 2022).
14 NIST National Institute of Standards and Technology, Computer Security Resource Centre (online glossary at https://csrc.nist.gov/glossary/term/encryption, January 31, 2022).
15 The Confidential Computing Consortium, ibid. fn. 13, p. 5.
16 E.g. Microsoft’s online data analytics service Power BI uses encryption for data at rest and in transit on Microsoft servers. It also enforces encryption for incoming network connections from external data sources. On data in use, however, Microsoft states in its online documentation: “Power BI loads actively processed data into the memory space of one or more service workloads. To facilitate the functionality required by the workload, the processed data in memory is not encrypted.”
17 Ellison Anne Williams, Data in Use Is the Point of Least Resistance, in Security Week, April 26, 2019 (blog post at https://www.securityweek.com/data-use-point-least-resistance, January 31, 2022).
18 NIST National Institute of Standards and Technology, Hardware-Enabled Security: Enabling a Layered Approach to Platform Security for Cloud and Edge Computing Use Cases, 2nd Draft, December 2021, N310 et seqq. (online publication at https://doi.org/10.6028/NIST.IR.8320-draft2, January 31, 2022).
19 Bruce Schneier, The New Way Your Computer Can Be Attacked, in The Atlantic, January 22, 2018 (online at https://www.theatlantic.com/technology/archive/2018/01/spectre-meltdown-cybersecurity/551147/, January 31, 2022).
20 NIST, Hardware-Enabled Security, ibid. fn. 18, N273 et seqq.
21 The Confidential Computing Consortium, ibid. fn. 13, p. 5.
22 Apple Platform Security (online at https://support.apple.com/guide/security/secure-enclave-sec59b0b31ff/web, January 31, 2022).
23 Samsung Newsroom (online at https://news.samsung.com/global/samsung-introduces-best-in-class-data-security-chip-solution-for-mobile-devices, January 31, 2022).
24 The Confidential Computing Consortium, ibid. fn. 13, p. 8 et seqq.
25 Gartner, Top Strategic Technology Trends for 2022 (e-book available at https://www.gartner.com/smarterwithgartner/gartner-top-security-and-risk-trends-for-2021, January 31, 2022).
26 Mohamed Sabt, Mohammed Achemlal and Abdelmadjid Bouabdallah, Trusted Execution Environment: What It Is, and What It Is Not, 14th IEEE International Conference on Trust, Security and Privacy in Computing and Communications, August 2015, Helsinki, Finland. The authors derive the following desired properties of a TEE from a variety of other definitions: A TEE is a tamper-resistant processing environment that runs on a separation kernel. It guarantees the authenticity of the executed code, the integrity of the runtime states (e.g. CPU registers, memory and sensitive I/O), and the confidentiality of its code, data and runtime states stored on a persistent memory. In addition, it shall be able to provide remote attestation that proves its trustworthiness for third parties. The TEE resists all software attacks as well as physical attacks performed on the main memory of the system. Attacks performed by exploiting backdoor security flaws are not possible.
27 Other unauthorized entities may include other applications on the host, the host operating system and hypervisor, service providers, and the infrastructure owner – or anyone else with physical access to the hardware.
28 The Confidential Computing Consortium, ibid. fn. 13, p. 5.
29 The Confidential Computing Consortium, ibid. fn. 13, p. 7.
30 Art. 7 para. 1 SEV 108+; see also art. 32 GDPR, and art. 8 revFADP.
31 European Union Agency for Fundamental Rights and Council of Europe, Handbook on European data protection law, 2018, p. 101; David Rosenthal, Das neue Datenschutzgesetz, in: Jusletter 16 November 2020, N15 et seq.
32 Rosenthal, Das neue Datenschutzgesetz, ibid. fn. 31, N53 et seq.
33 Council of Europe, Explanatory Report to the Protocol amending the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, N63; Rosenthal, Das neue Datenschutzgesetz, ibid. fn. 31, N55.
34 For a lawyer-friendly definition of what the “Cloud” is, see David Rosenthal, Cloud 101 für Juristinnen und Juristen, presentation slides (online at https://www.rosenthal.ch/downloads/Rosenthal-Cloud101.pdf, January 31, 2022).
35 That does not automatically mean that local deployments provide better security; to get state-of-the-art security, it may even be better to rely on Cloud-based solutions.
36 Art. 10 para. 3 SEV 108+; see also art. 25 GDPR, and art. 7 revFADP.
37 Art. 10 para. 2 SEV 108+; see also art. 25 GDPR, and art. 7 revFADP.
38 Handbook on European data protection law, ibid. fn. 31, p. 183.
39 Art. 5 para. 4 let. b SEV 108+; see also art. 5 para. 1 let. b GDPR, and art. 6 para. 3 revFADP.
40 Explanatory Report, ibid. fn. 33, N49.
41 See the definition in the Explanatory Report, ibid. fn. 33, N50: “Statistical purposes” refers to the elaboration of statistical surveys or the production of statistical, aggregated results. Statistics aim at analyzing and characterizing mass or collective phenomena in a considered population.
42 Art. 5 para. 4 let. b SEV 108+; see also art. 5 para. 1 let. b GDPR; there is no corresponding provision in Swiss law that explicitly authorizes processing for statistical purposes; however, see Rosenthal, Das neue Datenschutzgesetz, ibid. fn. 31, N36: another classic example of processing that is normally compatible with the originally stated purpose is the pseudonymization or anonymization of data in order to use it for some other purpose.
43 Explanatory Report, ibid. fn. 33, N50.
44 Most notably, for the life sciences sector in Switzerland, the conditions set forth in the Federal Act on Research involving Human Beings (HRA, SR 810.30).
45 In theory, pseudonymization would be possible. This would, however, require that all actors involved use the same logic to pseudonymize their data prior to the analysis, so that patient X in the records of healthcare provider A is also patient X in the records of healthcare providers B, C, etc. This is complex and only solves part of the problem.
46 Another issue is the restrictions on sharing patient data under professional secrecy rules. This article is focused on data protection laws, and professional secrecy restrictions are thus not assessed. However, since Confidential Computing guarantees data confidentiality – including among the actors involved – it is obvious that Confidential Computing is also a very interesting technology from a professional secrecy perspective.
47 For instance, it is possible to configure a data analytics solution to implement the concept of k-anonymity.
48 Art. 5 para. 1 SEV 108+; see also art. 5 para. 1 let. a GDPR (fairness), and art. 6 para. 2 revFADP.
49 Explanatory Report, ibid. fn. 33, N40.
50 Handbook on European data protection law, ibid. fn. 31, p. 40.
51 Explanatory Report, ibid. fn. 33, N40.
52 Handbook on European data protection law, ibid. fn. 31, p. 126.
53 Article 29 Working Party, Opinion 3/2010 on the principle of accountability, WP 173, Brussels, July 13, 2010.
54 Handbook on European data protection law, ibid. fn. 31, p. 134.
55 Handbook on European data protection law, ibid. fn. 31, p. 135.
56 Note that this is not without reservation. Art. 3 para. 2 GDPR provides exceptions for situations where the GDPR shall apply beyond EU borders. In Switzerland, the Federal Supreme Court already applied the effects doctrine (Auswirkungsprinzip) in the context of data protection in 2012 (BGE 138 II 346; Google Street View). This will be reflected in art. 3 para. 2 revFADP, which provides an even broader exception than those in the GDPR; cf. also Rosenthal, Das neue Datenschutzgesetz, ibid. fn. 31, N88 et seq.
57 Art. 14 para. 2 SEV 108+; see also art. 44 et seqq. GDPR, and art. 16 et seqq. revFADP.
58 Judgement of the Court (Grand Chamber) of July 16, 2020, in case C-311/18 (Facebook Ireland – Maximilian Schrems).
59 European Data Protection Board, Recommendations 02/2020 on the European Essential Guarantees for surveillance measures, adopted on November 10, 2020.
60 Eidgenössischer Datenschutz- und Öffentlichkeitsbeauftragter, Stellungnahme zur Übermittlung von Personendaten in die USA und weitere Staaten ohne angemessenes Datenschutzniveau i.S.v. Art. 6 Abs. 1 DSG, September 8, 2020.
61 European Data Protection Board, Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data, Version 1.0, November 10, 2020; Eidgenössischer Datenschutz- und Öffentlichkeitsbeauftragter, Anleitung für die Prüfung der Zulässigkeit von Datenübermittlungen mit Auslandbezug, June 2021.
62 European Data Protection Board, Recommendations 01/2020, Version 2.0, June 18, 2021: [Exporters] may decide to proceed with the transfer without implementing supplementary measures if [they] consider and are able to demonstrate and document that [they] have no reason to believe that relevant and problematic legislation will be interpreted and/or applied in practice so as to cover [their] transferred data and importer.
63 Notwithstanding any debate around the limits (or limitlessness) of the powers of government agencies, in particular in the US, Confidential Computing adds an extra layer of protection in that regard. In fact, the only party that does have access to the decryption keys is the hardware manufacturer. The latter is (usually) neither actively nor passively involved in the processing (unless, in the unlikely event, it also acts as solution, platform or infrastructure provider).