Blockchains and the EU’s GDPR – An unlikely match

The European Union (“EU”) has an opportunity to lead the world in the adoption of blockchain networks by adjusting its data protection regulations to benefit EU citizens. 

Unfortunately, current EU data privacy regulations are rooted in philosophies fundamentally different from blockchain-based approaches. These contrasts are exemplified in the EU’s General Data Protection Regulation (“GDPR”). The GDPR presumes centralized forms of data management with a clearly identifiable controller. The EU views centralized, governmental authority as essential in protecting consumers and their information against the abuses of private actors, particularly massive data-driven technology companies such as Google and Facebook. 

However, privacy rights are well protected by advanced cryptography and immutable, decentralized ledger systems of storage. Immutable ledgers provide individuals the ability to consent to and track third party use of private information. Customers can see when their information was accessed and by whom. This is because the blockchain offers a method of tracking data without central party manipulation. For this reason, the continued development of blockchain applications provides an avenue for GDPR regulators to achieve their privacy objectives while embracing this emerging technology. 

Erasure laws offer EU regulators a tremendous opportunity to begin the blockchain revolution, benefit citizens of the EU, and lead the world in the coexistence of blockchain technology and the legal field.


Blockchain is as significant as the internet was 25 years ago and will profoundly change our world. Blockchain offers a permanent, tamper-proof record outside the control of any single governing authority. An intimidating subject, blockchain, at its most basic, is a chain of blocks. However, one must adjust the traditional interpretation of “block” and “chain”: here, a block refers to a package of digital information, while the chain is a public database storing that information. 

Blocks are composed of three parts: digital information, a hash identifying the block, and a hash of the previous block. 

The digital information includes details about the transactions, such as time, date, and dollar amounts, and the participants in those transactions. In the Bitcoin blockchain, for example, the digital information relates to the buying and selling of bitcoins. Each block is given a unique code, called a “hash,” to label and distinguish it. The system creates a hash footprint for the particular transaction, to which it assigns a non-modifiable timestamp. A cryptographic hash function creates hash codes by transforming digital information into a fixed-length string of numbers and letters (such as 8743b52063cd84097a65d1633f5c74f5). Each block contains its own identifying hash and the hash of the previous block. If the information within a block is altered in any way, its hash code changes. As a result, data entered into the blockchain, once validated by the nodes, can no longer be modified or deleted: if anyone modifies the information, the recomputed hash no longer matches the one recorded by the rest of the ledger, and the altered block is obvious.
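
This tamper-evidence can be sketched in a few lines of Python; the `block_hash` helper and the sample transaction strings are illustrative (Bitcoin actually applies SHA-256 twice over a binary block header rather than once over text):

```python
import hashlib

def block_hash(data: str) -> str:
    """Condense a block's contents into a fixed-length hex fingerprint."""
    return hashlib.sha256(data.encode()).hexdigest()

original = block_hash("2020-08-14 | Alice pays Bob 0.5 BTC")
tampered = block_hash("2020-08-14 | Alice pays Bob 5.0 BTC")

print(original)               # 64 hex characters
print(original != tampered)   # True: any edit produces a different hash
```

The same input always yields the same hash, while even a one-character change yields a completely different one, which is why an altered block stands out.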

To add a new block to the chain, four events must occur: (1) a transaction takes place; (2) a network of computers verifies the details of that transaction; (3) the transaction is compiled into a block with other transactions; and (4) the block is given a hash.

New blocks can only be added to the “end” of the blockchain, linearly and chronologically. After a block is added to the end of the chain, it is essentially immutable, as it contains its own hash, as well as the hash of the previous block. The previous block contained its own hash, as well as that of the block before it. Therefore, the only possible method of discreetly altering a block’s contents is to recompute the hash of every subsequent block in the chain. 
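
The chaining described above can be expressed as a minimal sketch; the `Block` class and `build_chain` helper are illustrative simplifications (real blocks also carry timestamps, nonces, and Merkle roots):

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Block:
    data: str
    prev_hash: str

    @property
    def hash(self) -> str:
        # A block's identity covers its data AND the previous block's hash,
        # which is what links the chain together.
        return hashlib.sha256((self.data + self.prev_hash).encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64           # the genesis block points at zeros
    for record in records:
        chain.append(Block(record, prev))
        prev = chain[-1].hash
    return chain

def is_valid(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        if block.prev_hash != prev:      # a broken link exposes tampering
            return False
        prev = block.hash
    return True

chain = build_chain(["tx 1", "tx 2", "tx 3"])
print(is_valid(chain))        # True
chain[1].data = "tx 2 (altered)"
print(is_valid(chain))        # False: every later link no longer matches
```

Editing a middle block changes its hash, so the next block's stored `prev_hash` no longer matches, and validation fails from that point onward.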

When a block is added to the chain, it is accessible to all members of the specific blockchain. While transactions on the blockchain are not anonymous, personal information about users is limited to their digital signature or username. There is no access to other identifying information about users.

Trust in a blockchain network is critical. As blockchain requires consensus among users to verify transactions, these users must be vetted. Blockchain networks implement tests for computers desiring to add blocks to the chain. These tests are known as “consensus models” and require users to prove their trustworthiness before participating in the network. 

After being accepted as part of the network, computers are referred to as “nodes” and may verify transactions and add blocks to the chain. For a public blockchain, the decision to add a transaction to the chain is made by consensus: the majority of the computers in the network (nodes) must agree the transaction is valid. This process of verification is called mining. By verifying transactions, miners help prevent the “double-spending problem,” a scenario in which a bitcoin owner illicitly spends the same bitcoin twice. 

To mine Bitcoin, a node must satisfy two conditions: “One is a matter of effort; one is a matter of luck. They verify ~1MB worth of transactions. This is the easy part. You have to be the first miner to arrive at the right answer, or closest answer, to a numeric problem. This process is also known as proof of work.”

Bitcoin’s proof-of-work system is an example of a consensus model. In this system, computers “prove” they have done “work” by solving a complex computational math problem, showing that a certain amount of computational effort has been expended. The proof-of-work problems are difficult, with the odds of any single attempt succeeding astronomically low (one in 17.56 trillion as of August 2020). If a computer solves the problem, it is accepted into the network.
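
The idea can be illustrated with a toy proof-of-work loop; a difficulty of four leading zeros stands in for Bitcoin's vastly harder real target, and the function name and block text are illustrative:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Brute-force a nonce whose SHA-256 digest starts with `difficulty` zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce          # proof that work was expended
        nonce += 1

nonce = mine("block 42: Alice pays Bob 0.5 BTC")
winning = hashlib.sha256(f"block 42: Alice pays Bob 0.5 BTC{nonce}".encode()).hexdigest()
print(nonce, winning)  # the winning digest begins with "0000"
```

Finding the nonce takes many attempts, but anyone can verify the answer with a single hash, which is what makes the scheme useful for consensus.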

Once the network verifies a transaction, a new block is added to the existing blockchain and the miner receives a reward. These rewards are typically an amount of cryptocurrency from the blockchain being verified (i.e. Bitcoin). The Bitcoin reward incentivizes users to assist in the primary purpose of mining: to legitimize and monitor Bitcoin transactions, ensuring their validity. 

Blockchain networks can operate on private or public databases. Private databases are limited to certain groups of users (i.e. members of a Walmart supply chain) while public databases allow anyone with an internet connection to participate (i.e. Bitcoin). 

Blockchain users are identified through codes known as keys. There are private keys and public keys. For the Bitcoin blockchain, a private key is a 256-bit number, which can be represented several ways, typically 64 characters in the range 0-9 or A-F. The keys ensure public database users can access transaction details but not identifying information about the users. The private key is only known to the user and is used to access cryptocurrency, while their public key is displayed on transactions. 
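
A small sketch of the key formats described above; note that real Bitcoin wallets derive the public key from the private key with elliptic-curve multiplication (secp256k1), for which a hash merely stands in here to show the one-way relationship:

```python
import hashlib
import secrets

# A private key is just a 256-bit random number, conventionally shown
# as 64 hexadecimal characters (0-9, a-f).
private_key = secrets.token_hex(32)        # 32 bytes = 256 bits

# Stand-in for the one-way derivation of a public identifier: easy to
# compute forward, infeasible to reverse. (Bitcoin really uses
# secp256k1 elliptic-curve math, not SHA-256, for this step.)
public_key = hashlib.sha256(bytes.fromhex(private_key)).hexdigest()

print(private_key)   # known only to the user
print(public_key)    # displayed on transactions
```

The asymmetry is the point: the public key can be shown on every transaction without revealing the private key that controls the funds.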

It is a common misperception that public blockchain networks like Bitcoin are anonymous; because keys themselves carry no identifying information but remain consistent across transactions, individuals are merely pseudonymized. Direct identification of individual users of a typical blockchain network is not possible, but indirect identification is. For example, receiving payment from a known public address (e.g. an employer’s) combined with spending patterns (e.g. gym payments, coffee shops) could identify an individual through pattern recognition.

Blockchain’s potential as a decentralized form of record-keeping is almost without limit. From greater user privacy and heightened security, to lower processing fees and fewer errors, blockchain technology will see applications beyond the imagination. 

However, blockchains must address concerns from both the legal community and the general public before they can be utilized in the way most beneficial to society. These concerns include, among others, individual privacy rights in an immutable and pseudonymised environment.

European Privacy Laws

The General Data Protection Regulation’s “right to be forgotten” laws govern how personal data is collected, processed, and erased. The regulation applies to entities processing personal data operating in the EU and those who offer their products or services to EU residents. 

The GDPR’s erasure of personal data regulations appears irreconcilable with the immutability of blockchains. However, the opposite is true; the two are a perfect pair.

The European Union

Legislative History

Control over personal information is a fundamental aspect of the GDPR. First discussed in the EU in 1995, regulations involving erasure arose to minimize stigmatization based on past events, such as criminal convictions or foreclosures. This concept is referred to as “the right to be forgotten.” The concept arises from pre-existing attitudes of influential European countries. There is a longstanding belief in the United Kingdom that after a certain period of time, information regarding an individual should not be considered when obtaining insurance or seeking employment. Due to their early 20th-century history, Germany and Austria understood the potential for abuse of power from the availability of copious amounts of sensitive information. Similarly, France values this right – le droit à l’oubli (the right to be forgotten) – officially recognizing it in French law in 2010.

In 1995, the EU adopted the European Data Protection Directive (“1995 Directive”) to regulate the processing of personal data. In Article 12 of the 1995 Directive, the EU created a legal pillar of Internet protection for individuals by legislating the “Right of Access,” stating “Member States shall guarantee every data subject the right to obtain from the controller as appropriate the…erasure…of data.” 

The 1995 Directive set the precedent for the right of erasure provision contained in the GDPR. In 2012, the European Commission disclosed a draft European Data Protection Regulation to supersede the 1995 Directive, including specific protection for the right to be forgotten. The right to be forgotten was later replaced by a more limited right of erasure in Article 17 of the GDPR, adopted by the European Parliament and becoming EU law in 2016. Article 17 codifies an explicit right to erasure of personal data when certain conditions – found in Recitals 65 and 66 – are met. It states, “The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay,” when certain conditions apply. These conditions include, among others: the personal data is no longer necessary for the purpose for which an organization originally collected or processed it; the organization must erase the personal data in order to comply with a legal ruling or obligation; or the organization relies on an individual’s consent as the lawful basis for processing and that individual withdraws their consent.

Members of the public can request that any organization remove their information, and the recipient has one month to respond. However, an organization’s right to process someone’s data might override the individual’s right to be forgotten. Many reasons are cited, but the key grounds trumping the right to erasure include: the data is being used to exercise the right of freedom of expression and information, or the data is being used to perform a task carried out in the public interest.

Article 85 of the GDPR obligates EU Member States to pass national legislation balancing the right to personal data protection with the right to freedom of expression and information. Unfortunately, it is an unfair battle. Organizations can defend themselves from individuals making requests only by demanding a “reasonable fee,” or by denying a request to erase personal data if they can show the request is manifestly unfounded or excessive. 

Thankfully, the new European Proposal for the GDPR also provides protection and exemption for companies listed as “media” companies, like newspapers and other journalistic work. However, collectors and processors of data – such as Google – are classified as “data controllers” under the meaning of the EU data protection directive. These “data controllers” are not exempt under the GDPR and are thus required under EU law to remove “inadequate or no longer relevant” data.

Case Law

Wikimedia – 2009

In 2009, lawyers for a convicted murderer sent the company behind Wikipedia (Wikimedia Foundation) a cease and desist letter requesting their client’s name be removed from a Wikipedia article, citing a decision allowing the suppression of a criminal’s name in news accounts once he is released from custody. 

The European Court of Human Rights agreed with the lower court’s decision to reject the request to ban publication of the killer’s name, saying, “the approach to covering a given subject was a matter of journalistic freedom, and reporting individualised information was an important aspect of the press’s work.”

Google – 2014

The landmark decision in 2014 saw the European Court of Justice (“ECJ”) give individuals the right to ask search engines to remove results. The Court achieved this by recognizing search engines as data controllers under Article 2 of the Data Protection Directive. This case set the precedent for the previously mentioned right of erasure provision in Article 17 of the GDPR. The case – Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (“Google Spain”) – was brought by a Spanish man, Mr. González, who requested the removal of a link to a digitized newspaper article about an auction for his foreclosed home, held over a debt he had subsequently paid. Mr. González complained to the Spanish Agency of Data Protection (“the Agency”), but the Agency rejected the claim on the grounds the article was lawful and accurate. The Agency did, however, accept a complaint against Google and demanded the search engine remove the results. Google sued in Spain, and the Spanish court referred a series of questions to the ECJ. 

Finding for Mr. González and the Agency, the ECJ ruled that European citizens have the right to request that commercial search firms gathering personal information for profit remove links to private information, provided the information is no longer relevant. The ECJ ruled individual rights may override the freedom of information of online users, so long as the balance between these fundamental rights is assessed on a case-by-case basis. When attempting to strike that balance, two factors must be taken into consideration: first, the nature of the information in question and its sensitivity for the data subject’s private life; and second, the public’s interest in accessing the information, which may vary depending on the data subject’s role in public life.

Following this decision, Google created an internal team to review takedown requests and delete links within the EU, as most EU Google searches occur on country-specific sites for France, Italy, and the other member states. 

However, in an increasingly global world with access to location-altering technology such as VPNs, the issue of applying EU regulation to non-EU sites was inevitable. In late 2019, the ECJ issued two judgments pertaining to the right to be forgotten as established in Google Spain. Together they complement the ECJ’s Google Spain decision in two crucial aspects.

Google France – 2019

In 2019, the ECJ found search engines are not exempt from compliance with EU data privacy laws, as a data subject now has the right under certain conditions to ask search engines to remove links with personal data. The subject must claim and prove the data is inaccurate, irrelevant, or excessive for the purposes of the processing; is not kept up to date; or is kept for longer than is necessary. The court ruled a search engine operator must only verify the lawfulness of its processing of sensitive data upon receiving a request for de-referencing; search engines are not required to be proactive in this regard. 

This ruling was a result of the case of GC, AF, BH, and ED v CNIL, Premier ministre, and Google LLC (“Google France”), concerning four individuals who requested Google stop listing websites containing unflattering content about them in its search results. The common thread between the parties was the listed websites’ inclusion of special categories of personal data within the meaning of Article 8(1) and (5) of the now-repealed Data Protection Directive 95/46/EC. Google refused to comply with their requests, arguing the personal data of the four individuals, although sensitive, was important to the public interest and should remain available online. After the French data protection authority (CNIL) upheld Google’s refusal, the applicants brought the case to the French council of state (Conseil d’État), which in turn referred a list of questions to the ECJ. The referring court wanted to know whether Google is a data controller for the special categories of data it processes.

The ECJ found search engines are considered data controllers and are thus responsible for the content they direct users to. In making its decision, the ECJ balanced the right to access information and freedom of expression against the rights to privacy and data protection. 

However, the erasure right found in Article 17(1) of the GDPR does not apply when processing is necessary for one of the grounds found in Article 17(3)(a) of the Regulation – one of which is the exercise of the right of freedom of expression and information, guaranteed by Article 11 of the Charter. Thus, the question becomes whether Google is processing special-category data for reasons of substantial public interest, as the only exception available to Google is Article 9(2)(g) of the GDPR: “Processing is necessary for reasons of substantial public interest.”

Google CNIL – 2019

In Google v Commission nationale de l’informatique et des libertés (“Google CNIL”), the ECJ found no obligation for Google to apply the right to be forgotten outside the EU boundaries, but forced Google to apply the right throughout the EU.

In this case, the French regulator CNIL fined Google €100,000 for refusing to remove links that were unavailable to EU residents but remained available outside EU boundaries. CNIL further ordered Google to apply the right to be forgotten to all of its domain names. In response, Google argued CNIL had the power to order its application only to Google’s domain in the country of the dispute, France. 

Taking the middle ground, the ECJ established that the right to be forgotten must be applied to searches made from within the EU, whatever the domain used:, but also,,, and so on. The Court justified this decision because adoption of the GDPR requires consistent and harmonised protection for users across the EU. However, in an attempt to protect freedom-of-information rights, the ECJ gave each EU state the power to limit the application of this ruling. Unfortunately, this ensures the absence of a standardized approach across the EU.

It is important that the ECJ found no obligation to apply the right to be forgotten worldwide under EU law. The Court noted the EU could modify its laws to create such an obligation, but has not done so. This was followed by a warning: lawmakers must ensure the right to be forgotten does not transform into a censorship tool.


The enactment and interpretation of the GDPR regulations caused widespread concern. Experts forecast a “serious chilling effect” on the freedom of speech and information. The Index on Censorship warns the Google Spain decision may “allow individuals to complain to search engines about information they do not like with no legal oversight. This is akin to marching into a library and forcing it to pulp books. It opens the door to anyone who wants to whitewash their personal history.”

The regulations’ broad scope renders them potential accomplices in censorship. The takedown requirement is ripe for abuse, as the definition of personal data in Article 4(1) includes “any information relating to” the individual (emphasis added). Companies will delete information rather than incur a fine or litigation. The concept of informational self-determination appears to have superseded principles of media freedom and open justice, in spite of the ECJ’s continued reference to the balancing test and the need to take freedom of information into consideration.

This will restrict the ability of businesses and individuals to carry out tasks such as business intelligence and the due diligence required to comply with anti-bribery, anti-corruption, and know-your-customer laws. For example, Article 17 was invoked to remove from Google searches 120 reports about company directors published by Dato Capital, a Spanish company that compiles reports about private directors consisting entirely of information they are required by law to disclose.

Ultimately, the end result of Article 17 and the ECJ’s rulings may compromise the integrity of Internet-based information, rendering search engines unable to produce accurate search results while articles are removed at will. Here, blockchain offers a solution.

Due to blockchain’s immutability, data in blockchain transactions is virtually impossible to modify or erase. So, instead of the right to be forgotten, there is the right to never forget. Unfortunately, immutability of data and decentralization of control, arguably the two most innovative aspects of blockchain, conflict with the GDPR.

Blockchain and Erasure Laws

This tension has not gone unnoticed by regulators and the private sector alike; various councils have been created to address the major points of contention. The contention stems from two assumptions upon which the GDPR was based.

First, the EU created regulations with the underlying assumption of one natural or legal data controller to whom data subjects can address their rights under EU data protection law. Blockchains, however, are distributed databases that replace a unitary central actor with many different players; no central data controller exists.

Second, the GDPR assumes data can be modified or erased where necessary to comply with erasure laws. Blockchains intentionally render the unilateral modification of data nearly impossible. This ensures data integrity and increases trust in the network. 

Blockchain Solutions

At present, most blockchains are not GDPR compliant – and are therefore, according to some, illegal – but solutions exist to reconcile the legislation with blockchain while adhering to the right-to-be-forgotten mandate. Two such solutions are explored below: Sovrin applications and “crypto-shredding.” 

Sovrin Applications

Self-sovereign identity applications (“Sovrin”) provide individuals with control of their information and comply with GDPR regulations. The Sovrin protocol allows individuals to control their digital identities using blockchain technology. In their paper titled “Sovrin: digital identities in the blockchain era,” Dmitry Khovratovich and Jason Law detail a “practical digital identity project of a global scale” to solve “privacy and scalability problems using the concepts of anonymous credentials and permissioned blockchains.” Their prototype has been developed, tested, and published as open source. Under this system, individuals can limit third-party access to the information needed for the specific transaction. For example, if an individual purchases an ebook, only the information needed to digitally deliver the book goes to the seller – i.e. the digital address and payment. Other personal information, such as physical address, remains private. The record of who accessed the information, when it was retrieved, and what they did with it is documented on the blockchain. 
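
The ebook example can be sketched as a selective-disclosure filter. This toy `create_proof` function and its field names are purely illustrative – the actual Sovrin protocol uses anonymous credentials and zero-knowledge proofs, not plain dictionary filtering:

```python
def create_proof(credential: dict, requested: set) -> dict:
    """Reveal only the attributes a verifier asked for (toy selective disclosure)."""
    return {k: v for k, v in credential.items() if k in requested}

credential = {
    "name": "Maria",
    "physical_address": "12 Rua Nova, Lisbon",
    "email": "",
    "payment_token": "tok_4f2a",
}

# An ebook seller needs only digital-delivery and payment details:
proof = create_proof(credential, {"email", "payment_token"})
print(proof)  # the physical address never leaves the holder's wallet
```

The verifier learns exactly the fields it requested and nothing else, mirroring the disclosure model described above.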

The current internet data privacy systems rely on centralized data centers, such as Google or Facebook, which allow federated logins for sites across the web. This forces third-party sites to provide the federated-login provider (i.e. Google) with information about transactions, browsing history, and more in order to receive the benefit of a single login for customers. This solution relies enormously on the federated sites not misusing your data; a dangerous game, given that these same sites make billions off consumer data.

Sovrin puts individuals’ data back in their control. Sovrin offers “a universally available digital identity system that lets individuals collect, hold and present any credentials they want, to whomever they want, whenever they want – without the reliance on a third-party managing access.”

Sovrin acts as a directory, pointing to an individual’s data, which is stored in more traditional, centralized databases. If an identity holder shares a verifiable credential using the Sovrin Network, they create a proof containing only the information requested. The Network uses elements from the available verifiable credentials in the identity holder’s digital wallet to do this. The verifier learns the information that was shared and nothing else.

Not unlike the Bitcoin blockchain, the Sovrin Network is built of “server nodes located around the world… [and] each node contains a copy of the ledger.” This ledger is a record of publicly accessed information needed to “verify the validity of credentials issued within the network.” The server nodes are hosted and administered by trusted entities called Stewards. Stewards cross-reference each transaction to “assure consistency about what information is written on the ledger and in what order.” This is done with cryptography and an algorithm.

Users gain access to the services on the Sovrin Network using varying access points, such as mobile applications. These access points are called Agents. Agents have the important job of holding and processing claims on the Sovrin Network. Agents conduct transactions on the owner’s behalf and exchange information directly with other agents. “This way, only public identifiers of an issuer are anchored on the ledger, but an identity holder’s actual proof of their credential is privately transmitted to a validator.” Sovrin created these Agents in code, enabling Agents from different developers to work together within the Network. 

Sovrin offers centralized control in a decentralized system, satisfying GDPR requirements without negating the benefits of blockchain. A central controller remains in charge of the databases storing information; requests for deletion can be accepted. Using the Sovrin Network, each party who validates the identity holder’s proof can be confident that the information being relayed is accurate and timely. Businesses avoid regulatory burdens associated with storing mass amounts of customer data capable of being stolen or misused. As the information is stored in the central server, even if the blockchain directory points to it, once the information is deleted, the blockchain will be pointing to an empty file. Thus, the right to be forgotten as enumerated in the GDPR is satisfied.


A similar solution is making the information impossible to access by deleting the decryption keys, a practice commonly referred to as “crypto-shredding.” Without the decryption keys, the encrypted data is unusable, like something stored in a safe without a combination. All personal data can be stored on the blockchain encrypted with a key, with that key capable of being revoked or deleted upon request or after a period of time. If a data subject requests that their data be erased, the key is deleted. This renders the information unobtainable, lost in the blockchain, as the key was the only access point to it. 
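
A toy illustration of crypto-shredding, using a one-time-pad XOR purely for demonstration (a production system would use an authenticated cipher such as AES-GCM; the record contents and names here are illustrative):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: the same operation encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, key))

record = b"subject=Maria;city=Lisbon"
key = secrets.token_bytes(len(record))        # per-record encryption key

ciphertext = xor_cipher(record, key)          # what the immutable ledger stores
assert xor_cipher(ciphertext, key) == record  # readable while the key exists

key = None  # honouring an erasure request: shred the key, not the ledger
# The ciphertext remains on-chain, but without the key it is
# indistinguishable from random bytes and effectively unreadable.
```

The ledger itself is never rewritten; only the off-chain key is destroyed, which is the crux of the compliance argument.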

Crypto-shredding maintains trust in the blockchain network and achieves the GDPR erasure mandate. It works with both simple and complicated systems. For simple applications, the system ensures “the keys and transactional data never exist alongside on a persisted medium,” allowing for deletion on the back end. The application reads keys from the key store and uses them to decrypt records in the transactional data store. In complicated systems, keys are loaded by an intermediary service which encrypts and decrypts data on behalf of the application; the business application never receives the key, minimizing the chance of compromising it. The purpose of crypto-shredding is to allow effective erasure of personal data without altering historical archives. Crypto-shredding personal information on the blockchain while making the information impossible to access has the potential to improve consumer privacy, but may remain at odds with GDPR regulation.
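
The separation of key store and transactional store described above might look like the following sketch; the `KeyStore` class and record IDs are hypothetical:

```python
class KeyStore:
    """Mutable, off-chain key store; the immutable ledger is never touched."""

    def __init__(self):
        self._keys = {}

    def put(self, record_id, key):
        self._keys[record_id] = key

    def get(self, record_id):
        return self._keys.get(record_id)

    def shred(self, record_id):
        # Called when a data subject exercises the right to erasure.
        self._keys.pop(record_id, None)

store = KeyStore()
store.put("tx-001", b"per-record-secret")
store.shred("tx-001")                 # an Article 17 erasure request arrives
print(store.get("tx-001"))            # None: on-chain ciphertext now unreadable
```

Because keys live only in this mutable store, deletion on the back end is an ordinary operation even though the chained records themselves can never change.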

Traditional thinking by GDPR officials renders it difficult to predict whether crypto-shredding is acceptable as a solution. It is well-established that encrypted data still qualifies as personal data under EU law. As discarding an encryption key is not the same as ‘erasure of data’, existing GDPR rules may prohibit storing personal data on a blockchain level for this solution. This would result in individuals losing the opportunity for enhanced control of their personal data. The underlying issue is the lack of legal certainty pertaining to numerous concepts of the GDPR, rendering it difficult to determine how the regulations apply to blockchain technology.

Principles-Based Approach

When observing the principles behind both the GDPR and blockchain, common ground is clear. A closer look at blockchain’s underlying concepts and technologies reveals how well the technology serves the fundamental data privacy and security aims of the GDPR. EU regulators and blockchain technologists alike must therefore remember that blockchain advocates and the GDPR drafters share similar goals. Blockchain simply offers increased transparency, privacy, and security of data, while addressing those underlying goals in a different manner than the GDPR.

Blockchain technology aligns with a core principle of the GDPR: individual control over personal information. When used for digital identity solutions, blockchains offer individuals unprecedented command over how their personal data is shared and used. The blockchain uses encryption to keep content confidential, but the ledger itself remains transparent; all transactions are visible and immutable. Blockchain’s encryption and decentralized structure make the network and its data highly tamper-resistant and, in theory, less vulnerable to unauthorized modification than a single-instance database. Each access to information is therefore recorded and traceable, a major positive in light of recent scandals such as Cambridge Analytica in 2018.

Another common principle of the GDPR is anonymity. Blockchain offers modified anonymity, as only those transacting on the network are privy to the information on the network; and, in permissioned networks, even those on the network can be restricted from seeing other participants’ information. Private keys allow for access, while the public key is an address for inter-user transactions, detached from any personally identifying elements; in a public blockchain, no personal identifying information is made public.

By decentralizing transaction processing, distributed ledger systems remove the vulnerabilities commonly exploited in centralized data repositories. What makes blockchain so revolutionary is its ability to store information across many systems for improved security: rather than presenting a single, identifiable point of failure, a blockchain ledger makes failure through a single breach nearly impossible.

Taken together, these factors demonstrate the increased control blockchain gives individuals over their personal information.

Moving Forward

For blockchain technology to reach its full potential, regulators must proceed carefully. The biggest challenge is reaching alignment in an area where the two sides appear to be at odds. Still, the path toward the EU embracing blockchain is showing positive developments. In 2018, the European Parliament passed a resolution on distributed ledger technologies and blockchain, the “Blockchain Resolution.” The resolution stresses the importance of an innovation-friendly regulatory approach to blockchain technologies’ compliance with the GDPR. The EU, however, must go a step further: the GDPR must be clarified so that compliance with its erasure provisions is actually achievable. Prominent authorities note, for instance, that it is “unclear how the notion of ‘erasure’ in Article 17 GDPR ought to be interpreted” in any given situation. This lack of legal certainty around core GDPR concepts is the greatest obstacle to determining how the regulation can coexist with blockchain, the technology of the future.

Blockchain’s fundamental features, immutability and decentralization, can be relied upon to support the GDPR’s objectives. This holds true for data subject rights such as the right of access (Article 15 GDPR) and the right to data portability (Article 20 GDPR). Giving data subjects control over how their personal data is used, both by themselves and by others, aligns with the GDPR. Blockchain technology can also assist in detecting data breaches and fraud.

When decentralized applications attempt to preserve privacy and comply with data protection regulations, many issues arise because the legal text states or expects things to be done in an old-fashioned, centralized way, without naming specific technologies. “But what the law actually implies with regards to the right to erasure is that data should not be publicly available anymore. We need to find a common language between technology and the legal field. When a lawyer talks about the ‘right to erasure,’ one cannot assume he or she is talking about the same ‘erasure’ a computer scientist learned at university, and vice versa.”
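One candidate bridge between the two vocabularies, discussed in the blockchain-GDPR literature, is “crypto-shredding”: personal data is stored on the immutable ledger only in encrypted form, while the key lives in a mutable, off-chain store; deleting the key renders the on-chain ciphertext permanently unreadable. The sketch below assumes this technique; the class name is hypothetical, and the toy XOR keystream stands in for a vetted cipher such as AES-GCM.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric stream cipher (illustrative only; real systems
    would use an authenticated cipher like AES-GCM)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

class CryptoShredder:
    """Sketch of crypto-shredding as a reading of 'erasure' for
    append-only systems: the ciphertext stays on the ledger forever,
    but destroying the off-chain key makes it unrecoverable."""

    def __init__(self):
        self.keys = {}     # off-chain, mutable, deletable key store
        self.ledger = {}   # stands in for immutable on-chain storage

    def store(self, subject_id, personal_data: bytes):
        key = secrets.token_bytes(32)
        self.keys[subject_id] = key
        self.ledger[subject_id] = keystream_xor(key, personal_data)

    def read(self, subject_id):
        key = self.keys.get(subject_id)
        if key is None:
            return None    # key shredded: data is unrecoverable
        return keystream_xor(key, self.ledger[subject_id])

    def erase(self, subject_id):
        # The ciphertext remains on the ledger, but without the key it
        # is indistinguishable from random noise.
        self.keys.pop(subject_id, None)
```

Whether key destruction satisfies Article 17 is precisely the kind of interpretive question the legal literature says remains open; the sketch shows what the computer-science side of that conversation looks like.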

In their haste to enforce the GDPR, EU regulators must not forget the purpose of their regulations: protecting EU citizens. Massive potential lies in collaboration between the legal and technological worlds, taking knowledge from both sectors, fusing it together, and developing products and software for the betterment of mankind. Together, EU regulators and blockchain advocates can use the GDPR’s erasure laws as a platform to lead the coordination between these worlds.

Thank you to the many people who helped me write this article, especially Dr. Muharem Kianieff and Dr. Mark Giancaspro – your advice and guidance are greatly appreciated.

You can find more blockchain-law related material by following me on Twitter @JacobRobinsonJD and by subscribing to the Law of Code podcast.


  1. Blankrome. “On the Road to Reconciling GDPR and Blockchain.” November 2018.
  2. “Mainnet.” Aug. 31, 2020.
  3. C-136/17, GC, AF, BH, ED v Commission nationale de l’informatique et des libertés (CNIL), interveners: Premier ministre, Google LLC, successor to Google Inc. September 24, 2019.
  4. C‑507/17, Google LLC v. CNIL, 2019 EUR-Lex CELEX No. 62017CJ0507. September 24, 2019.
  5. Consensys. “Interpreting GDPR Through the Blockchain Lens.” February 3, 2020.
  6. Conway, Luke. “Blockchain Explained.” Investopedia. Nov. 17, 2020.
  7. De Meijer, Carlo R.W. “Blockchain versus GDPR and Who Should Adjust Most.” Finextra Research, Finextra. Oct. 9, 2018.
  8. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.
  9. Dorri, A., Kanhere, S., & Jurdak, R. “MOF-BC: A Memory Optimized and Flexible Blockchain for Large Scale Networks.” Future Generation Computer Systems, vol. 92, 2019, pp. 357–373.
  10. Finck, Michèle. “Blockchain and the General Data Protection Regulation.” July 5, 2019.
  11. EU General Data Protection Regulation (GDPR): Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1.
  12. European Parliament resolution of 3 October 2018 on distributed ledger technologies and blockchains: building trust with disintermediation (2017/2772(RSP)), OJ C 11, 13.1.2020.
  13. Federal Constitutional Court in Proceeding Concerning the Constitutional Complaint of Publishing Company “Die Welt” and Mr. K.-H. V. February 14, 1973.
  14. Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González. May 13, 2014.
  15. Halliday, Josh. “Google to Fight Spanish Privacy Battle.” The Guardian. January 2011.
  16. Irwin, Luke. “The GDPR: How to Respond to Data Subject Access Requests.” February 26, 2020.
  17. Jacques, S., & Hempel, F. (2020). “The Right to Be Forgotten in the UK: A Fragile Balance?” In: Werro, F. (ed.), The Right To Be Forgotten. Ius Comparatum – Global Studies in Comparative Law, vol. 40. Springer, Cham.
  18. Khovratovich, Dmitry, & Law, Jason. “Sovrin: A Protocol and Token for Self-Sovereign Identity and Decentralized Trust.” Sovrin Foundation. January 2018.
  19. Leiser, M., & Schermer, B. “GC & Others vs CNIL and Google: This Is a Special Case.” European Law Blog. November 20, 2019.
  20. Riva, Gianluigi Maria. “What Happens in Blockchain Stays in Blockchain. A Legal Solution to Conflicts Between Digital Ledgers and Privacy Rights.” Frontiers. August 4, 2020.
  21. Martin, Luther. “With GDPR, It’s Time to Get Personal with Data Encryption.”
  22. Morris, Nicky. “How Sovrin Will Prevent Identity Leakages like Equifax.” 2019.
  23. Parloff, Roger. “Company Directors Are Deep-Sixing Google Links Citing ‘Right to Be Forgotten.’” October 21, 2014.
  24. Rosen, Jeffrey. “The Right to Be Forgotten.” Stanford Law Review. February 13, 2012.
  25. Satariano, Adam. “‘Right to Be Forgotten’ Privacy Rule Is Limited by Europe’s Top Court.” New York Times. September 24, 2019.
  26. The Conversation. “Protecting the ‘Right to Be Forgotten’ in the Age of Blockchain.” October 30, 2014.
  27. Wiewiorowski, Wojciech. “Right to Be Forgotten: The Fundamental Right of the Person and the Danger of the ‘Ministry of Truth.’” May 10, 2012.
  28. Yang, Andrew. “Tech Companies Make Money off Your Data. Shouldn’t You Be Paid, Too?” Miami Herald. June 26, 2020.
  29. “Cambridge Analytica and Facebook: The Scandal and the Fallout So Far.” New York Times. April 4, 2018.
