July 9, 2020


Data territorialisation is the need of the hour, but the ‘borderless cyberspace’ might mandate stopping at a halfway house.
In 2013, former CIA systems analyst Edward Snowden was charged in the United States of America with theft of government property, unauthorised communication of national defence information, and wilful communication of classified communications intelligence. He had leaked classified files that blew the whistle on the state-sponsored spying programme run by the United States. The USA’s NSA (National Security Agency), in concert with the UK’s security agency GCHQ (Government Communications Headquarters), had access to the public’s data. The programme essentially had two wings. The first, and the least talked about, involved the agencies physically tapping undersea internet cables to monitor data being transmitted across borders, transmissions that were supposed to be protected. The other was the PRISM scandal, which was, for ordinary people, no less than a safe built with a back door so that a State could monitor you. Whatever data we fed into various internet giants’ servers was being shared with the NSA. The state coerced the companies into giving it access to their servers so that it could monitor the data for potential terror acts.
This incident brought to light the problem with the borderless ‘cyberspace’, which put internet users’ data at risk, and opened up a Pandora’s box named territorialisation of data. Germany was the first state to come up with an answer to the NSA’s PRISM. Germany’s largest telecom provider, Deutsche Telekom, said it would develop a system ensuring domestic routing of most of the data sent over its network, so that the data remained on German soil and the state could apply more stringent protection to its citizens’ data. This came with a caveat, however: it would have cost the Germans their connectivity to the global internet. The plan was possible because it was a collective initiative by Deutsche Telekom (a telecom company), WEB.DE (a leading German internet portal) and GMX (a communication service), all German companies. This raises questions about the roles of the several stakeholders that would be involved in ensuring data territorialisation.
To understand how these stakeholders would need to act to materialise the concept of territorialisation of data, we first need to understand the basic functioning of the internet, which runs on two protocols, TCP/IP (Transmission Control Protocol/Internet Protocol), that help data sent over the internet reach its recipient. The former governs how systems may exchange information; the latter, how those systems must be addressed. The core architectural guideline of the internet is the end-to-end principle. It is based on the idea that, in a distributed computing network, the functionality should be provided by end hosts rather than by the network itself, using … TCP/IP.[1] The idea behind this principle was to put the functionality of the network with the clients, and not with the network itself or in central hubs that would be easy to destroy.[2]
Hence the concept of the dumb network and smart hosts. What territorialisation necessitates is making the dumb network smart, so that it can decide and govern the path along which transmitted data travels. Data that is not required to go outside the traditional borders should therefore not leave the country’s soil, or its sky for that matter. For instance, an e-mail or message you send over the internet today is first transmitted to a server situated somewhere in Europe or the United States and then sent on to the recipient, regardless of the recipient’s physical location. Under territorialisation it should not be routed that way; instead, better and more intelligent protocols would identify whether the destination lies within the allowed territory or outside it, and divert the data traffic to the respective servers accordingly.
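The territory-aware routing decision described above can be sketched in a few lines of code. This is an illustrative assumption, not any provider’s actual protocol: the prefix table, the function name, and the example addresses (drawn from the documentation ranges reserved by RFC 5737) are all hypothetical, and a real deployment would derive its “domestic” prefixes from routing announcements or a geolocation database rather than a hard-coded list.

```python
import ipaddress

# Hypothetical table of network prefixes treated as "domestic".
# These are documentation-only example ranges, used here purely
# for illustration.
DOMESTIC_PREFIXES = [
    ipaddress.ip_network("192.0.2.0/24"),     # example domestic ISP block
    ipaddress.ip_network("198.51.100.0/24"),  # example domestic datacentre
]

def choose_route(destination: str) -> str:
    """Return 'domestic' if the destination address falls inside a
    domestic prefix, otherwise 'international'."""
    addr = ipaddress.ip_address(destination)
    for net in DOMESTIC_PREFIXES:
        if addr in net:
            return "domestic"
    return "international"

print(choose_route("192.0.2.42"))    # → domestic
print(choose_route("203.0.113.7"))   # → international
```

The point of the sketch is that the decision happens inside the network rather than at the end hosts, which is exactly the departure from the end-to-end principle that territorialisation entails.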
Once the functioning of the internet is understood, it becomes quite clear that the onus of ensuring these prerequisites lies on the governments, the internet service providers and the tech companies, but most importantly on the citizens, who will have to migrate from an open, borderless internet and adapt to the “new internet”.
There are several examples of governments strictly enforcing policies that favour state interests and ensure that their citizens’ data is not misused. The most prominent is the People’s Republic of China, which regulates the internet within its borders so strongly that several otherwise immensely successful tech giants are either not allowed, or only partially allowed, to function in the country. To combat the resulting connectivity gaps, China has come up with its own local substitutes.
While such a solution can be advocated on the ground that it ensures strict control over citizens’ data, it cannot be seen as a global solution for people who have learned and adapted themselves to living in a ‘borderless’ cyberspace. One feasible solution is for states to separately ask the prominent tech giants to set up their servers within the territorial borders of the country, and then to control the movement and storage of data accordingly. However, this may not be a doable measure for every country in the world. Not all countries have the diplomatic capability to make a company set up servers within their borders, and hence one way out of this deadlock is for nations with friendly relations to come forward, form regional alliances, and have the company set up its servers for that alliance in any one of the member countries. To understand this better, we might look at the proposed model of Deutsche Telekom, whose initiative “E-mail made in Germany” would keep all storage and transmission of data within the borders of the country; this model was proposed to be expanded to the whole Schengen region. Similarly, countries may decide to group together and make sure that the tech companies follow the policy changes.
At a time when people are habituated to living in a world that, though divided by physical borders, connects back together on the internet, territorialisation of data seems a distant possibility. However, the threat to an average user’s data compels one to rethink the idea, and Snowden’s stunning revelation has definitely opened up this conversation. In today’s world there is no way to know where your data is going or how it is being used, analysed or monitored; on reflection, it becomes clear how important one’s rights over one’s data are, and the stakeholders must act on them immediately, in the interest of their citizens. This involves a great deal of policy change and logistical development, but it is unquestionably the need of the hour.


[1] Glen, C. M., 2014. ‘Internet Governance: Territorializing Cyberspace?’, Politics & Policy 42 (5): 644.

[2] Ahrens, Andreas Baur, 2017. ‘The Power of Cyberspace Centralisation’. ResearchGate. Available at (accessed 08 May 2020).



Kshitij Awasthi is a first-year law student at Hidayatullah National Law University. He has a keen interest in the interdisciplinary spectrum of law and technology.



