A new road for privacy after the failure of EU data regulation
The endless cookie settings that pop up on every website can feel like the internet’s stubborn, unchanging prank of compliance theater. It is annoying. It also feels a bit like the data market’s revenge on regulators: giving the General Data Protection Regulation (GDPR) a bad name so that it looks, once again, as if political bureaucrats have clumsily interfered with the otherwise smooth progress of innovation.
The truth, however, is that the vision of privacy put forward by the GDPR could inspire an era of innovation far more exciting than today’s inferior technology. As it stands, though, it has not done so at all. What is needed is an infrastructural approach with the right incentives. Let me explain.
Fine-grained metadata collected behind the scenes
As many of us are now keenly aware, laptops, phones and every device with the “smart” prefix generate vast amounts of data and metadata. So much so that the idea of making sovereign decisions about your personal data is almost meaningless: click “no” to cookies on one site, and an email will quietly deliver a tracker anyway; delete Facebook, and your mother will still tag your face with your full name in an old birthday photo; and so on.
What is different today (and why CCTV cameras are a terrible stand-in for modern surveillance) is that even if you choose privacy, and have the skills and know-how to protect it, the surrounding environment of mass metadata collection can still harm you. It is not about your data, which will often be encrypted anyway, but about how the collective metadata streams reveal things at a fine-grained level and single you out as a target, whether a potential customer or a potential suspect, should your patterns of behavior stand out.
Contrary to how it may seem, however, everyone actually wants privacy, including governments, corporations and especially the military and national security agencies. But they want privacy for themselves, not for others, and this lands them in a conundrum: How can national security agencies keep foreign agencies from spying on their populations while simultaneously building back doors so that they can snoop themselves?
Governments and companies have no incentive to provide privacy
To put it in language familiar to these readers: The demand exists, but the incentives are broken, to put it mildly. As an example of just how broken the incentives currently are, an Ernst & Young report values the market for health data in the United Kingdom alone at $11 billion.
While such reports are highly speculative about the actual value of data, they produce an irresistible fear of missing out (FOMO) that becomes a self-fulfilling prophecy, as everyone rushes toward the promised profits. This means that although everyone, from individuals to governments to large technology companies, might want to ensure privacy, they simply lack strong enough incentives to do so. The FOMO, and the temptation to sneak in a back door that renders otherwise secure systems a little less secure, are too powerful. Governments want to know what their populations (and others) are talking about, companies want to know what their customers are thinking, employers want to know what their employees are doing, and parents and teachers want to know what the children are up to.
There is a useful concept from the early history of science and technology studies that can help clarify some of this mess: affordance theory. Affordance theory analyzes the use of an object through its determined environment, its system and the things it affords people, that is, the kinds of things that become possible, desirable, comfortable and interesting to do as a result of the object or system. Our current environment affords, to put it mildly, an irresistible temptation toward surveillance for everyone from pet owners and parents to governments.
In an excellent book, software engineer Ellen Ullman describes programming some network software for an office. She vividly recounts the horror of a boss who realizes that, once installed, the system can also be used to track the keystrokes of his secretary, a person who had worked for him for more than a decade. Where there had once been trust and a good working relationship, the mere affordance of the new capability inadvertently turned the boss into a creep, peering into the most detailed daily work rhythms of the people around him, the frequency of clicks and the pauses between keystrokes. This kind of mindless surveillance, today carried out by algorithms rather than humans, generally passes for innovation.
Privacy as a material and infrastructural fact
So, where does this leave us? We cannot simply slap personal privacy patches onto this environment of surveillance. Your devices, your friends’ habits and your family’s activities will still be linked together and identify you; metadata will leak regardless. Instead, privacy has to be secured by default. And we know this will not come about through the goodwill of governments or technology companies alone, because they simply have no incentive to do so.
The GDPR, with its immediate consequences, has fallen short. Privacy should not be a right that we desperately try to click into existence on every website visit, or that most of us can only dream of exercising through expensive court cases. No, it has to be a material and infrastructural fact. This infrastructure has to be decentralized and global so that it does not fall within the scope of particular national or commercial interests. Moreover, it has to carry the right incentives, rewarding those who run and maintain it, so that protecting privacy is made profitable and attractive while harming privacy is made unfeasible.
To wrap up, I want to point to a seriously underappreciated aspect of privacy: its positive potential for innovation. Privacy is usually understood as a protective measure. But if privacy were instead simply a fact, data-driven innovation would suddenly become far more meaningful to people. It would allow much broader participation in shaping the data-driven future of everything, including machine learning and artificial intelligence. But more on that next time.
The views, thoughts and opinions expressed here are only those of the author, and do not necessarily reflect or represent the views and opinions of Cointelegraph.
Jaya Klara Brekke is the chief strategy officer at Nym, a global decentralized privacy project. She is a research fellow at the Weizenbaum Institute, holds a Ph.D. from Durham University’s Geography Department on the politics of blockchain protocols, and is an occasional expert adviser to the European Commission on distributed ledger technology. She speaks, writes and researches on the privacy, power and political economy of decentralized systems.