There’s a fundamental disconnect in the discussion about online privacy. We are told that people don’t care about their online privacy. Evidence of people not reading terms of service, blindly accepting all permissions on their apps, and even filling out detailed questionnaires in return for an actual cookie seems to support this position. But in the aftermath of a breach, or simply a news story pointing out how invasive the Facebook Messenger permissions are, the reaction implies a strong expectation of better privacy. It is as if people have an expectation of privacy but a contradictory expectation of not being required to do anything to get it. These two things seem mutually exclusive and yet they exist simultaneously. How can that be? As with most mysteries of the universe, the answer involves some physics.
For all of human history, right up to about 20 years ago, surveillance was constrained by the laws of physics. You could not surveil all the people all of the time because you’d need half of them to do the surveillance, and because moving atoms around to do it (people, paper-based files, etc.) was prohibitively expensive at scale. You could take it on faith that overly broad surveillance was extremely shallow and that deep scrutiny was expensive and therefore only targeted. Because surveillance was based in the world of atoms, natural cost constraints placed an absolute limit on its breadth and depth. Your chance of being surveilled was proportional to the chance that such surveillance would be worth the investment. In other words, privacy was largely a function of physics throughout nearly all of human history. All of our habits, laws, policies and culture from before about 20 years ago are built on that implicit assumption.
We didn’t think about those naturally imposed limits to privacy because Newtonian physics made it unnecessary. Similarly, we do not have laws about keeping liquid in closed containers nor spend much time thinking about creating such laws. If I were going around arguing vehemently that we urgently need closed-container laws before it is too late, people would think I’m (even more?) nuts.
On the International Space Station they absolutely do have strict policies about keeping liquids contained because they can’t have water just floating into an electrical circuit panel. When we put large populations in space, we’ll need to change our culture to deal with the lack of gravity – something we don’t think about at all nowadays but that will be crucial in that context. Suppose that we had already put a billion people in zero g without closed-container laws and had built everything with unsealed electrical panels. Would it seem crazy to argue that “there oughta be a closed-container law”?
That’s the kind of societal transition required when we change the physics model of the world in which we live. Astronaut Marsha Ivins tells Wired what it is like to transition between zero g and terrestrial life:
When you return, your inner ear—which keeps you balanced on Earth and which has been essentially turned off for the duration of your trip—feels a little gravity and becomes unbelievably sensitive. Your balance is off and you have to relearn how to move in a gravity field. If I turned my head, I would fall over. Muscles you haven’t used in weeks have to reengage to help you do everyday stuff like walk, stand, and hold things. It can take days or weeks to get your Earth legs back.
Changing the physics model in which we live requires some adjustment. Changing it for entire populations requires that adjustment to touch interpersonal norms, policy frameworks and legislation.
Although we have not yet colonized space at scale, we have put large populations into cyberspace. That also involves a fundamental change in the physics model in which we operate. We will have to update our culture to deal with the change from atoms to bits and the subsequent loss of physical limitations formerly imposed by atoms. In the world of bits it is possible to surveil entire populations online. Once the infrastructure is in place, the incremental cost of surveilling each additional person approaches zero.
Think about that for a minute. In the real world of atoms, surveillance is expensive and limited by that cost. In the world of bits the physics model flips the costs. Surveillance at scale is cheap, deep, and easy to reap. As our daily activities increasingly move online, we lose those cost constraints. There is no natural upper bound on surveillance anymore. It is possible to secretly track everyone, everywhere, everywhen, to an intimate degree of detail, and in areas we have previously safely assumed to be private.
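The flipped cost structure can be sketched with a bit of arithmetic. All of the figures below are invented purely for illustration (`cost_per_agent`, `infrastructure`, `marginal` are assumptions, not real data); the point is only the shape of the two curves — linear in the world of atoms, essentially flat after a one-time fixed cost in the world of bits.

```python
def atoms_cost(people_watched, cost_per_agent=50_000, targets_per_agent=2):
    """World of atoms: every watched person requires human agents, paper
    files, travel. Total cost grows linearly with the number of people
    surveilled. (Figures are hypothetical.)"""
    agents_needed = people_watched / targets_per_agent
    return agents_needed * cost_per_agent

def bits_cost(people_watched, infrastructure=10_000_000, marginal=0.01):
    """World of bits: one large fixed cost to build the infrastructure,
    then a near-zero marginal cost per additional person. (Figures are
    hypothetical.)"""
    return infrastructure + people_watched * marginal

# Compare total cost at increasing scales of surveillance.
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} people: atoms ${atoms_cost(n):>20,.0f}"
          f"  bits ${bits_cost(n):>14,.0f}")
```

Whatever numbers you plug in, the conclusion is the same: in the atoms model the cost of watching one more person stays constant forever, while in the bits model it approaches zero, so once the infrastructure exists there is no economic reason to leave anyone out.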
That is unprecedented. Just like we’d need to reevaluate our policies for a new physics model if we migrated into space, we need to reevaluate all our policies and customs to map out where they assume physical constraints which no longer exist. We aren’t in space yet, so we have some time before we need to figure out behavioral norms for large populations in zero g. However, we do have a billion people in cyberspace today and we need the equivalent of zero g closed-container laws. But that isn’t what we are getting. Instead, the liquid (our privacy) is leaking into the electrical panels and, rather than trying to contain it, our corporations and governments are busy installing new sprinklers everywhere, as fast as possible, all the while telling us not to worry.
People expect privacy online to correlate to equivalent privacy offline because we all either grew up in the world of atoms, or at the very least grew up in a world where language and policy frameworks are all based on the world of atoms. Simply put, we do not have any framework of reference to deal with privacy in the world of bits.
Which is why people don’t expect to have to take any action personally to impose limits to surveillance. Atoms are our only frame of reference. We live in the world of bits but act as though we are in the world of atoms because it’s the only way we know. To expect parity between real-world and online privacy is the default if you do not understand that a physics model change has occurred. But when people discover that it doesn’t work that way, and the extent to which it doesn’t work that way, and the absence of any legal protections from deep surveillance technology, and the likelihood that there will never be any legal protection because of the government’s deep stake in retaining these capabilities, and the degree to which details of their private life are harvested, and that they never consented to this, then yes, they can and often do feel deeply, viscerally, and intimately violated.
Why is consent important? Your roommate, domestic partner, spouse or family knows you intimately. We do not think of that as surveillance because of an element of consent. In the context of a prison the same level of privacy loss is resented specifically because it isn’t consensual. What online trackers know about you approaches – sometimes even exceeds – that which someone living with you would know. But you didn’t consent to that level of tracking and, short of not going online, you cannot opt out.
Reconciling apparent conflicts
Once the changing physics model is factored in, there is no contradiction between people not taking steps to preserve their online privacy and their reactions when it is violated. This is perfectly consistent with behavior patterns based on a world of atoms colliding with the world of bits. In the old world you either knew when you were deeply surveilled or else suspected it because you were a high-value target. Ordinary people never needed to give a second thought to basic privacy nor take specific actions other than closing their blinds. In the new world you are surveilled deeply, not because you are a high-value target, but on speculation that the value of the information collected about you might someday exceed the near-zero cost of collecting it.
Assumption of agency
At the end of the day the problem with the idea that people don’t care about privacy is the assumption of agency. To claim that people don’t care requires the assumption that people understand how all this works and that their inaction implies consent. The surveillance industrial complex is built on that assumption of consent. The narrative that people don’t care is used to justify the near total lack of any means to withhold that consent. Some even proclaim that harvesting and correlating data at scale is a right and that therefore consent is not required. If any of that were true, then companies could be transparent about their data collection and there would be little risk in offering effective opt-out mechanisms. That great pains are taken to keep online surveillance secret suggests that not even those harvesting your data believe the “people don’t care” narrative. So why should consumers believe it?
Future of privacy
In the new physics model of bits the natural constraints that atoms imposed on surveillance are gone. But clearly some constraints are needed. There must be a line beyond which we say “just because we can doesn’t mean we should.” Unfortunately, our trajectory is headed in the opposite direction. If we consumers do not draw that line nobody will and the window of opportunity to draw it is limited. There’s a point of no return where businesses “too big to fail” become dependent on your data and you won’t be able to get your privacy back. If we are to keep anything resembling privacy online, we’ll need to fight for it and soon. Will you join me? I can’t guarantee that you’ll get the privacy you fight for. I can guarantee that if you don’t fight for it you will get the privacy you deserve.