How ‘Big Brotherhood’ is building an all-seeing surveillance network

Imagine a huge bazaar teeming with activity. A sign at the entrance proclaims that traders come here from all over the world. Hundreds of billions of dollars change hands every year.

But this is not a normal market.

In the stalls and kiosks where one would expect to find fresh produce or household items, only one product is on sale: people’s personal data. What they eat for breakfast, where they sleep, what they do for work, what they believe and what matters to them, the type of television they watch, their age and sexual preferences, their insecurities and fears – there is data here on hundreds of millions of people. Some merchants sell dirty buckets of raw data that need cleaning. Others offer neat packages, perfectly adapted to the needs of their customers. A salesman peddles a list of rape victims – seventy-nine dollars per thousand – as well as a list of victims of domestic violence. There is no law enforcement official in sight.

This data market actually exists, but it is of course too large to fit in a single physical space. It is thriving, especially in the United States, because supply is plentiful and demand is high. The world is increasingly reflected in data. Satellites capture the entire planet every day at such fine resolution that “every house, boat, car, cow and tree on Earth is visible”. On the ground, almost every human encounter with technology leaves a trail of data that can be retrieved and resold. The supply is inexhaustible: we generate data simply by existing, and ever more of it is captured and stored. Thousands of companies have sprung up to trade in it. And business is good.

What happens to all this data? Buyers are usually uninterested in the sordid details of individuals’ lives, a fact that often gets lost in online privacy debates. The real value of data emerges when it is brought together in gigantic quantities to build computer systems that can find patterns and predict behavior. For these systems, each person ceases to be an individual and instead becomes a set of attributes – millennial, sausage dog owner, cheese addict – a “voodoo doll” that can be hashed, modified and aggregated with the attributes of thousands of others. The promise of “big data” is that some correlations only become visible when thousands or millions of cases are examined together. And when these patterns are revealed, strangers can know us in ways that we may not even know ourselves. It is a form of power.
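
To make the “voodoo doll” idea concrete, here is a minimal, purely illustrative Python sketch – not any broker’s actual system – of the logic described above: each person is reduced to a bag of attributes, hashed into an anonymous profile, and pooled with others so that patterns appear only in aggregate. The attribute labels and the simple co-occurrence count are invented for illustration.

```python
# A toy illustration of the "voodoo doll" idea: people reduced to attribute
# sets, hashed into anonymous profiles, and aggregated to surface patterns.
# The attributes and the co-occurrence heuristic are invented for this sketch.
import hashlib
from collections import Counter
from itertools import combinations

profiles = [
    {"millennial", "sausage dog owner", "cheese addict"},
    {"millennial", "cheese addict", "late-night shopper"},
    {"retiree", "gardener", "cheese addict"},
    # ...imagine hundreds of thousands more rows, bought and merged...
]

def anonymise(attributes):
    """Replace a person with an opaque ID derived from their attributes."""
    digest = hashlib.sha256("|".join(sorted(attributes)).encode()).hexdigest()
    return digest[:12]

# Each person becomes a "voodoo doll": an anonymous ID plus a bag of attributes.
dolls = {anonymise(attrs): attrs for attrs in profiles}

# Count how often pairs of attributes travel together across the whole pool.
pair_counts = Counter()
for attrs in dolls.values():
    for pair in combinations(sorted(attrs), 2):
        pair_counts[pair] += 1

# The "insight" is purely statistical and only visible in aggregate.
for (a, b), n in pair_counts.most_common(3):
    print(f"{a} + {b}: seen together {n} times")
```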

***

In the twentieth-century imagination, surveillance meant a secret agent watching unseen from behind the curtain of a darkened room. Some still cling to this image. Steven Pinker, for example, offers the reassurance that we have long had the ability to install surveillance cameras in “every bar and every room”, but that we have not done so because governments lack the will and the means to impose such surveillance on a turbulent populace accustomed to saying what it wants. Pinker, however, is taking comfort in a world that no longer exists. Today’s authorities don’t have to force us to place surveillance devices in our homes, and they don’t need to see us in order to observe us. We expose our lives to scrutiny every time we interact with a “smart” phone, computer or home device; every time we visit a website or use an app. And personal data usually finds its way to those who want it most. So when US police wanted information on Black Lives Matter activists, they didn’t need to spy on them with cameras in bars. They bought what they needed on the data market. Facebook had a wealth of data on users interested in Black Lives Matter, which it sold to third-party brokers, who resold it to law enforcement authorities. For juicy data that can’t be found on the open market, Facebook has a special portal through which police can request photos, data on ad clicks, apps used, friends (including deleted ones), search content, deleted content, and likes and pokes. Facebook provides data in 88% of the cases in which it is requested.

Even in the physically “private” space of the home, there is already little escape from data-collecting devices. If you used Zoom to stay in touch with your family during the Covid-19 crisis, Zoom will have sent Facebook details of your location, when you opened the app, the model of your laptop or smartphone, and a “unique advertiser identifier” allowing companies to target you with advertisements. Zoom isn’t even owned by Facebook; it simply used some of Facebook’s software. A commercially available dataset recently revealed more than thirty devices that used hookup or dating apps in secure areas of Vatican City – areas usually accessible only to senior members of the Catholic Church. It’s the kind of secret that would probably have remained hidden in the past.

The future is not Big Brother in the sense of a single government monolith watching us all at once. Rather, it is a “Big Brotherhood” of hidden, staring eyes – some belonging to the state, but countless others belonging to private parties who watch us while remaining invisible.

Another anxiety inherited from the 20th century is the fear that anonymity is no longer possible; that even if we try to hide, powerful others will always know exactly who we are and where to find us. In recent years, this fear has led to concern about the spread of facial recognition systems. In fact, our faces are only one means of identifying and locating us. The unique identifiers of our smartphones and payment devices telegraph our presence everywhere we go. Using location data from the phones of millions of people, it took reporters only minutes to track the whereabouts of the President of the United States. There are systems that can monitor people’s heartbeats and read their irises remotely. Others use WiFi signals to identify individuals through walls. With the right technology, a person’s gait can identify them as readily as a fingerprint. In the future, it may be possible to “Google space-time” to find out where any one of us was at a specific date and time.

In the longer term, the anxiety of being identified will eventually be supplanted by the anxiety of being analysed. We are not as mysterious as we like to think, although the capabilities of computers are sometimes overestimated. Systems are being developed to interpret our feelings and moods from the smallest physical clues. They are said to be able to read the mood of a crowd in the blink of an eye. They can tell whether we are bored or distracted from the small movements of our faces. They can tell whether we are sad from the way we walk. They can detect cognitive impairment from the way we handle our smartphones. They can predict our mental state from the content of our social media posts. Famously, Facebook likes can be used to predict a person’s political preferences 85% of the time, their sexuality 88% of the time, and their race 95% of the time.

Using data collected from a thousand sources, today’s systems examine us more closely and more thoroughly than any government agent ever could. And as we will see in the next chapter, this allows them not only to interpret our cognitive states but also to influence them. Tomorrow’s technologies will be even more powerful.

