Since its publication, George Orwell’s “1984” has drawn public attention to the looming threat of a governmental surveillance state that would mark the end of individual privacy and liberty. However, decades after Orwell published his prophetic novel, some are warning that an unprecedented surveillance threat from an entirely different kind of enterprise is already upon us.
Coined by Dr. Shoshana Zuboff, social psychologist and Harvard University professor emerita, “surveillance capitalism” refers to the alleged business model that modern technology and data companies operate under: harvesting, compiling and selling profiles of personal user data for profit.
“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data,” Zuboff wrote in her 2019 book, “The Age of Surveillance Capitalism.” “Digital connection is now a means to others’ commercial ends. They accumulate vast domains of new knowledge from us, but not for us.”
Companies need a profit model in order to function; they need to sell products, services or some combination of the two in order to establish a revenue stream that offsets their costs. However, many big-name tech companies offer their software and services to consumers for free. This leads some to question how their record-setting profits are being earned.
Roger McNamee, an early investor in Facebook and author of “Zucked: Waking Up to the Facebook Catastrophe,” has described the history of Silicon Valley.
“The first 50 years of Silicon Valley, the industry made products – hardware, software, sold them to customers, [it was a] nice, simple business,” McNamee said. “For the last 10 years, the biggest companies in Silicon Valley have been in the business of selling their users.”
As internet users access news and information through search engines and social media, those same companies access and compile data on their users en masse.
According to Dr. Christopher Leberknight, a computer science professor at Montclair State University, behind what feels like a one-way data stream to users is a systematic business model of collecting and selling user information to the highest bidder.
“You may have also noticed that if you and your friend visit the same website you are often both presented with different advertisements,” Leberknight said. “The reason is that data is collected from you and your friend’s browsing activity and different consumer profiles are created from this data. The ability to display different ads to users visiting the same site is intrinsically linked to intentional data collection.”
According to Leberknight, two of the biggest offenders in the U.S. are household names: Google and Facebook. Google offers Gmail and its suite of cloud office services, but it also owns companies like YouTube and Waze and is in the process of purchasing Fitbit for a reported $2.1 billion. Facebook owns the popular social platforms Instagram and WhatsApp.
Such reach lets these companies track user activity and data across platforms and in the real world, enabling the creation of anonymized user profiles.
This data collection doesn’t stop at browsing history or information that users consensually volunteer about themselves. According to Zuboff, an unequal trade-off is being made whenever consumers use these services.
“We think we know what we’re giving them, but it’s not the case,” Zuboff said.
According to Zuboff, users telegraph information about themselves through things as simple as their grammar.
“These are indications that can be translated into what’s called the Five Factor Personality test,” Zuboff said. “And with that other kinds of things can be inferred: your sexual orientation, your political orientation, all kinds of things about yourself that you never intended to disclose.”
The Five Factor Personality Model, also known as the OCEAN Model, rates people on the traits of openness, conscientiousness, extraversion, agreeableness and neuroticism. Based on online activity, companies can compile highly specific OCEAN profiles on users, known as “psychographic” data. These profiles need not contain personally identifiable information like names or Social Security numbers in order to be built and used.
A report leaked to “The Australian” newspaper showed that Facebook employees were tracking the emotional states of teenagers and young adults in Australia and New Zealand through their posts, status updates and uploaded images. The report was allegedly part of research into how even emotional states might be used in advertising, or to sell products to age groups that are more psychologically or emotionally vulnerable.
Widespread and deeply personal psychographic data collection opens the way for more insidious uses of data. One such example is the recent case of data science firm Cambridge Analytica, which reportedly harvested information from 50 million Facebook profiles through data breaches and orchestrated voter manipulation in the 2016 U.S. presidential election.
Alexander Nix, former CEO of the now-defunct Cambridge Analytica, spoke at the 2016 Concordia Annual Summit about the firm’s campaign for presidential candidate Ted Cruz. Nix argued that psychographics and personalized advertisements based on the OCEAN model were the future of marketing.
“[T]he Second Amendment might be a popular issue amongst the electorate. If you know the personality of the people you’re targeting, you can nuance your messaging to resonate… with [your] audience,” Nix said.
Nix went on at the summit to discuss how such messaging is tailored to different personality types.
“For a highly neurotic and conscientious audience, you’re going to need a message that is rational and fear-based or emotionally-based. In this case, the threat of a burglary and the insurance policy of a gun is very persuasive,” Nix said. “Conversely, for a closed and agreeable audience…people who care about tradition[,] family and community, this could be the grandfather who taught his son to shoot, and the father who will in turn teach his son. Obviously, talking about these values is going to be much more effective in communicating your message.”
Changing minds in a targeted, preplanned way on an individualized scale is the essence of psychographic marketing. Cambridge Analytica, however, controversially applied the same techniques at a population scale, shifting the focus from what Zuboff calls “guaranteed commercial outcomes” to “guaranteed political outcomes.”
Beyond manipulating votes, or potentially even beliefs, at the population level, psychographics and these new depths of privacy invasion carry dangerous long-term implications for the human condition.
In light of surveillance capitalism and its threats to privacy, thought and choice, users still hold the power to inform themselves, to approach what they read on the internet with skepticism and rationality, and to keep their digital presence minimal.
Leberknight, who researches digital democracy as well as free communication on the internet, advises a cognizant and cautious usage of the internet.
“I would like to encourage people to think more about what they’re sharing online and consider why companies would want this data beyond just improving their applications,” Leberknight said. “While we are currently a very free and open society, ask yourself what things could possibly occur that could change this? Be knowledgeable about what data is collected, but also about what information or content you voluntarily publish.”