By Mattathias Schwartz
IN 2014, TRACES of an unusual survey, connected to Facebook, began appearing on internet message boards. The boards were frequented by remote freelance workers who bid on “human intelligence tasks” on Mechanical Turk, an online marketplace run by Amazon. The “turkers,” as they’re known, tend to perform work that is rote and repetitive, like flagging pornographic images or digging through search engine results for email addresses. Most jobs pay between 1 and 15 cents. “Turking makes us our rent money and helps pay off debt,” one turker told The Intercept. Another turker has called the work “voluntary slave labor.”
The task posted by “Global Science Research” appeared ordinary, at least on the surface. The company offered turkers $1 or $2 to complete an online survey. But it came with two additional requirements. First, Global Science Research was interested only in American turkers. Second, the turkers had to download a Facebook app before they could collect payment. Global Science Research said the app would “download some information about you and your network … basic demographics and likes of categories, places, famous people, etc. from you and your friends.”
“Our terms of service clearly prohibit misuse,” said a spokesperson for Amazon Web Services, by email. “When we learned of this activity back in 2015, we suspended the requester for violating our terms of service.”
Although Facebook’s early growth was driven by closed, exclusive networks at colleges and universities, the company has gradually herded its users into agreeing to increasingly permissive terms of service. By 2014, anything a user’s friends could see was also potentially visible to the developers of any app the user chose to download. Some of the turkers noticed that the Global Science Research app appeared to be taking advantage of Facebook’s porousness. “Someone can learn everything about you by looking at hundreds of pics, messages, friends, and likes,” warned one, writing on a message board. “More than you realize.” Others were more blasé. “I don’t put any info on FB,” one wrote. “Not even my real name … it’s backwards that people put sooo much info on Facebook, and then complain when their privacy is violated.”
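How porous, exactly? Under the version of Facebook’s Graph API still available to existing apps in 2014, a single user’s access token was enough to page through that user’s friend list and read whatever data those friends had left visible, including their likes. The Python sketch below is a reconstruction for illustration only: it assumes the since-removed “friends_likes” permission, and the endpoint and field shapes are drawn from the deprecated API’s documented conventions, not from the app Global Science Research actually shipped.

    # Illustration only: the kind of friend-data request the 2014-era
    # Graph API permitted. The "friends_likes" permission and friend-data
    # endpoints were deprecated in 2014 and shut off for all apps in 2015.
    import requests

    GRAPH = "https://graph.facebook.com"  # unversioned, v1.0-era base URL
    USER_TOKEN = "..."  # access token granted by the one consenting user

    def fetch_friend_likes(token):
        """Yield (friend id, liked-page names) for each friend of the token's owner."""
        url = f"{GRAPH}/me/friends"
        params = {"access_token": token, "fields": "id,name,likes"}
        while url:
            page = requests.get(url, params=params).json()
            for friend in page.get("data", []):
                likes = friend.get("likes", {}).get("data", [])
                yield friend["id"], [like.get("name") for like in likes]
            url = page.get("paging", {}).get("next")  # follow pagination, if any
            params = {}  # "next" URLs already carry the query string

    for friend_id, liked_pages in fetch_friend_likes(USER_TOKEN):
        print(friend_id, len(liked_pages))

One consenting download, in other words, could fan out into hundreds of friends’ profiles.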
In late 2015, the turkers began reporting that the Global Science Research survey had abruptly shut down. The Guardian had published a report that exposed exactly who the turkers were working for. Their data was being collected by Aleksandr Kogan, a young lecturer at Cambridge University. Kogan founded Global Science Research in 2014, after the university’s psychology department refused to allow him to use its own pool of data for commercial purposes. The data collection that Kogan undertook independently of the university was done on behalf of a military contractor called Strategic Communication Laboratories, or SCL. The company’s election division claims to use “data-driven messaging” as part of “delivering electoral success.”
SCL has a growing U.S. spin-off, called Cambridge Analytica, which was paid millions of dollars by Donald Trump’s campaign. Much of the money came from committees funded by the hedge fund billionaire Robert Mercer, who reportedly has a large stake in Cambridge Analytica. For a time, one of Cambridge Analytica’s officers was Stephen K. Bannon, Trump’s senior adviser. Months after Bannon claimed to have severed ties with the company, checks from the Trump campaign for Cambridge Analytica’s services continued to show up at one of Bannon’s addresses in Los Angeles.
“You can say Mr. Mercer declined to comment,” said Jonathan Gasthalter, a spokesperson for Robert Mercer, by email.
The Intercept interviewed five individuals familiar with Kogan’s work for SCL. All declined to be identified, citing concerns about an ongoing inquiry at Cambridge and fears of possible litigation. Two sources familiar with the SCL project told The Intercept that Kogan had arranged for more than 100,000 people to complete the Facebook survey and download an app. A third source with direct knowledge of the project said that Global Science Research obtained data from 185,000 survey participants as well as their Facebook friends. The source said that this group of 185,000 was recruited through a data company, not Mechanical Turk, and that it yielded 30 million usable profiles. If those figures are accurate, each consenting participant opened a window onto the profiles of roughly 160 friends, on average. No one in this larger group of 30 million knew that “likes” and demographic data from their Facebook profiles were being harvested by political operatives hired to influence American voters.
Kogan declined to comment. In late 2014, he gave a talk in Singapore in which he claimed to have “a sample of 50+ million individuals about whom we have the capacity to predict virtually any trait.” Global Science Research’s public filings for 2015 show the company holding 145,111 British pounds in its bank account. Kogan has since changed his name to Spectre. Writing online, he has said that he changed his name to Spectre after getting married. “My wife and I are both scientists and quite religious, and light is a strong symbol of both,” he explained.
The purpose of Kogan’s work was to develop an algorithm for the “national profiling capacity of American citizens” as part of SCL’s work on U.S. elections, according to an internal document signed by an SCL employee describing the research.
“We do not do any work with Facebook likes,” wrote Lindsey Platts, a spokesperson for Cambridge Analytica, in an email. The company currently “has no relationship with GSR,” Platts said.
“Cambridge Analytica does not comment on specific clients or projects,” she added when asked whether the company was involved with Global Science Research’s work in 2014 and 2015.
The Guardian, which in late 2015 was the first to report on Cambridge Analytica’s work on U.S. elections, noted that the company drew on research “spanning tens of millions of Facebook users, harvested largely without their permission.” Kogan disputed this at the time, telling The Guardian that his turker surveys had collected no more than “a couple of thousand responses” for any one client. While it is unclear how many responses Global Science Research obtained through Mechanical Turk and how many it recruited through a data company, all five of the sources interviewed by The Intercept confirmed that Kogan’s work on behalf of SCL involved collecting data from survey participants’ networks of Facebook friends, individuals who had not themselves consented to give their data to Global Science Research and were not aware that they were the objects of Kogan’s study. In September 2016, Alexander Nix, Cambridge Analytica’s CEO, said that the company built a model based on “hundreds and hundreds of thousands of Americans” filling out personality surveys, generating a “model to predict the personality of every single adult in the United States of America.”
Shortly after The Guardian published its 2015 article, Facebook contacted Global Science Research and requested that it delete the data it had taken from Facebook users. Facebook’s policies give Facebook the right to delete data gathered by any app deemed to be “negatively impacting the Platform.” The company believes that Kogan and SCL complied with the request, which was made during the Republican primary, before Cambridge Analytica switched over from Ted Cruz’s campaign to Donald Trump’s. It remains unclear what was ultimately done with the Facebook data, or whether any models or algorithms derived from it wound up being used by the Trump campaign.
In public, Facebook continues to maintain that whatever happened during the run-up to the election was business as usual. “Our investigation to date has not uncovered anything that suggests wrongdoing,” a Facebook spokesperson told The Intercept.
Facebook appears not to have considered Global Science Research’s data collection to have been a serious ethical lapse. Joseph Chancellor, Kogan’s main collaborator on the SCL project and a former co-owner of Global Science Research, is now employed by Facebook Research. “The work that he did previously has no bearing on the work that he does at Facebook,” a Facebook spokesperson told The Intercept.
Chancellor declined to comment.
Cambridge Analytica has marketed itself as classifying voters using five personality traits known by the acronym OCEAN (Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism), the same model that University of Cambridge researchers use for in-house, non-commercial research. Whether OCEAN made a difference in the presidential election remains an open question: some have cast big data analytics as a magic bullet for drilling into the psychology of individual voters, while others are more skeptical. The predictive power of Facebook likes, at least, is not in dispute. A 2013 study by three of Kogan’s former colleagues at the University of Cambridge showed that likes alone could predict race with 95 percent accuracy and political party with 85 percent accuracy. Less clear is the power of likes as a tool for targeted persuasion. Cambridge Analytica has claimed that OCEAN scores can be used to drive voter and consumer behavior through “microtargeting,” meaning narrowly tailored messages. Nix has said that neurotic voters tend to be moved by “rational and fear-based” arguments, while introverted, agreeable voters are more susceptible to “tradition and habits and family and community.”
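The mechanics behind that 2013 result are not exotic. The study represented each user as a sparse row of likes across tens of thousands of pages, compressed those rows with singular value decomposition, and fit a standard regression on the resulting components. The Python sketch below reproduces the shape of that pipeline on synthetic stand-in data; the corpus size, the 100-component figure, and the labels here are illustrative assumptions, and random data will score near chance rather than the study’s 85 to 95 percent.

    # A minimal sketch of a likes-to-traits pipeline: sparse user-by-page
    # matrix -> SVD compression -> logistic regression on a binary trait.
    # All data here is synthetic; accuracy on it will hover around 0.5.
    import numpy as np
    from scipy.sparse import csr_matrix
    from sklearn.decomposition import TruncatedSVD
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_users, n_pages = 5_000, 2_000                   # hypothetical corpus
    likes = csr_matrix(rng.random((n_users, n_pages)) < 0.01, dtype=float)
    trait = rng.integers(0, 2, n_users)               # stand-in binary label

    # Compress thousands of like columns into 100 dense components.
    svd = TruncatedSVD(n_components=100, random_state=0)
    components = svd.fit_transform(likes)

    X_train, X_test, y_train, y_test = train_test_split(
        components, trait, test_size=0.2, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

With real like data in place of the random matrix, a few lines like these are what turn a table of page likes into a guess about race or party.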
Dan Gillmor, director of the Knight Center at Arizona State University, said he was skeptical of the idea that the Trump campaign got a decisive edge from data analytics. But, he added, such techniques will likely become more effective in the future. “It’s reasonable to believe that sooner or later, we’re going to see widespread manipulation of people’s decision-making, including in elections, in ways that are more widespread and granular, but even less detectable than today,” he wrote in an email.
Trump’s circle has been open about its use of Facebook to influence the vote. Joel Pollak, an editor at Breitbart, writes in his campaign memoir about Trump’s “armies of Facebook ‘friends,’ … bypassing the gatekeepers in the traditional media.” Roger Stone, a longtime Trump adviser, has written in his own campaign memoir about “geo-targeting” cities to deliver a debunked claim that Bill Clinton had fathered a child out of wedlock, and narrowing down the audience “based on preferences in music, age range, black culture, and other urban interests.”
Clinton, of course, had her own analytics effort, and digital market research is a normal part of any political campaign. But the quantity of data compiled on individuals during the run-up to the election is striking. Nix has claimed that Cambridge Analytica has “a massive database of 4-5,000 data points on every adult in America.” Immediately after the election, the company tried to take credit for the win, claiming that its data helped the Trump campaign set the candidate’s travel schedule and place online ads that were viewed 1.5 billion times. Since then, the company has been de-emphasizing its reliance on psychological profiling.
The Information Commissioner’s Office, an official privacy watchdog within the British government, is now looking into whether Cambridge Analytica and similar companies might pose a risk to voters’ rights. The British inquiry was triggered by reports in The Observer of ties between Robert Mercer, Cambridge Analytica, and the Leave.EU campaign, which worked to persuade British voters to leave the European Union. While Nix has previously talked about the firm’s work for Leave.EU, Cambridge Analytica now denies that it had any paid role in the campaign.
In the U.S., where privacy laws are looser, there is no investigation. Cambridge Analytica is said to be pitching its products to several federal agencies, including the Joint Chiefs of Staff. SCL, its parent company, has new offices near the White House and has reportedly been advised by Gen. Michael Flynn, Trump’s former national security adviser, on how to increase its federal business. (A spokesperson for Flynn denied that he had done any work for SCL.)
Years before the arrival of Kogan’s turkers, Facebook founder Mark Zuckerberg tried to address privacy concerns around the company’s controversial Beacon program, which quietly funneled data from outside websites into Facebook, often without Facebook users being aware of the process. Reflecting on Beacon, Zuckerberg attributed part of Facebook’s success to giving “people control over what and how they share information.” He said that he regretted making Beacon an “opt-out system instead of opt-in … if someone forgot to decline to share something, Beacon went ahead and still shared it with their friends.”
Seven years later, Facebook appears to have made the same mistake, but with far greater consequences. In mid-2014, however, Facebook announced a new review process, under which the company would make sure that new apps asked only for data they would actually use. “People want more control,” the company said at the time. “It’s going to make a huge difference with building trust with your app’s audience.” But existing apps were given a full year to comply with the new rules and submit to Facebook’s review of how they handled user data. By that time, Global Science Research already had what it needed.