Facebook data whistleblower: firm created algorithms that took 'fake news to the next level'

In a picture taken on May 15, 2012, the logo of the social networking site Facebook is displayed on a laptop screen inside a restaurant in Manila, Philippines.
TED ALJIBE | AFP | Getty Images

Updated: 4:30 p.m. | Posted: 11:33 a.m.

Facebook likes can tell a lot about a person. Maybe even enough to fuel a voter-manipulation effort like the one a Trump-affiliated data-mining firm stands accused of — and which Facebook may have enabled.

The social network is now under fire after The New York Times and The Guardian newspaper reported that former Trump campaign consultant Cambridge Analytica used data inappropriately obtained from roughly 50 million Facebook users to try to influence elections. Among that information were users' likes.

Facebook stock plunged 7 percent in trading Monday. The head of the European Parliament has promised an investigation, and members of the U.S. Congress and Connecticut's attorney general are seeking testimony or written responses. After failing to disclose the harvesting for two years, Facebook said Monday that it has hired an outside firm to audit Cambridge Analytica and its activities.

What's not clear, though, is exactly how effective Cambridge's techniques are.

Researchers in a 2013 study found that Facebook likes on hobbies, interests and other attributes can predict a lot about people, including sexual orientation and political affiliation. Computers analyze the data for patterns that might not be obvious, such as a liking for curly fries correlating with higher intelligence.
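The technique the researchers describe is, at its core, standard supervised learning: fit a model on users whose traits are known, then read off which likes carry predictive weight. A minimal sketch in Python, using scikit-learn on invented data (the matrix, labels and sizes here are illustrative, not from the study):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical binary matrix: rows are users, columns are pages they
# liked (1 = liked, 0 = did not). Real studies used far more data.
rng = np.random.default_rng(0)
likes = rng.integers(0, 2, size=(1000, 500))

# Hypothetical labels, e.g. 1 = affiliates with party A, 0 = party B.
affiliation = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    likes, affiliation, test_size=0.2, random_state=0)

# A simple linear model learns which likes co-occur with which label.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# The largest coefficients point to the most predictive likes -- the
# "curly fries" effect, where an individually odd page carries signal.
top = np.argsort(np.abs(model.coef_[0]))[-5:]
print("most predictive like columns:", top)
```

On real like data rather than random noise, the largest coefficients are what surface the counterintuitive correlations the study reported.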

Chris Wylie, a Cambridge co-founder who left in 2014, said the firm used such techniques to learn about individuals and create an information cocoon to change their perceptions. In doing so, he said, the firm "took fake news to the next level."

"This is based on an idea called 'informational dominance,' which is the idea that if you can capture every channel of information around a person and then inject content around them, you can change their perception of what's actually happening," Wylie said Monday on NBC's "Today."

Late Friday, Facebook said Cambridge improperly obtained information from 270,000 people who downloaded an app described as a personality test. Those people agreed to share data with the app for research — not for political targeting. And the data included who their Facebook friends were and what they liked — even though those friends hadn't downloaded the app or given explicit consent.

During the 2016 presidential election, Cambridge worked for both the primary campaign of Texas Republican Sen. Ted Cruz and President Donald Trump's general-election campaign. Trump's campaign paid Cambridge more than $6 million, according to federal election records, although officials have more recently played down that work.

Cambridge was backed by the conservative billionaire Robert Mercer and at one point employed Stephen Bannon, later the chief executive of Trump's campaign and a White House adviser, as a vice president.

The type of data mining reportedly used by Cambridge Analytica is fairly common, but is typically used to sell diapers and other products. Netflix, for instance, provides individualized recommendations based on how a person's viewing behaviors fit with what other customers watch.
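That style of recommendation is commonly built as user-based collaborative filtering: find viewers whose histories resemble yours, then rank what they watched. A toy sketch with made-up numbers (this assumes nothing about Netflix's actual system):

```python
import numpy as np

# Hypothetical viewing matrix: rows are users, columns are titles,
# values are how heavily each user watched a title (0 = never).
views = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def recommend_for(user, views, k=1):
    # Cosine similarity between this user's row and every other row.
    norms = np.linalg.norm(views, axis=1) * np.linalg.norm(views[user])
    sims = views @ views[user] / np.where(norms == 0, 1, norms)
    sims[user] = 0  # don't compare the user with themselves

    # Weight every title by how similar its watchers are to this user,
    # then suggest the highest-scoring title the user hasn't seen.
    scores = sims @ views
    scores[views[user] > 0] = -np.inf  # skip already-watched titles
    return np.argsort(scores)[-k:]

print("suggested title indices for user 1:", recommend_for(1, views))
```

The same machinery works whether the columns are diapers, film titles or political messages; only the catalog changes.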

But that common technique can take on an ominous cast if it's connected to possible elections meddling, said Robert Ricci, a marketing director at Blue Fountain Media.

Wylie said Cambridge Analytica aimed to "explore mental vulnerabilities of people." He said the firm "works on creating a web of disinformation online so people start going down the rabbit hole of clicking on blogs, websites etc. that make them think things are happening that may not be."

Wylie told "Today" that while political ads are also targeted at specific voters, the Cambridge effort aimed to make sure people wouldn't know they were getting messages aimed at influencing their views.

The Trump campaign has denied using Cambridge's data. The firm itself denies wrongdoing, and says it didn't retain any of the data pulled from Facebook and didn't use it in its 2016 campaign work.

Yet Cambridge boasted of its work after Cruz won the GOP caucuses in Iowa in 2016.

Cambridge helped differentiate Cruz from his similarly minded Republican rivals by identifying automated red light cameras as an issue of importance to residents upset with government intrusion. Potential voters living near the red light cameras were sent direct messages saying Cruz was against their use.

Even on mainstay issues such as gun rights, Cambridge CEO Alexander Nix said at the time, the firm used personality types to tailor its messages. For voters who care about tradition, it might push the importance of making sure grandfathers can offer family shooting lessons. For someone identified as introverted, a pitch might describe keeping guns for protection against crime.

It's possible that Cambridge tapped other data sources, including what Cruz's campaign app collected. Facebook declined to make officials available for interviews and didn't immediately respond to requests for information beyond its statements Friday and Monday. Cambridge also didn't immediately respond to emailed questions.

Facebook makes it easy for advertisers to target users based on nuanced information about them. Facebook's mapping of the "social graph" — essentially the web of people's real-life connections — is also invaluable for marketers.

For example, researchers can look at people's clusters of friends and get good insight into who is important and influential, said Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University. People who bridge different friend networks can have outsized influence when they post something, making them prime targets.
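In network-analysis terms, those bridge users are the ones with high betweenness centrality. A small sketch with the networkx library, on an invented two-cluster friend graph, illustrates the metric Albright is describing:

```python
import networkx as nx

# Two hypothetical friend clusters connected by a single person.
G = nx.Graph()
G.add_edges_from([
    ("a1", "a2"), ("a2", "a3"), ("a1", "a3"),  # cluster A
    ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),  # cluster B
    ("a3", "bridge"), ("bridge", "b1"),        # the connector
])

# Betweenness centrality: the share of shortest paths that pass
# through each node. The connector scores highest -- a prime target,
# because a post from them reaches both clusters.
scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
```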

Two-thirds of Americans get at least some of their news on social media, according to the Pew Research Center. While people don't exist in a Facebook-only vacuum, bogus information users saw on the site could later be reinforced by the "rabbit hole" of clicks and conspiracy sites on the broader internet, as Wylie described.