Effect of Cambridge Analytica’s Facebook ads on the 2016 US Presidential Election

Cambridge Analytica, an advertising company and an offshoot of the SCL Group, was founded in 2013 and went defunct as of May 1st, 2018.

The company had a political and a commercial wing, and according to its website, the political wing “combines the predictive data analytics, behavioral sciences, and innovative ad tech into one award winning approach.” The company worked for the Ted Cruz campaign in the Republican primary elections and helped him win the second-highest number of votes, behind only Donald Trump.

The company then worked for the Trump campaign in the general election and helped him win.

One of the strategies employed by Cambridge Analytica in the Ted Cruz Republican primary campaign was running targeted advertisements to Facebook users depending on each user’s personality.

However, the company was made to delete all the data it obtained via Facebook in 2015 (before the general election between Trump and Clinton), and so this strategy was most probably used only by the Cruz campaign.

This article will illustrate how all three of the previously mentioned tactics (behavioral science, data analytics, innovative advertising technology) are extremely important when running micro-targeted Facebook advertisements.

Furthermore, this article will depict how the success of the ads was limited and exaggerated by the media, by CEO Alexander Nix, and by whistle-blower and ex-employee Christopher Wylie, each for different reasons.

Before analyzing the effectiveness of the Facebook advertisements, it’s important to understand in more detail the company’s motives, and why it became a mainstream news topic.

The company aimed to persuade users to vote a certain way by showing different advertisements on the same issue, to different people.

The persuasion was done by gathering information on the Facebook page likes of users and using that data to create models that predict personality.

The company that sourced the personality data for Cambridge Analytica, Global Science Research, headed by Aleksandr Kogan, was found to have broken Facebook terms of service and so Facebook asked all the companies that Kogan gave the data to, including Cambridge Analytica, to delete all Facebook data in 2015.

The deletion of data happened before the 2016 general election between Trump and Clinton began, and so it is claimed by Cambridge Analytica that none of the data from GSR was used by them in helping Trump’s campaign.

News about the company came to light when a former employee turned whistle-blower Christopher Wylie approached the media.

Wylie and the media have focused on two aspects.

One aspect was the legality of Cambridge Analytica’s actions in getting users’ Facebook data, and the second was the effectiveness of the Facebook advertisements and the ethical questions about them.

Legal implications are beyond the scope of this paper, given that many of the investigations are on-going, but the success of the Facebook advertisements will be evaluated.

To understand Cambridge Analytica’s strategy for Facebook advertisements, one must understand how behavioral science, data analytics, and innovative ad tech all tie together.

Behavioral science refers to fields of study relating to human or animal behavior and actions.

Cambridge Analytica’s end goal was to persuade users to vote in favor of their client, which involved displaying messages that would most likely result in a change of behavior.

The following example, taken from Alexander Nix’s presentation at the Concordia summit, illustrates the importance of keeping behavioral science in mind when conveying messages.

Imagine you are walking on a beach and see the sign below, to the left.

Image sources: http://massrealestatelawblog.com/wp-content/uploads/sites/9/2012/07/public-beach-sign.jpg and https://encryptedtbn0.gstatic.com/images?q=tbn:ANd9GcQUmyDglOdnQ6JC2AqpK6b0veP97Zw5LwScJIdTVZsoX-YyXMoa

The sign on the left may prompt you to turn away, but if you saw the sign on the right, you would have a far stronger urge to do so.

The motivation of both signs is to ensure that you leave the area, but the sign on the right resonates stronger, probably for all people, because of the fear of an attack by a shark.

In this example, everyone would probably have the strongest urge to undertake the action of leaving the beach after seeing the message on the right.

In the political realm, however, depending on pre-existing beliefs and values, different people would be better persuaded with varying messages in support of the same candidate or issue.

Cambridge Analytica thus hypothesized that if they had a measure of someone’s personality, they could curate an advertisement that would be most convincing to them.

Therefore, they would be able to convince many different people with different personalities, to have the same opinion on a particular issue or candidate by using personalized, targeted advertisements.

To persuade potential voters to vote a certain way through the use of behavioral science, one would need some information about who the target for the ad is.

To do this, Cambridge Analytica first had to find a way to obtain raw data about millions of users.

Then they had to use that data to make predictions about users’ personalities.

This is where data analytics, the science of extracting useful information from raw data, comes in.

Cambridge Analytica turned to Aleksandr Kogan, a psychologist and researcher at Cambridge University.

Christopher Wylie, then an employee of the company, managed the relationship with Kogan, and Kogan set up a company called Global Science Research (GSR).

Kogan then created a survey which asked respondents about their personality and used Qualtrics, a third-party online survey vendor that recruits participants by paying each one a few dollars, to get participants.

The survey asked for respondents’ consent to access their Facebook data, including the pages that they liked.

Crucially, Kogan also got access to respondents’ friends’ page likes, since this was a feature Facebook offered to developers.

Kogan mentioned in his testimony to the British parliament’s ‘Digital, Culture, Media and Sport committee’ how, from 2006 to 2015, a core feature of the Facebook API was that you could gather data on a user’s friends as long as those friends had not turned off certain security settings.

Friends of the respondent who did not change certain privacy settings on Facebook, by default also gave up access to their page likes.

Approximately 270,000 people took this survey, which gave Kogan access to around 30 million people’s data.

This method of mining data was not a data breach, in that users gave their consent to giving their data away to GSR, and the friends of the users who took the survey could have changed their privacy setting to ensure that GSR did not extract their data.

However, perhaps Facebook could have better communicated to users that such a privacy setting existed.

Currently, Facebook mentions this in its Terms of Service, which is probably not read carefully by most people.

Perhaps Facebook can re-work this, by making users take a quiz, ensuring that they fully understand the contents of the Terms of Service, before creating an account.

Cambridge Analytica then used the Facebook data of each user to predict his or her personality.

They aimed to quantify personality by scoring individual users on five key personality traits: openness, conscientiousness, extraversion, agreeableness, and neuroticism, known as the Big Five or OCEAN personality model.

Each person’s score on each attribute was determined by examining the Facebook page likes of each user and creating predictive models of personality based on page likes.

The final strategy mentioned on Cambridge Analytica’s website was innovative ad technology.

It refers to the more individualized ways with which the company aimed to reach out to users, including Facebook advertisements.

Nix described this technique as the opposite of blanket advertising which entails displaying the same message to millions of users.

He explained how “today communication is becoming ever increasingly targeted.

It’s being individualized for every single person in this room”.

(Nix) Thus, innovative ad tech, or as Nix termed it in a presentation at the Concordia summit, addressable ad tech, relied on taking the results obtained from behavioral science and data analytics and displaying them to viewers through highly individualized channels, one of which was Facebook advertisements.

David Sumpter, a professor of applied mathematics at the University of Uppsala in Sweden, analyzed the accuracy of Cambridge Analytica’s models in his book ‘Outnumbered.’ The company used a regression model to predict personality. He describes a regression model as a “model that takes the data we already have about a person and uses them to predict something we don’t know about him or her” (Sumpter, pg. 45).
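To make Sumpter’s description concrete, here is a minimal sketch of such a regression model. Everything in it (the number of pages, the weights, the noise level) is synthetic and illustrative; nothing here reflects Cambridge Analytica’s actual data or model.

```python
# Sketch of a regression from page likes to one OCEAN trait (openness).
# All data below is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_pages = 200, 10

# Each row is a user; a 1 means the user liked that page.
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

# Pretend the "true" openness score depends on three pages plus noise.
true_weights = np.zeros(n_pages)
true_weights[[1, 4, 7]] = [0.8, -0.5, 0.6]
openness = likes @ true_weights + rng.normal(0.0, 0.1, n_users)

# Fit ordinary least squares: likes (plus an intercept) -> trait score.
X = np.column_stack([likes, np.ones(n_users)])
coef, *_ = np.linalg.lstsq(X, openness, rcond=None)
predicted = X @ coef
```

On this clean synthetic data the fit is nearly perfect; Sumpter’s point, developed later in the chapter, is that real like data carries a far weaker signal.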

Based on the personality of the user, different advertisements were shown on the same issue in hopes of convincing different users of the same message.

One such example, taken from Alexander Nix’s presentation at the Concordia summit is shown below.

The motive of both advertisements is to persuade viewers to have an opinion in favor of the second amendment which allows people the right to keep and bear arms.

People high in neuroticism and conscientiousness tend to worry a lot and prefer order, and so the message on the left would resonate more.

Closed and agreeable people put other people’s needs before their own but don’t enjoy new experiences, and so the message on the right would resonate more.

Christopher Wylie believes that this crosses a line, and is no longer persuasion but rather, manipulation.

He claimed that the company operated in an “ethical grey area” and “attempted to manipulate voters by latching onto their vulnerabilities” (McCausland). However, “using devious methods to subvert the public’s electoral preference is nothing new” (Berghel), and in the United States it’s been a “recurring companion to elections,” so one should not be surprised by the use of allegedly manipulative tactics.

The idea of microtargeting users based on personality seems far more nuanced than traditional marketing methods, wherein the same message gets shown to everyone.

Sumpter analyzed the accuracy of Cambridge Analytica’s regression models in his book ‘Outnumbered.’ He used a publicly available dataset created by the psychologist Michal Kosinski and his colleagues, an anonymized database of 20,000 Facebook users. Of those 20,000 users, 19,742 were US-based, and of that group, 4,744 had registered their preferred political party (Democratic or Republican) and had also liked over 50 Facebook pages.

Sumpter first aimed to test the accuracy of regression models in general, and so created a model which predicted political party allegiance based on Facebook page likes.

He concluded that the regression model worked “very well for hardcore Democrats and Republicans” but “does not reveal anything about the 76 percent of users who did not put their political allegiance on Facebook” (Sumpter, pg. 52–53).

He also describes how, just because the model may have revealed, for instance, that Democrats tend to like Harry Potter, it does not necessarily mean that other Harry Potter fans are Democrats.

Therefore, a strategy employed by Democrats aimed at getting Harry Potter fans to vote may not necessarily benefit them.

Another limitation that Sumpter ran into was that “the regression model only works when a person has made more than 50 ‘likes’ and, to make reliable predictions, a few hundred ‘likes’ are required” (Sumpter, pg. 53). He found that in that dataset only 18 percent of users ‘liked’ more than 50 sites.

However, this problem appears to be limited only to the dataset used by Sumpter since, in 2013, a study showed that the average number of Facebook page likes per person in the US was 70, and that number had been steadily rising.

Arguably the biggest limitation Sumpter found in the regression models used by Cambridge Analytica was the limited accuracy with which personality can be predicted from Facebook likes at all.

The data set that Sumpter used had information for each individual’s score on the OCEAN personality model.

First, Sumpter created a regression model that determined personality from Facebook page likes.

Then he randomly selected two people from the dataset and ranked them based on neuroticism according to the prediction made by the model about the level of neuroticism of each person.

Sumpter then compared this with the ranking based on neuroticism according to the actual data of the two individuals given in the dataset.

He found that the ranking produced by the model matched the correct ranking only 60 percent of the time.

He then tried this method on the other four personality traits and got very similar results, around 60 percent.

The most accurate trait to predict was Openness, but even that was accurate only about 67 percent of the time.
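Sumpter’s pairwise comparison is straightforward to reproduce in outline. The sketch below uses synthetic scores (not Sumpter’s actual data): it measures how often a model’s predicted ordering of two randomly chosen people matches their actual ordering, and shows the 50 percent coin-flip baseline against which the 60 percent figures should be judged.

```python
# Sketch of Sumpter's pairwise ranking test on synthetic trait scores.
import random

def pairwise_ranking_accuracy(predicted, actual, n_pairs=10_000, seed=42):
    """Fraction of random pairs whose predicted ordering matches the
    actual ordering on the trait."""
    rng = random.Random(seed)
    n = len(predicted)
    correct = 0
    for _ in range(n_pairs):
        i, j = rng.sample(range(n), 2)
        # The orderings agree when both differences have the same sign.
        if (predicted[i] - predicted[j]) * (actual[i] - actual[j]) > 0:
            correct += 1
    return correct / n_pairs

rng = random.Random(0)
actual = [rng.gauss(0, 1) for _ in range(500)]       # "true" trait scores
random_pred = [rng.gauss(0, 1) for _ in range(500)]  # a useless model
perfect_pred = list(actual)                          # a perfect model
```

On this measure a perfect model scores 1.0 and random guessing hovers around 0.5, so an accuracy of roughly 60 percent sits much closer to chance than to perfection.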

The difficulty in predicting people’s personality from their Facebook page likes is backed up by Aleksandr Kogan and Alexander Nix.

In Aleksandr Kogan’s testimony to British parliament’s ‘Digital, Culture, Media and Sport committee,’ he stated that it was “scientifically ridiculous” (UK, RT) to claim that the regression models led to accurate results.

In the written evidence that he submitted to the parliament, he stated that the scores on each personality trait predicted by the algorithm were more accurate than randomly guessing a score on each trait.

However, it was less accurate than merely guessing everyone was precisely in the middle for each attribute, i.e., equally introverted and extroverted, equally closed and open, and so on.

Furthermore, the model correctly predicted all five of the personality traits for only 1% of people but was wrong about all five traits for 6% of people.

Kogan claimed that he did not know that the data collected from his app would be used to create such regression models.

He argues that there is a much more effective way to get data to do targeted advertising on Facebook.

Cambridge Analytica took the data generated by Aleksandr Kogan to predict people’s personalities based on the OCEAN model.

Thus, Cambridge Analytica was able to run advertisements on only the roughly 30 million people whose information was collected by Kogan.

A better method, he argues, is to use the Facebook advertising platform.

The Facebook advertising platform enables developers to build audiences similar to a manually selected audience.

Kogan describes how it would be much easier to get details of certain people that accurately represent a particular personality trait, and then use Facebook’s tools to create a lookalike audience to reach a group of people with the same personality trait as the original group.
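The lookalike approach Kogan describes can be illustrated with a toy similarity search. Facebook’s actual lookalike algorithm is proprietary and far more sophisticated; the sketch below simply ranks users by the cosine similarity of their (synthetic) like vectors to the average profile of a hand-picked seed group.

```python
# Toy illustration of building a "lookalike audience" from a seed group.
# Synthetic data; this is NOT Facebook's actual (proprietary) algorithm.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_pages = 100, 20
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

seed_ids = [0, 1, 2, 3, 4]                  # users known to share a trait
seed_profile = likes[seed_ids].mean(axis=0)  # the seed group's average likes

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

others = [i for i in range(n_users) if i not in seed_ids]
scores = {i: cosine(likes[i], seed_profile) for i in others}

# The lookalike audience: the ten users most similar to the seed profile.
lookalike = sorted(others, key=lambda i: scores[i], reverse=True)[:10]
```

The appeal of this approach is that it needs only a small, well-characterized seed group rather than personality predictions for every individual user.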

CEO Alexander Nix himself corroborates these results.

In his testimony to members of the British parliament’s ‘Digital, Culture, Media and Sport committee,’ he contended that Kogan’s dataset wasn’t very useful and that it made up only a tiny part of their overall strategy for the 2016 United States presidential election.

How do we reconcile this admission with his presentation at the Concordia summit, in which Nix openly bragged about the ability to wield Facebook data to tune an incredibly powerful instrument that significantly impacts elections? The answer came from Nix himself in his testimony, claiming that he has in the past used hyperbole when pitching his company to potential clients.

This view is corroborated by Kogan, who mentions how “Nix is trying to promote (the personality algorithm) because he has a strong financial incentive to tell a story about how Cambridge Analytica have a secret weapon” (Sumpter, pg. 54).

Nix was not the only person to exaggerate the influence of the company on the 2016 election.

Considering headlines in the mainstream media, most people may believe that regression models predicting personality from Facebook page likes, and by extension Cambridge Analytica’s models, were very accurate.

Thus, how do we reconcile headlines such as ‘How Facebook knows you better than your friends do’ by Wired magazine, ‘Facebook knows you better than members of your family’ by both CNBC and The Telegraph, and ‘Facebook knows you better than anyone else’ by The New York Times? All of these articles cited the same study, conducted by Wu Youyou, Michal Kosinski, and David Stillwell, titled ‘Computer-based personality judgments are more accurate than those made by humans.’ Christopher Wylie also cited the same study in his written evidence submitted before his testimony to the British parliament’s ‘Digital, Culture, Media and Sport committee,’ and states that it is evidence that the company’s methods are successful.

This study first had participants take a test with 100 questions to determine their scores on the OCEAN personality model.

The study then compared the predictive capabilities of a regression model created using each individual’s Facebook page likes with the predictive capabilities of friends, relatives, colleagues, and spouses of the individual.

They measured the predictive capabilities of the other people by making these people answer a questionnaire with ten questions.

They found that the regression model’s predictions correlated better with the 100-item test than did the other people’s answers to the 10-item questionnaire.

However, the researchers themselves concede that “Our study is limited in that human judges could only describe the participants using a 10-item-long questionnaire on the Big Five traits. In reality, they might have more knowledge than what was assessed in the questionnaire” (Youyou). Thus, the implication drawn by these articles, such as the first line of an article in CNBC, “A computer can determine your personality better than your closest friends and family by using your Facebook ‘likes’ to judge your character, university researchers found,” is far exaggerated compared to the ground reality.

Brian Connelly, who studies personality in the workplace and is an associate professor in the Department of Management at the University of Toronto, Scarborough, claims that while the research paper cited was “interesting and provocative,” “the media are sensationalizing the findings” (Sumpter, pg. 55). He said that an accurate news headline reflecting the conclusion of the study would be “Preliminary findings suggest that Facebook knows you about as well as a close acquaintance (but we’re holding out to see whether Facebook can predict your behavior)” (Sumpter, pg. 55).

The media, however, fueled by Christopher Wylie, have claimed that Cambridge Analytica’s microtargeting was hugely influential in helping Trump get elected.

In Christopher Wylie’s written evidence to the British parliament’s ‘Digital, Culture, Media and Sport committee,’ he cited a series of studies that, in his words, “highlight the efficacy of using social media, natural language or Internet clickstream data for psychological profiling or mass persuasion.” However, only a select few of these studies deal with predicting personality from Facebook likes. Among those, none conclude that such models would “break Facebook” or that they are so powerful that they should be termed a “psychological warfare tool,” as Wylie described them (The Guardian).

Furthermore, Wylie also claimed that one of the reasons he became a whistle-blower, was because the company’s Facebook strategy helped Trump win the election.

He claimed that “Donald Trump makes it click in your head that this has a much wider impact” (The Guardian) but did not mention that the Facebook data was deleted by the company, before the general election.

One prominent instance of hyperbole in the media is a segment on ‘The Daily Show’ by Trevor Noah, on the 21st of March 2018.

Noah perpetuated the notion that Cambridge Analytica’s models were exceedingly accurate and played a series of clips to support this narrative.

The first clip was from a CNN story where the presenter stated that “the level of what can be predicted about you based on what you like on Facebook is higher than what your wife would say about you, what your parents or friends could say about you.” The second was from Christopher Wylie, who stated that the company “will try to pick whatever mental weakness or vulnerability that we think you have and try to warp your perception of what’s real around you.” Noah then implied that the Trump campaign used this data, stated that Cambridge Analytica was operating “ten levels” above traditional advertising, and played a clip which said “the data firm hired by Donald Trump’s presidential election campaign used secretly obtained information from tens of millions of unsuspecting Facebook users to directly target potential American voters” and that “the entire operation centered around deception, false grassroots support, and a strategy that seems to border on electronic brainwashing.”

Noah does not show these clips in full context, only small snippets; furthermore, he never mentions that Cambridge Analytica deleted, or at least claimed to have deleted, all Facebook data rather than using it to help Trump.

Claims made in all the clips are hyperbolic, and the scientific backing behind such claims is not provided.

Given the evidence that the company’s regression model was not as accurate as claimed, it is worth examining some potential reasons behind this hyperbole.

Simple and sensationalized stories and headlines probably get better viewership, due to the oversimplified, clickbait nature of such articles.

Furthermore, claims that Donald Trump somehow benefited from Cambridge Analytica’s Facebook advertisements are also inaccurate given that the company most probably deleted the Facebook data before the general election between Trump and Clinton began.

However, perhaps polarizing stories critical of Trump get higher viewership than more balanced stories.

Furthermore, Wylie did not leave the company on good terms.

He created a competing company called Eunoia Technologies, which essentially did microtargeting and psychographic modeling just like Cambridge Analytica.

He too received data from Global Science Research and was subsequently asked to delete it.

He then “unsuccessfully pitched his services to a pro-Brexit faction but did land a $100,000 contract from a Canadian legislative entity” (Freeze) and “pitched Republican political operative Corey Lewandowski on microtargeting tools that could be deployed on behalf of Donald Trump’s 2016 presidential campaign” (Mac).

He thus has an incentive to question the ethics of Cambridge Analytica and create negative publicity for the company, which may explain his exaggeration in describing the impact of the company’s efforts.

The impact of Facebook advertisements run by Cambridge Analytica on the 2016 US Presidential election has been overstated by CEO Alexander Nix, the media, and Christopher Wylie.

Regression models are not yet very accurate, and the micro-targeted Facebook advertisement strategy was probably not used for the Trump campaign.

While the Facebook advertisements run by Cambridge Analytica may not have had that much of an impact on the 2016 US election, this is just the beginning in the field of micro-targeted advertising through social media.

With 2.3 billion monthly active users and 1.5 billion daily active users, Facebook is a tool unlike any other.

The fact that your friends taking a quiz could extract your data on Facebook may have caught many people by surprise.

Thus, as the number of users on Facebook continues to grow around the world, Facebook has the critical responsibility of clearly communicating privacy options to users.

This is just the beginning of the use of micro-targeted Facebook advertisements; don’t be surprised if, before a future election, you see political ads on Facebook that strongly resonate with you.

Works Cited

Berghel, Hal. “Malice Domestic: The Cambridge Analytica Dystopia.” Computer, vol. 51, no. 5, 2018, pp. 84–89, doi:10.1109/mc.2018.2381135.

Davies, Harry. “Ted Cruz Campaign Using Firm That Harvested Data on Millions of Unwitting Facebook Users.” The Guardian, Guardian News and Media, 11 Dec. 2015, www.theguardian.com/us-news/2015/dec/11/senator-ted-cruz-president-campaign-facebook-user-data.

Freeze, Colin. “Business Plan Hatched by Christopher Wylie Sheds Light on Whistle-Blower’s Ambitions, Anxieties about Big Data.” The Globe and Mail, 5 May 2018, www.theglobeandmail.com/politics/article-business-plan-hatched-by-christopher-wylie-sheds-light-on-whistle/.

Kennedy, Merrit. “‘They Don’t Care’: Whistleblower Says Cambridge Analytica Aims To Undermine Democracy.” NPR, 27 Mar. 2018, www.npr.org/sections/thetwo-way/2018/03/27/597279596/they-don-t-care-whistleblower-says-cambridge-analytica-seeks-to-undermine-democr.

Lafferty, Justin. “How Many Pages Does The Average Facebook User Like?” Adweek, 11 Apr. 2013, www.adweek.com/digital/how-many-pages-does-the-average-facebook-user-like/.

Mac, Ryan. “The Cambridge Analytica Whistleblower Wanted His New Company To Work With Trump Campaign’s Manager.” BuzzFeed News, 28 Mar. 2018, www.buzzfeednews.com/article/ryanmac/cambridge-analytica-chris-wylie-eunoia-trump-campaign.

McCausland, Phil, and Anna Schecter. “Trump-Linked Consultants Harvested Data from Millions on Facebook.” NBCNews.com, NBCUniversal News Group, 17 Mar. 2018, www.nbcnews.com/news/us-news/cambridge-analytica-harvested-data-millions-unsuspecting-facebook-users-n857591.

Channel 4 News. “Former Cambridge Analytica CEO Alexander Nix Faces MPs (Full Version).” YouTube, 6 June 2018, www.youtube.com/watch?v=weQ9E6e3aJo&t=5747s.

Guardian News. “Cambridge Analytica Whistleblower Christopher Wylie Appears before MPs — Watch Live.” YouTube, 27 Mar. 2018, www.youtube.com/watch?v=X5g6IJm7YJQ&t=4872s.

Nix, Alexander. Presentation at the Concordia Summit. YouTube, Concordia, 27 Sept. 2016, www.youtube.com/watch?v=n8Dd5aVXLCc.

Smith, Dave. “Weapons of Micro Destruction: How Our ‘Likes’ Hijacked Democracy.” Towards Data Science, 17 Oct. 2018, towardsdatascience.com/weapons-of-micro-destruction-how-our-likes-hijacked-democracy-c9ab6fcd3d02.

Sumpter, David J. T. Outnumbered: From Facebook and Google to Fake News and Filter-Bubbles — the Algorithms That Control Our Lives. Bloomsbury Sigma, 2018. Chapter 5.

The Daily Show with Trevor Noah. “Electronic Brainwashing: Cambridge Analytica’s Sinister Facebook Strategy | The Daily Show.” YouTube, 21 Mar. 2018, www.youtube.com/watch?v=t7epj5tK54M.

Youyou, Wu, et al. “Computer-Based Personality Judgments Are More Accurate than Those Made by Humans.” PNAS, National Academy of Sciences, 27 Jan. 2015, www.pnas.org/content/112/4/1036?sid=fefde0d1-d260-40e6-84d3-a1992208031a.

RT UK. “LIVE: Cambridge Analytica Researcher Aleksandr Kogan Testifies to MPs.” YouTube, 24 Apr. 2018, www.youtube.com/watch?v=hpzc26bzp1M&t=5106s.
