Targeted | The Cambridge Analytica Whistleblower’s Inside Story | Brittany Kaiser | Book Summary

Targeted: My Inside Story of Cambridge Analytica and How Trump, Brexit and Facebook Broke Democracy by Brittany Kaiser

In this explosive memoir, a political consultant and technology whistleblower reveals the disturbing truth about the multi-billion-dollar data industry, showing how companies get richer by using our personal information and exposing how Cambridge Analytica exploited weaknesses in privacy laws to help elect Donald Trump.

When Brittany Kaiser joined Cambridge Analytica – the UK-based political consulting firm funded by conservative billionaire and Donald Trump patron Robert Mercer – she was an idealistic young professional working on her fourth degree in human rights law and international relations. A veteran of Barack Obama’s 2008 campaign, Kaiser hoped to use data for humanitarian purposes, most notably to prevent genocide and human rights abuses. But her experience inside Cambridge Analytica opened her eyes to the tremendous risks that this unregulated industry poses to privacy and democracy.

Targeted is Kaiser’s eyewitness chronicle of the dramatic and disturbing story of the rise and fall of Cambridge Analytica. She reveals to the public how Facebook’s lax policies and lack of sufficient national laws allowed voters to be manipulated in both Britain and the United States, where personal data was weaponised to spread fake news and racist messaging during the Brexit vote and the 2016 election.

In the aftermath of the U.S. election, as she became aware of the horrifying reality of what Cambridge Analytica had done in support of Donald Trump, Kaiser made the difficult choice to expose the truth. Risking her career, relationships, and personal safety, she told authorities about the data industry’s unethical business practices, eventually testifying before Parliament.

Packed with never-before-publicly-told stories, Targeted goes inside the secretive meetings with Trump campaign personnel and details the promises Cambridge Analytica made to win. Throughout, Kaiser makes the case for regulation, arguing that legal oversight of the data industry is not only justifiable but essential to ensuring the long-term safety of our democracy.

 

 

1) A Late Lunch, Early 2014

His name was Alexander Nix and he was the CEO of a British-based elections company. The company, Chester went on, was called the SCL Group, short for Strategic Communications Laboratories. A glorified advertising firm.

SCL was a wildly successful company. Over a span of twenty-five years, it had procured defense contracts worldwide and run elections in countries across the globe. Its basic function, he said, was to put presidents and prime ministers into power and, in many cases, ensure that they stayed there.

It had never occurred to me that there existed entire companies dedicated to the goal of getting people elected to political office abroad.

Nix was thrilled with himself.

The SCL Group, he said, was so busy and so hopeful for the future that it had had to spin off an entirely new company just to manage the work in the United States alone.

That new company was called Cambridge Analytica.

It had been in business for just under a year, but the world had best pay attention to it.

Cambridge Analytica, Nix said, was about to cause a revolution.

The revolution Nix had in mind had to do with Big Data and analytics.

In the digital age, data was “the new oil.” Data collection was an “arms race,” he said. Cambridge Analytica had amassed an arsenal of data on the American public of unprecedented size and scope, the largest, as far as he knew, anyone had ever assembled. The company’s monster databases held between two thousand and five thousand individual data points (pieces of personal information) on every individual in the United States over the age of eighteen. That amounted to some 240 million people.

But merely having Big Data wasn’t the solution, he said. Knowing what to do with it was the key. That involved more scientific and precise ways of putting people into categories: “Democrat,” “environmentalist,” “optimist,” “activist,” and the like. And for years, the SCL Group, Cambridge Analytica’s parent company, had been identifying and sorting people using the most sophisticated method in behavioral psychology, which gave it the capability of turning what was otherwise just a mountain of information about the American populace into a gold mine.

Nix told us about his in-house army of data scientists and psychologists who had learned precisely how to know whom they wanted to message, what messaging to send them, and exactly where to reach them. He had hired the most brilliant data scientists in the world, people who could laser in on individuals wherever they were to be found (on their cell phones, computers, tablets, on television) and through any kind of medium you could imagine (from audio to social media), using “microtargeting.” Cambridge Analytica could isolate individuals and literally cause them to think, vote, and act differently from how they had before. It spent its clients’ money on communications that really worked, with measurable results, Nix said.

That, he said, was how Cambridge Analytica was going to win elections in America.

Cambridge Analytica was filling an important niche in the market. It had been formed to meet pent-up, unmet demand. The Obama Democrats had dominated the digital communications space since 2007. The Republicans lagged sorely behind in technology innovation. After their crushing defeat in 2012, Cambridge Analytica had come along to level the playing field in a representative democracy by giving the Republicans the technology they lacked.

Using people’s personal information to influence them and, hence, to change economic and political systems around the world.

Sway voters to make irreversible decisions not against their will but, at the very least, against their usual judgment, and to change their habitual behavior.

I had always felt that it was imperative for academics to find ways to spin the threads of the high-minded ideas they came up with in the ivory tower into cloth that was of real use to others.

2) Crossing Over, October-December 2014

“Our children won’t live in a world with ‘blanket advertising,’” Alexander said, referring to messaging intended for a broad audience and sent out in one giant, homogeneous blast. “Blanket advertising is just too imprecise.”

“Traditional Advertising Builds Brands and Provides Social Proof but Doesn’t Change Behavior.”

“The SCL Group offers messaging built for a twenty-first-century world,”

If a client wanted to reach new customers, “What you have to do,” he explained, was not just reach them but “convert” them.

“The Holy Grail of communications,” he said, “is when you can actually start to change behavior.”

The next slide read, “Behavioral Communications.” On the left was an image of a beach with a square, white sign that read, “Public Beach Ends Here.” On the right was a bright yellow, triangular placard resembling a railroad crossing sign. It read, “Warning Shark Sighted.” Which one was more effective? The difference was almost comical. “Using your knowledge of people’s fear of being eaten by a shark, you know that the second would stop people from swimming in your piece of sea.”

SCL wasn’t an ad agency. It was a “behavior change agency,”

In elections, campaigns lost billions of dollars using messages like the Private Beach sign, messages that didn’t really work.

The flatness of contemporary advertising.

And until this moment, I had seen the Obama New Media campaign of 2008, for which I’d been a dedicated intern, as so sophisticated and savvy.

That campaign had been the first to use social media to communicate with voters. We’d promoted Senator Obama on Myspace, YouTube, Pinterest, and Flickr. I’d even created the then-senator’s first Facebook page, and I’d always treasured the memory of the day Obama came into the Chicago office, pointed at his profile photo on my computer screen, and exclaimed, “Hey, that’s me!”

Now I saw that, however cutting-edge we’d been at the time, in Alexander’s terms, we had been information-heavy, repetitive, and negligible. We hadn’t converted anyone, really. Most of our audience consisted of self-identified Obama supporters. They’d sent us their contact information or we gathered it from them with their permission once they posted messages on our sites. We hadn’t reached them; they had reached us.

Our ads had been based on “social proof,” Alexander explained; they had merely reinforced preexisting “brand” loyalty. We had posted Obama content endlessly on social media, content just like the Private Beach sign.

They didn’t cause “behavioral change” but were “information-heavy” and provided mere “social proof” that our audience loved Barack Obama. And once we had Obama lovers’ attention, we sent them even more information-heavy and detailed messaging. Our intention might have been to keep them interested or to make sure they voted, but according to Alexander’s paradigm, we had merely flooded them with data they didn’t need.

Alexander pulled up another slide, one with charts and graphs showing how his company did much more than create effective messaging. It sent that messaging to the right people based on scientific methods. Before campaigns even started, SCL conducted research and employed data scientists who analyzed data and precisely identified the client’s target audiences. The emphasis here, of course, was on the heterogeneity of the audience.

But whereas 1960s communication was all “top down,” 2014 advertising was “bottom up.”

With all the advances in data science and predictive analytics, we could know so much more about people than we ever imagined, and Alexander’s company looked at people to determine what they needed to hear in order to be influenced in the direction you, the client, wanted them to go.

Cambridge Analytica had grown out of the SCL Group, which itself had evolved from something called the Behavioural Dynamics Institute, or BDI, a consortium of some sixty academic institutions and hundreds of psychologists. Cambridge Analytica now employed in-house psychologists who, instead of pollsters, designed political surveys and used the results to segment people. They used “psychographics” to understand people’s complex personalities and devise ways to trigger their behavior.

Then, through “data modeling,” the team’s data gurus created algorithms that could accurately predict those people’s behavior when they received certain messages that had been carefully crafted precisely for them.

Alexander said that in 1994, the work SCL did with Mandela and the African National Congress had stopped election violence at the polls. That had affected the outcome of one of the most important elections in the history of South Africa.

It had started out running elections in South Africa, and now it ran nine or ten elections each year in places such as Kenya, Saint Kitts, Saint Lucia, and Trinidad and Tobago.

In 1998, SCL had expanded into the corporate and commercial world, and after September 11, 2001, it had begun to work in defense, with the U.S. Department of Homeland Security, NATO, the CIA, the FBI, and the State Department. The company had also sent experts to the Pentagon to train others in its techniques.

SCL had a social division as well. It provided public health communications; in case studies, he explained how it had persuaded people in African nations to use condoms and people in India to drink clean water. It had had contracts with UN agencies and with ministries of health worldwide.

The company used “psyops” in defense and humanitarian campaigns.

Short for “psychological operations,” which itself was a euphemism for “psychological warfare,” psyops can be used in war, but its applications for peacekeeping appealed to me. Influencing “hostile” audiences can sound terrifying, but psyops, for example, can be used to help shift young men in Islamic nations away from joining Al-Qaeda or to de-escalate conflict between tribal factions on Election Day.

3) Power in Nigeria, December 2014

The SCL and Cambridge Analytica staff were energized by Alexander’s vision. The opportunity open to them was the equivalent of that at Facebook in the early days, and it hadn’t taken Facebook too many years to go public to the tune of an $18 billion valuation. Alexander wanted a similar outcome, and as Millennials, the staff looked to Mark Zuckerberg’s baby as a model of remarkable innovation in spaces no one had even thought to occupy until the company came along.

Cambridge Analytica was based on the same idealistic notion of “connectivity” and “engagement” that had fueled Facebook.

Alexander said that data was an incredible “natural resource.” It was the “new oil,” available in vast quantities, and Cambridge Analytica was on track to become the largest and most influential data and analytics firm in the world.

SCL was just a part of what made it different from all other election companies in the world. It was not an advertising firm, Alexander said, but a psychologically astute and scientifically precise communications company.

The solution, Alexander said, isn’t in the advert. “The solution is in the audience.”

A brilliant concept: to get people to act, you created the conditions under which they would be more likely to do what you wanted them to do.

5) Terms and Conditions, February – July 2015

Perhaps the first and most important thing that made CA different from any other communications firm was the size of our database. The database, Tayler explained, was prodigious, unprecedented in depth and breadth, and growing ever bigger by the day. We had come by it by buying and licensing all the personal information held on every American citizen. We bought that data from every vendor we could afford to pay.

We bought data about Americans’ finances, where they bought things, how much they paid for them, where they went on vacation, what they read.

We matched this data to their political information (their voting habits, which were accessible publicly) and then matched all that again to their Facebook data (what topics they had “liked”). From Facebook alone, we had some 570 individual data points on users, and so, combining all this gave us some 5,000 data points on every single American over the age of eighteen – some 240 million people.

The special edge of the database, though, Tayler said, was our access to Facebook for messaging. We used the Facebook platform to reach the same people on whom we had compiled so much data.

CA had developed, they explained, a personality quiz called “the Sex Compass” - a funny name, I thought. It was ostensibly aimed at determining a person’s “sexual personality” by asking probing questions about sexual preferences, such as favorite position in bed. The survey wasn’t just a joyride for the user. It was, I came to understand, a means to harvest data points from the answers people gave about themselves, and a new, masked way for SCL to gather the users’ data and that of all their “friends,” while topping it up with useful data points on personality and behavior.

The same was true for another survey that had crossed my desk. It was called “the Musical Walrus.” A tiny cartoon walrus asked a user a series of seemingly benign questions in order to determine that person’s “true musical identity.” It, too, was gathering data points and personality information.

And then there were other online activities that, as Tayler explained, were a means to get at both the 570 data points Facebook already possessed about users and the 570 data points possessed about each of the user’s Facebook friends. When people signed on to play games such as Candy Crush on Facebook and clicked “yes” to the terms of service for that third-party app, they were opting in to give their data, and the data of all their friends, for free, to the app developers and then, inadvertently, to everyone with whom that app developer had decided to share the information. Facebook allowed this access through what has become known as the “Friends API,” a now-notorious data portal that contravened data laws everywhere, as under no legislative framework in the United States or elsewhere is it legal for anyone to consent on behalf of other able-minded adults. As one can imagine, the use of the Friends API became prolific, amounting to a great payday for Facebook, and it allowed more than forty thousand developers, including Cambridge Analytica, to take advantage of this loophole and harvest data on unsuspecting Facebook users.

Cambridge was always collecting and refreshing its data, staying completely up to date on what people cared about at any given time. It supplemented data sets by purchasing more and more every day on the American public, data that Americans gave away every time they clicked on “yes” and accepted electronic “cookies” or clicked “agree” to “terms of service” on any site, not just Facebook or third-party apps.

Cambridge Analytica bought this fresh data from companies such as Experian, which has followed people throughout their digital lives, through every move and every purchase, collecting as much as possible in order, ostensibly, to provide credit scores but also to make a profit in selling that information.

Users need only opt in, a process by which they agree to the data collection, usually through extensive terms and conditions meant to put them off reading them; with an attractively easy, small tick box, collecting data is an even simpler process for these companies.

Users are forced to click it anyhow, or they cannot proceed with using whichever game, platform, or service they are trying to activate.

If you bought this book online, your search data, transaction history, and the time spent browsing each Web page during your purchase were recorded by the platforms you used and the tracking cookies you allowed to drop on your computer, installing a tracking device to collect your online data.

Speaking of cookies, have you ever wondered what Web pages are asking when they request that you “accept cookies”? It’s supposed to be a socially acceptable version of spyware, and you consent to it on a daily basis. It comes to you wrapped in a friendly-sounding word, but it is an elaborate ruse used on unsuspecting citizens and consumers.

Cookies literally track everything you do on your computer or phone.

Think of those ads about education that just happen to play on the radio at the precise moment you’re dropping your kids off at school. You’re not paranoid. It’s all orchestrated.

And what’s also important to understand is that when companies buy your data, the cost to them pales in comparison to how much the data is worth when they sell advertisers access to you. Your data allows anyone, anywhere, to purchase digital advertising that targets you for whatever purpose – commercial, political, honest, nefarious, or benign- on the right platform, with the right message, at the right time.

But how could you resist? You do everything electronically because it’s convenient. Meanwhile, the cost of your convenience is vast: you are giving one of your most precious assets away for free while others profit from it. Others make trillions of dollars out of what you’re not even aware you are giving away each moment. Your data is incredibly valuable, and CA knew that better than you or most of our clients.

The term psychographics was created to describe the process by which we took in-house personality scoring and applied it to our massive database. Using analytic tools to understand individuals’ complex personalities, the psychologists then determined what motivated those individuals to act. Then the creative team tailored specific messages to those personality types in a process called “behavioral microtargeting.”

With behavioral microtargeting, a term Cambridge trademarked, they could zoom in on individuals who shared common personality traits and concerns and message them again and again, fine-tuning and tweaking those messages until we got precisely the results we wanted. In the case of elections, we wanted people to donate money; learn about our candidate and the issues involved in the race; actually get out to the polling booths; and vote for our candidate. Likewise, and most disturbing, some campaigns also aimed to “deter” some people from going to the polls at all.

Cambridge took the Facebook user data it had gathered from entertaining personality surveys such as the Sex Compass and the Musical Walrus, created through third-party app developers, and matched it with data from outside vendors such as Experian. We then gave millions of individuals “OCEAN” scores, determined from the thousands of data points about them.

OCEAN scoring grew out of academic behavioral and social psychology. Cambridge used OCEAN scoring to determine the construction of people’s personalities. By testing personalities and matching data points, CA found it was possible to determine the degree to which an individual was “open” (O), “conscientious” (C), “extroverted” (E), “agreeable” (A), or “neurotic” (N).

CA could segment all the people whose info they had into even more sophisticated and nuanced groups than any other communications firm.

OCEAN scoring was nuanced and complex, allowing Cambridge to understand people on a continuum in each category. Some people were predominantly “open” and “agreeable.” Others were “neurotic” and “extroverted.” Still others were “conscientious” and “open.” There were thirty-two main groupings in all.
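
The thirty-two groupings appear to follow from simple arithmetic: five OCEAN traits, each treated as “high” or “low,” give 2^5 = 32 combinations. The Python sketch below is only an illustration of that binning; the 0-to-1 score range, the threshold, and the segment labels are assumptions made for the example, not Cambridge Analytica’s actual scoring method.

```python
# Illustrative only: deriving 2**5 = 32 coarse OCEAN segments by treating each
# trait score as high or low. Score range, threshold, and labels are hypothetical.
from typing import Dict

TRAITS = ["openness", "conscientiousness", "extroversion", "agreeableness", "neuroticism"]

def ocean_segment(scores: Dict[str, float], threshold: float = 0.5) -> str:
    """Map five trait scores in [0, 1] to one of the 32 high/low segments."""
    flags = []
    for trait in TRAITS:
        level = "high" if scores[trait] >= threshold else "low"
        flags.append(f"{level}-{trait[0].upper()}")
    return " / ".join(flags)

# Example: a profile that would land in an "open and agreeable" grouping.
profile = {"openness": 0.8, "conscientiousness": 0.4, "extroversion": 0.3,
           "agreeableness": 0.7, "neuroticism": 0.2}
print(ocean_segment(profile))  # high-O / low-C / low-E / high-A / low-N
```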

Third, CA then took what they had learned from these algorithms and turned around and used platforms such as Twitter, Facebook, Pandora (music streaming), and YouTube to find out where the people they wished to target spent the most interactive time. Where was the best place to reach each person?

By purchasing lists of key words from Google, CA was able to reach users when they typed those words into their browsers or search engines. Each time they did, they would be met with materials (ads, articles, etc.) that CA had designed especially for them.

At the fourth step in the process came another ingredient in the “cake recipe,” the one that put CA head and shoulders above the competition, above just about every political consulting firm in the world: CA found ways to reach targeted audiences, and to test the effectiveness of that reach, through client-facing tools such as the one it designed especially for its own use. Called Ripon, this canvassing software program for door-to-door campaigners and phone bankers allowed its users direct access to your data as they approached your house or called you on the phone. Data-visualization tools also helped them determine their strategy before you’d even opened your door or picked up your phone.

Then campaigns would be designed based on content our in-house team had composed, and the final, fifth step, the microtargeting strategy, allowed everything from video to audio to print ads to reach the identified targets. Using an automated system that refined that content again and again, we were able to understand what made individual users finally engage with that content in a meaningful way.

What CA did was evidence-based. CA could provide clients with a clear picture of what they had done, whom they’d reached, and, by scientifically surveying a representative sample, what percentage of the people they had targeted were taking action as a result of the targeted messaging.

It was revolutionary.

As successful as this five-step approach had been, I learned in 2015 that it was about to change, when Facebook announced that as of April 30, it would, after so many years of openness, be closing its user data to “third-party app” developers, companies like CA.

“What Cambridge Analytica offers is the right message for the right target audience from the right source on the right channel at the right time. And that’s how you win.”

6) Meetings and Reunions, June 2015

The new developments in Google Analytics. Everything was becoming data-driven, and Google Analytics was now being used to collect and analyze the data from visitors to almost half the world’s top-performing websites. By placing tracking cookies on the devices of people worldwide, Google Analytics was amassing a behavioral data set on a huge number of people across the globe-allowing Google to provide clients with everything in the form of data visualization and tracking measurements of a given website’s effectiveness. Clients could see click-through rates, what people were downloading, what they were reading and watching, and how much time they spent doing those things. They could literally see the mechanisms that got people’s attention and engaged them the longest.

That was what Alexander was after when he created Cambridge Analytica: to bring the power of behavioral predictability to the elections business.

The place to launch such a business was the United States, because the United States lacked fundamental regulations on data privacy: individuals were automatically opted in to having their data collected, with no additional consent required beyond simply being in the country, and the buying and selling went on unabated, almost entirely without government oversight. Data was everywhere in the United States; the same is true today.

Sophie Schmidt’s father, Eric, had funded Civis in 2013, the same year that Sophie interned at CA. Civis’s mission was to “democratize data science so organizations can stop guessing and make decisions based on numbers and scientific fact.” Interestingly enough, one of the pillars of its mission statement was also “No a**holes.”

After the September 11 attacks, civil liberties in America eroded. The nation became a surveillance state. On October 26, 2001, the Patriot Act passed without much protest, giving the government the right to collect data on citizens entirely without their consent. (Ironically, of course, it would be the extension of the government’s powers in 2001 that would lead to the Big Data free-for-all at the tail end of the same decade.)

8) Facebook, December 2015 – February 2016

We advertised as much on our pitch materials; our brochures and PowerPoint presentations openly declared that we had data on some 240 million Americans, which included Facebook data that averaged 570 data points per person for more than 30 million people.

Beginning in 2010, the infamous Friends API had allowed companies such as SCL to install their own apps on Facebook to harvest data from the site’s users and all their friends.

When Facebook users decided to use an app on Facebook, they clicked a box displaying the app’s “terms of service.” Hardly any of them bothered to read that they were agreeing to provide access to 570 data points on themselves and 570 data points of each of their friends. There was nothing illegal about the transaction for the individual who consented: the terms of agreement were spelled out in black and white for the few who cared to attempt to read the “legalese.”

Still, in a rush to get to the quiz or game the app was providing, users skipped over reading the document and gave their data. The problem lay in the fact that they were also giving away their friends’ data, friends who had not legally consented.

Kogan’s 2013 data gathering had first taken place on an Amazon Marketplace platform called “Mechanical Turk.” He had paid users a dollar apiece to take the personality quiz This Is Your Digital Life. When users completed the quiz on Facebook, the app connected to the Friends API to take their data and that of their entire list of friends.

Mass communication needed to be individualized to be effective, and our data collection, however rudimentary, was profoundly important.

The innovation that Facebook offered users at the time to help them reach people was the “like” button. When other users “liked” your page, they would then “see” your posts on their page. There were no paid ads on Facebook back then. At the time, I remember a great deal of discussion in the public sphere about whether Facebook could be a sustainable business model, given that no one had figured out yet how to monetize it. The reality was that the “like” button gave users the ability to gather basic information on their followers, but it gave Facebook even more: hundreds of thousands of new data points on each user’s “likes,” information that Facebook could compile and eventually turn into dollars.

One of Facebook’s most lucrative moves was the development of its Friends API. For a fee, developers could build their own app on the Facebook platform, and that app would give them access, as Kogan’s did, to users’ private data.

Yet no matter how creative the terms of service were in concealing the fine print, individual users should not have been able to consent on behalf of their friends to share their data, and Facebook therefore was not legally allowed to give developers access to this wider network of data.

Closure of the Friends API meant only one thing: no one would be able to further monetize Facebook data except for Facebook itself. No longer able to access the API, developers now had to use Facebook ad tools to reach users on the platform. No Facebook data could be used for external modeling anymore-or so most of the world thought.

Facebook had become the world’s best advertising platform. If it was unsafe or if users’ privacy was being violated, the finger was pointing in another direction.

9) Persuasion, September 2015 – February 2016

10) Under the Influence, Summer 2016

Because I had assisted them in writing the proposal, I knew they were planning on building a database with modeled data for everyone in America and then dividing the campaign strategy into three overlapping programs. The first part of the campaign would focus on building lists and soliciting donations from them, especially as the Trump team had not yet started any fund-raising campaign. Funds were key to our starting immediately and to scaling up the national campaign.

No matter what Donald said on TV, he wasn’t funding it himself.

The second program, to be started a month afterward, would focus on persuasion, also known as finding the swing voters and convincing them to like Donald somehow.

And the third program would be focused on getting out the vote, and would involve everything from voter registration to getting likely Trump supporters to the polls for early voting or on Election Day.

Through Cambridge alone, the Trump campaign spent $100 million on digital advertising, most of it on Facebook.

With that kind of spend came a higher level of service-not just from Facebook but from the other social media platforms. This white-glove service was something the social media companies would frequently pitch to us, showcasing new tools and services that could help campaigns in real time.

But the social media giants offered not only new tech but also manpower.

Seated beside Molly, Matt, and our data scientists were embedded employees from Facebook, Google, and Twitter, among other tech companies. Facebook called its work with the Trump campaign “customer service plus.” Google said it served the campaign in an “advisory capacity.” Twitter called it “free labor.” While the Trump team welcomed this help with open arms, the Clinton campaign for some reason decided not to accept such help from Facebook, which must have given Trump a distinct advantage that cannot be easily quantified.

As I would later learn, the Facebook embeds showed campaign personnel and Cambridge staff how to aggregate look-alikes, create custom audiences, and implement so-called dark ads, content that only certain people could see on their feeds. While the Clinton campaign may have had some of these skills internally, the hands-on assistance the Trump campaign had was invaluable on a day-to-day basis, enabling it to take the utmost advantage of new tools and features the moment they came out.

On Google, the Trump campaign team had increased its ad spend for search terms, persuasion search advertising, and the controlling of first impressions. Google keyword purchases had worked like crazy, too. If a user searched for “Trump,” “Iraq,” and “War,” the top result was “Hillary Voted for the Iraq War – Donald Trump Opposed It,” with a link to a super PAC website with the banner “Crooked Hillary voted for the war on Iraq. Bad Judgment!” If a user entered the terms “Hillary” and “Trade,” the top result was “lyingcrookedhillary.com.” The click-through rate for this was incredibly high.

Google sold inventory to Trump each day, notifying the campaign when new, exclusive ad space was available for the taking, such as the homepage of YouTube.com, the most coveted digital real estate online. Google topped that off by making it easy for the campaign to bid for use of search terms in order to control users’ “first impressions.” Google sold this to Trump for Election Day, November 8, and it pulled in new supporters in droves, and directed them to their local polling places.

11) Brexit Brittany, Spring – Summer 2016

Arron Banks and Leave.EU paid millions of pounds to run their own online campaign, which Banks claimed was all driven by data science, name-dropping Cambridge whenever it was convenient for him. He boasted that Leave.EU was the biggest viral political campaign in the United Kingdom, with 3.7 million engagements in one week on Facebook. “The campaign,” he said, “must be doing something right to annoy all the right people consistently.”

Just days before the vote, Leave.EU published an “undercover investigation” on Facebook, purporting to show how easy it was to smuggle migrants across the Channel.

Andy Wigmore also posted content, which Leave.EU reposted: a series of photographs purportedly showing a woman being violently attacked by a man wearing a hooded jacket. “Migrant[s] beating up girl in Tottenham [on] Saturday,” Wigmore wrote.

Mass protests and violent arguments followed hard upon such messaging-as did a murder.

On the day of the referendum, after the polls closed, we huddled around the only TV in the house, monitoring the results of the once-in-a-lifetime vote that could change the course of European history.

In the end, the vote hinged on 1 percent of the British electorate. The result was 52-48 percent in favor of leaving. The effect of the vote was immediate. The value of British currency tumbled, with the pound hitting a thirty-one-year low, and global markets, including the Dow Jones, took an enormous hit.

Nigel Farage once said that Brexit was the “petri dish” for the Trump campaign – it was tribal, populist, and enough to tear a nation apart. It was also, in so many ways, the technological precursor to the 2016 U.S. presidential campaign – and just across the pond on the day of the Brexit vote, the Cambridge Analytica machine was up and running.

The strategy used focus groups, psychographic modeling, and predictive algorithms, and it harvested private user data through online quizzes and contests, using a perfectly legal opt-in. For Brexit, the campaign had matched user data to British voter logs and then injected itself into the bloodstream of the internet, using targeted messaging to incite a nation.

The ten-week showdown brought the biliousness of what was happening online into the real world. Vote Leave messaging delivered misinformation and fake news about countries such as Turkey, which was negotiating its accession to the European Union. They riled up swing voters by suggesting that a vote to remain was a vote to impoverish Britain’s sacred National Health Service.

Such messaging was terribly flawed and even criminal: from fear-mongering about funding for government services to the imagery of immigrants and terrorists storming the border, the Leave campaign was one of fear.

13) Postmortem, November – December 2016

In the aftermath of Trump’s victory, we were finally able to start advertising the role we’d had in the campaign – and now everyone wanted us to do the same for them.

The president of Ghana, whom Alexander had been chasing for years, wanted us to work for him in the upcoming election there. CEOs of major U.S. and foreign corporations wanted us for commercial campaigns – Unilever, MGM, Mercedes. Campaign managers and politicians on nearly every continent wanted us.

The world had ended, but Cambridge Analytica’s life had really just begun.

Our clients wanted to know how we had done it, but before we could tell them, we had to learn the details ourselves. Those of us on the outside had no real sense of the specifics. And those details would become our ammo as we went out and did commercial sales.

570 data points for thirty million individuals

The CA team had used telephone polling and internet polling tools such as SurveyMonkey.

They’d then segmented people into two large groups, one on the Trump side and one on the Clinton side, and then segmented those groups. The first Trump group comprised “Core Trump Voters,” those you’d turn into volunteers and get to donate and attend rallies. “Get Out the Vote” targets were those who intended to vote but might forget to; CA targeted them on issues that were most important to them that they were already excited about, so they’d be certain to head to the polls. And CA spent money on the “Disengaged Trump Supporters” only if they had cash left over.

On the Clinton side, you had the “Core Clinton Voters.” Then you had the “Deterrence” group: these were Clinton voters who would possibly not go out to the polls if you persuaded them not to.

Throughout my time in the world of human rights, I had witnessed governments and powerful individuals using the suppression of movements, free thought, and voters as a strategy to retain power, sometimes at the cost of violence. This is why in the United States, voter suppression tactics are illegal. I wondered how the Trump campaign had drawn the line between negative campaigning and voter suppression. Usually, there was a clear difference, but in the digital age it was hard to track and trace what had been done. Governments no longer needed to send the police or military out into the streets to stop protests. They could instead change people’s minds simply by paying to target them on the very screens in their hands.

They were said to have had a “symbiotic relationship” with Silicon Valley, with the nation’s other key tech companies, and with data brokers.

There were sometimes hundreds or thousands of versions of the same basic ad concept, creating an individual journey and an altered reality for each person. Over half of the Trump campaign’s expenses had gone toward digital operations, and every message had been highly targeted so that most of the population didn’t see what their neighbors saw. The CA team ran more than 5,000 individual ad campaigns with 10,000 creative iterations of each ad.

They’d been enormously successful. Overall, the campaign had led to an average and measurable 3 percent increase in Trump’s favorability. Considering the narrow margin by which he had won in certain states, this increase had been a significant help in the general election. In the get-out-the-vote campaign, the CA team drove a 2 percent increase in voters’ submission of absentee ballots. This was a huge win because a lot of voters who request absentee ballots typically never even fill them out and mail them back.

The strength of Cambridge Analytica as a company wasn’t only its amazing database; it was its data scientists and its ability to build great new models.

With Siphon’s dashboard, the campaign could see the return on its investment in real time: cost per email; costs broken down by traffic type; cost per impression per ad; click-throughs. They could also shift an ad to a different delivery system. And if an ad wasn’t returning enough on investment, the team could pull it and run it elsewhere or swap in another ad entirely. Someone monitored the dashboard twenty-four hours a day, seven days a week.

In the aftermath of the release of the infamous Access Hollywood tape, in which Trump was recorded in 2005 giving full expression to his misogyny and entitlement, boasting about grabbing women and forcing himself upon them against their will, Cambridge Analytica’s data scientists ran a model on a test group of persuadable voters in key swing states. Nicknamed the “pussy model,” it was designed to determine the public’s response to the tape. The results were shocking. Among “persuadables,” the tape actually produced a favorable response- an increase in favorability for Donald Trump – among mostly men but also some women.

The popular narrative about Donald Trump and, by extension, his campaign had been that he rejected data. While that may have been true for him personally – I have no idea, but I heard that he doesn’t even use a computer – the company presentation underscored just how essential data, and data-backed decision-making, had been to his campaign. Whatever he did or didn’t believe about the role of data, the people around him clearly understood not just its importance but how to deploy it. Data, metrics, measurements, carefully crafted messaging – all these and more had been deployed to great effect and efficiency during Cambridge’s months of working on his behalf.

The Trump campaign may have been behind the times when Cambridge arrived, but by Election Day, it had become not just an effective political machine, but a winning one. Cambridge used all the tech at its disposal, along with the new innovations being sold to it by the social media embeds, to wage a social media battle against Hillary Clinton that was unprecedented in scale.

But the battle hadn’t been against only Hillary – it had been against the American people. Voter suppression and fear-mongering.

14) Bombs, January – June 2017

Kosinski had used the Facebook app My Personality (developed, he said, by a colleague named David Stillwell) to build the first precise models of millions of Facebook users. By 2012, he claimed to have proved that these models could predict quite specific information about people based on only sixty-eight Facebook “likes” an individual user might have garnered.

According to the article, he could use those few “likes” to predict skin color, sexual orientation, political party affiliation, drug and alcohol use, and even whether a person had come from an intact or a divorced household. “Seventy ‘likes’ were enough to outdo what a person’s friends knew [about them]; 150 ‘likes’ [and he] ‘knew’ [about users] what their parents knew; 300 ‘likes’ what their partner knew. More ‘likes’ could even surpass what a person thought they knew about themselves.”
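
To make the mechanics concrete, here is a minimal Python sketch of the kind of “likes”-based prediction described above: a plain logistic regression over a binary user-by-page likes matrix. The data is synthetic and the model choice is an assumption made for illustration; this is not the actual pipeline or data behind Kosinski’s research.

```python
# Sketch: predicting a binary trait from Facebook-style "likes" with a linear model.
# All data here is randomly generated; the "signal pages" are a made-up stand-in.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 2_000, 500

likes = rng.binomial(1, 0.05, size=(n_users, n_pages))            # 1 = user liked the page
signal = likes[:, :20].sum(axis=1) + rng.normal(0, 1, n_users)    # synthetic trait driver
trait_high = (signal > np.median(signal)).astype(int)             # e.g. "high openness" yes/no

X_train, X_test, y_train, y_test = train_test_split(likes, trait_high, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```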

The world ought to be “paying closer attention to who owns companies collecting data on American voters.”

15) Quake, July – September 2017

In early 2017, I made strong efforts to be introduced into the “blockchain technology” crowd, a group of wide-eyed and optimistic technologists, cryptographers, libertarians, and anarchists who saw data security, ownership of one’s own assets and information, and even the management of one’s own currencies outside a bank as of ultimate importance. It was an exciting time in that industry, which involved an emerging and disruptive technology that, in addition to many other uses, enabled people to take control over their own data with ethical technology built on transparency, consent, and trust.

A “blockchain” is a public database or ledger, decentralized across hundreds or thousands of computers around the world that validate transactions and record them, so that no one central authority can edit or delete any data. Among other things, users can store and encrypt data safely and track its transfer transparently. Every transaction is recorded publicly, and once enough transactions are gathered, they are put into one “block” of data that is “chained” to every other block of data since the platform’s inception. In order to edit a transaction, someone would need to rewrite every block added after it, which has never been done.
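
As a toy illustration of the hash-chaining idea just described, here is a short Python sketch of a hash-linked ledger: each block stores the hash of the previous block, so tampering with any earlier record breaks every later link. The block fields, the SHA-256 choice, and the validation routine are illustrative assumptions, not the design of any real blockchain (real systems add consensus, signatures, and Merkle trees).

```python
# Toy hash-chained ledger: editing an old block invalidates every later link.
import hashlib
import json
from dataclasses import dataclass
from typing import List

@dataclass
class Block:
    index: int
    transactions: List[str]
    prev_hash: str

    def hash(self) -> str:
        payload = json.dumps({"index": self.index, "transactions": self.transactions,
                              "prev_hash": self.prev_hash}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(batches: List[List[str]]) -> List[Block]:
    chain = [Block(0, ["genesis"], "0" * 64)]            # hypothetical genesis block
    for txs in batches:
        chain.append(Block(len(chain), txs, chain[-1].hash()))
    return chain

def is_valid(chain: List[Block]) -> bool:
    return all(chain[i].prev_hash == chain[i - 1].hash() for i in range(1, len(chain)))

chain = build_chain([["alice->bob: 5"], ["bob->carol: 2"]])
print(is_valid(chain))                          # True
chain[1].transactions[0] = "alice->bob: 500"    # tamper with history
print(is_valid(chain))                          # False: later links no longer match
```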

What made blockchain so revolutionary was that it was a completely new “electronic cash system that was fully peer-to-peer, with no trusted third party,” so, at the time, it was an ideal way to provide value without being tracked by governments.

From SCL’s high-tech op center in Jakarta, the company, he claimed, had spent eighteen months orchestrating massive student rallies that otherwise wouldn’t have occurred, inciting demonstrations that spilled out into the streets and led to the resignation of the long-ruling dictator, Suharto.

SCL had anticipated this as it carried out the second phase of its operation: a propaganda campaign that reassured the nation that life without Suharto was “a positive development.” Finally, in SCL’s third phase, it had run the election campaign for Abdurrahman Wahid, who, Alexander said, in an uncomfortably light and unapologetic aside, turned out to be far more corrupt than Suharto.

17) Inquiry, February – March 2018

Wasn’t OCEAN modeling a less-than-subtle form of understanding people’s personalities in order to manipulate them into doing what you wanted them to do? Did Alexander fancy himself an “all-powerful presence”?

The company claimed it had up to five thousand data points on every adult in the United States, the entire voting population. Did every adult in the United States know that?

It seemed odd that Cambridge worked in the United States and not the United Kingdom. Was it collecting data on the British public as well?

18) Restart, March 16-21, 2018

Kogan, Chris said, had been able to scrape the data of fifty million people from Facebook in a matter of weeks, thanks to the Friends API.

Chris said he knew that both the swiftness and the enormity of Kogan’s extraction had triggered alarms at Facebook, but for some reason, the company had chosen to ignore them – a sign of clear negligence in how Facebook guarded its users’ privacy and data.

The figure of 50 million was nearly twice the number of users CA had advertised holding Facebook data on, and it was what Cambridge had used to model some 240 million Americans. With that single, prodigious harvest, and personality profiling, Cambridge had been able to categorize, through predictive algorithms and the other data it had purchased, every single American over the age of eighteen according to many different models, including OCEAN scoring; that’s how it knew which individual Americans were “open,” “conscientious,” “neurotic,” and so on. And that’s what had made its microtargeting so precise and effective. It was one of the main ingredients in Cambridge’s secret sauce.

More shockingly, Chris also alleged that CA was still in possession of that raw data it had more than likely used during the Trump campaign to target Americans to sway the general election. In short, CA’s illicit Facebook harvest had changed the course of history forever.

Cambridge was a part of an enormous cover-up, the center of both the biggest data breach in the history of modern technology and one of the biggest scandals of our time.

People were calling it Datagate.

What had I been a part of? What had gone on behind my back while I acted as a face of CA?

“Facebook data is for sale all over the world.”

20) The Road to Redemption, March 23, 2018 – Present

The problem was bigger than Cambridge; the problem was Big Data. It was that Facebook, in particular, had enabled companies like Cambridge to harvest the data of billions of people, and how, in turn, those companies had sold that data, promiscuously, to anyone who could pay for it; and how those parties had abused it without anyone ever knowing how or for what purpose. All this has been going on since the beginning of our digital lives, without our knowledge, and without government oversight. Even the few laws governing the use of data were completely unenforceable, with no technology in place to allow for the transparency and traceability that would be required to confirm that individuals or companies were obeying the law.

The problem also lay in how easy it was for Facebook, Twitter, and the like to become the globe’s new town hall, and what happened there: the devolution of civility; the rise of tribalism; and how an online war of words and images escaped the bounds of the internet and altered the moral landscape of the real world.

The problem was that bad agents could poison minds, and that that poison had led to bloodshed. Fake news infiltrated our phone screens and laptops and made us deaf, dumb, and blind to reality and willing to kill each other for causes that were not even real. Hatred poured out of those who were usually peaceful. The dream of a connected world instead tore us all apart. And where was it going to stop?

Citizens of the United States are more vulnerable to having their data weaponized against them than citizens of Britain: in the United States there are far more data points available on each person, and hardly any legal or regulatory constructs to manage data or trace how it is used (or abused) by private and government entities alike. Full transparency or traceability is next to impossible. That needs to change.

It is the use of data that is the problem: the algorithms drive us apart because we get pushed far into our own rabbit hole of beliefs, directly in confrontation with the people we are supposed to be working with.

Sometimes good people get wrapped up in dark things.

But there was now an opportunity for me to use my experience to change what the future had in store for us.

The dangers that lurked behind the rapid growth of a firm with zero competition, with nothing else like it in the marketplace. Both CA and Facebook had been led by privileged white men who thought nothing of exploiting people in the name of advanced communications, never stopping to wonder if their algorithms were intrinsically flawed or if what they were bringing to the world did more harm than good.

As long as Facebook experienced sustained growth in value, they didn’t care.

“What’s happened to Facebook is the saddest case of a company getting blinded by its success. They’ve achieved more than their wildest dreams, and now they’re having a fight over pride.”

As for Facebook, it has had to change some of its policies: we now have disclaimers for fake news and edited video content, and notifications label political advertising and where it has come from. It has been found to have broken data-protection laws in many countries, and it has just received a fine of $5 billion from the Federal Trade Commission – an all-time high – which will hopefully become a government budget for technology that can protect consumers.

As for Brexit, there is still no deal, and a people’s vote is still possible. The Brexiteers have been found guilty of both breaking data-protection laws and violating election-spending regulations.

Now where do we look next? How do we make sense of all of this? Is it possible to have a free and fair election ever again, or even to have self-determination in our daily lives? Let’s look at the key players and where we can expect to see still more of the same, for the sake of group vigilance:

Cambridge Analytica and The SCL Group have been dissolved.

But what does that mean? Many of my former colleagues are still out there, consulting on elections and working in data analytics. This includes Alexander Nix, who, according to press reports, met with former prime minister Theresa May upon her exit and the newly minted prime minister Boris Johnson.

Facebook, while having made a dirty laundry list of cosmetic fixes, has not made any progress on policing fake news, algorithms that prioritize inflammatory and false information, or the ability to really block bad actors from targeting users on the platform.

Big Data, Trump, and Facebook have broken our democracy. It lies in pieces at our feet, with individuals left struggling to piece it back together.

We have a man in Menlo Park who is also in power-grab mode: his latest announcement is Libra, a blockchain payments ecosystem I wish I could support, but cannot. Libra, a consortium of big corporations such as Facebook, Uber, and Visa that want to launch their own financial system, would allow for data abuse so rife that governments around the world have risen up to stop our generation’s most negligent manager of our digital assets from becoming the world’s new digital central bank. Imagine a dystopia where you could be sold products at a different price because the seller knows how much you have in your bank account. It’s already happening, and Libra will hurtle us into a connected world we never hoped for or dreamed of – a nightmare is more like it.

And lastly, there is the endless flow of data, still unregulated, and mostly untraceable. Once it’s out there we cannot get it back.

Now is the time. We must come together to pick up the pieces of our digital lives and build toward protecting our future.

Epilogue – Ending the Data Wars

We will have to want peace, want it enough to pay for it, in our own behavior and in material ways. We will have to want it enough to overcome our lethargy and go out and find all those in other countries who want it as much as we do. – Eleanor Roosevelt

Ultimately, the question of data rights is the pivotal issue of this generation. Data, our intangible digital asset, is the only asset class whose producers have no right to its value and no say in its collection, storage, or trade, or, ultimately, in any profits from its production. Throughout history, we have looked back with disdain at the crusaders’ exploitation of land, water, and oil taken from indigenous owners less powerful than those who forcibly seized their valuable goods, and we have considered it a stain on our past.

I am an eternal optimist, or I wouldn’t be issuing this warning: We must move quickly while the momentum is in our favor. If we choose to sit idly by, then the dystopian realities of 1984 and Black Mirror will become even more real than what we experience today. Tribalism will grow, the line between truth and manipulation will blur, and our rights to our digital identity, the world’s most valuable asset, may never be recoverable. The time to act is now.

So, how do we protect ourselves? How do we protect our democracy? We stand up, speak out, and act. It is the duty of every good citizen not to be silent.

You can start today with the following:

1) Become digitally literate

Visit http://www.dqinstitute.org

Visit http://ownyourdata.foundation

2) Engage with legislators

3) Help companies make the ethical choice

4) Ask regulators to hold abuses of power to account

5) Make ethical choices in your own digital life.

As Albert Einstein said, “I am not only a pacifist but a militant pacifist. I am willing to fight for peace. Nothing will end war unless the people refuse to go to war.” We must fight to fix our democracy before it breaks beyond repair.

Remember: you have agency! It is not only up to big tech and our governments to protect us. We have to stand up for ourselves as well. You do not need that viral Facebook app, or to answer that quiz, or to give away the value of your facial-recognition data to see what you look like when you are older.
