The Big Idea

With the 2020 presidential election season in full swing, Facebook faces a big test. Will the social media giant repeat the mistakes of 2016, when Russian propagandists used the site to target American voters, and Cambridge Analytica, a political firm with ties to the Trump campaign, obtained millions of users’ data without their knowledge? Or can Facebook convince Americans that it will act in good faith when it comes to protecting their privacy and their votes?

The Scenario

In 2016, more Facebook users got their news from social media than from any other source, a testament to the platform's power and influence.

With more than 2.4 billion users globally, Facebook allows people to share their lives with their friends and families. But the platform had a weak spot: Through "friend permissions," third-party app developers could access not only the data of people who'd signed up for an app but also that of their unknowing friends.
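To make that mechanism concrete, here is a minimal sketch, in Python with the requests library, of how the data flow worked under Facebook's long-retired Graph API v1.0. The endpoint shape and permission names reflect that era's API, but the token, field list and overall code are illustrative assumptions, not the actual code used by any app.

```python
import requests

# Hypothetical sketch of the pre-2015 "friend permissions" data flow.
# Endpoint shape and permission names follow the retired Graph API v1.0;
# the token is a placeholder and the field list is illustrative.

ACCESS_TOKEN = "TOKEN_GRANTED_BY_ONE_CONSENTING_USER"  # only this user opted in
BASE = "https://graph.facebook.com/v1.0"

# Step 1: the one consenting user yields a list of all of their friends.
resp = requests.get(f"{BASE}/me/friends", params={"access_token": ACCESS_TOKEN})
friends = resp.json().get("data", [])

# Step 2: with permissions such as friends_likes and friends_birthday,
# the app could then read profile fields of friends who never installed it.
for friend in friends:
    profile = requests.get(
        f"{BASE}/{friend['id']}",
        params={"fields": "name,birthday,likes", "access_token": ACCESS_TOKEN},
    ).json()
    print(profile.get("name"), profile.get("birthday"))
```

The key point is that a single user's consent opened a path to data about everyone in that user's network. Facebook began phasing out friend permissions with Graph API v2.0 in 2014.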

A social psychologist and researcher at Cambridge University, Aleksandr Kogan, designed a survey app called This Is Your Digital Life. He sold data from the app’s 300,000 registered users and 87 million of their friends to Cambridge Analytica. When news of the data breach surfaced, Facebook users were outraged. They boycotted the site, shares plummeted, and Congress launched investigations. In 2019, the Federal Trade Commission fined Facebook $5 billion for privacy violations.

Of course, Facebook isn't the only tech player to use data in ways that startled and sometimes infuriated customers: Snapchat settled with the FTC over privacy violations in 2014, and Google paid $22.5 million in 2012 for privacy misrepresentations. Nor was Facebook the only tech platform targeted by propagandists. YouTube, which is owned by Google, was heavily targeted, and some fake news stories (for example, that Donald Trump had won the popular vote) appeared higher in search results than accurate stories.

The public is uneasy about how much big platforms know and control, yet irresistibly drawn to their convenience, connection and content. Few consumers read the details of privacy agreements (Facebook's runs about 4,000 words) or track down "opt out" features. Yet Facebook and other platforms don't charge for their services; they make money by selling ads, which means that in order to survive financially, they must share some information about who is using the site and how. Sometimes the line is clear only when it is crossed.

The Resolution

In 2018, testifying before Congress, Facebook CEO Mark Zuckerberg signaled that the company would change how it operated. "It's clear now that we didn't do enough to prevent these [Facebook] tools from being used for harm as well. That goes for fake news, foreign interference in elections and hate speech, as well as developers and data privacy. We didn't take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I'm sorry," he said. In 2019, as part of the FTC settlement, Facebook formed a privacy committee within its board of directors.

In May 2020, leading what would become broad action by the world's largest platforms to combat misinformation related to the election, Twitter began flagging misleading tweets. It posted warnings saying it would "prohibit attempts to use our services to manipulate or disrupt civic processes," including voting and census participation. In July and August, respectively, Google and Facebook followed suit, saying they would actively combat misinformation about the 2020 election. Google said it would ban websites distributing hacked material. Facebook launched a voting information center and said it would remove misleading posts, such as those claiming people need a COVID-19 test to vote.

The Lesson

Data is not only big business but also a driver of modern life. Americans don't want to feel spied on or manipulated, but our lives have been shaped by data-driven conveniences (quick and effective search results, purchase recommendations, credit card fraud alerts and the like).

Recognizing that corporations may not be able to police themselves effectively, Congress has been increasingly active in calling big tech companies to account. Many states have passed or introduced legislation to regulate how user data is collected and sold. According to a Pew Research Center survey conducted in June 2020, nearly three-quarters of U.S. adults believe big tech companies wield too much power in politics.

In the 2020 election, Facebook is a bellwether, worth watching to see where the American public (and perhaps lawmakers) will draw the line.

The preceding is based on the case Facebook, Cambridge Analytica and the (Uncertain) Future of Online Privacy (Darden Business Publishing), by Darden Professor Tami Kim and Senior Researcher Gerry Yemen.

About the Expert

Tami Kim

Assistant Professor of Business Administration

Kim's research delves into firm transparency, consumer empowerment and implicit contracts, with special interest in interpersonal relationships in the digital age. Not only has her work been published in leading academic journals, it has also been featured in media outlets including Harvard Business Review, The New York Times, The Washington Post and The Atlantic.

Kim holds an A.B. in government from Harvard College and a Doctor of Business Administration (DBA) in marketing from Harvard Business School, where she received the Wyss Award for Excellence in Doctoral Research and the HBS Dean's Award.

