
Exercise Content

  1. For your final project go to the website:

    https://hbsp.harvard.edu

    Open a free student account there if you don't have one and then go to the following link:

    https://hbsp.harvard.edu/import/1073148

    You will need to purchase (Cost is $4.25) the following case:

    Facebook, Cambridge Analytica, and the (Uncertain) Future of Online Privacy

    After reading the case carefully, write a paper of at least three pages giving a summary of the case and answering the questions at the end of page 7.

UV8106 Rev. Feb. 9, 2021

Facebook, Cambridge Analytica, and the (Uncertain) Future of Online Privacy

We don’t exactly have the strongest reputation on privacy right now, to put it lightly.

—Mark Zuckerberg, May 3, 20191

When it was widely reported that, during the 2016 US presidential election, more Facebook users got their news from social media than from anywhere else, alarms rang over unverified news and disinformation. Fake news became more widely read than real news. The commotion grew even louder as elected officials and regulators began investigating the public’s growing apprehension about the internet. Accusations that Facebook spread disinformation, allowed foreign influence in US elections, and even promoted genocide mounted. Facebook’s contact-importing practice, called “friend permissions,” became a lightning rod for privacy advocates. Yet throughout the turmoil, advertisers continued to favor Facebook over other social media platforms.

For Facebook’s founder, Mark Zuckerberg, there was a certain amount of incredulity around the sustained attacks and the new restrictions regulators proposed for tech firms and privacy. The pivotal point seemed to arrive with a third-party data broker called Cambridge Analytica, which purchased data gleaned from Facebook users and used it to inform political operatives. How could something that had started as a survey end with Facebook under fire in such a public way?

The company was at a time of reflection, Zuckerberg had said, midway through 10 hours of testimony on Capitol Hill. He noted that the first decade of company strategy had focused on creating tools that brought folks together and empowered them to do good things.2 By the spring of 2019, Zuckerberg had admitted to an urgency to launch a new business phase that would go beyond building tools and include examining the firm’s responsibility to “make sure that they’re used for good.”3 With the Federal Trade Commission (FTC) ruling that Zuckerberg must make quarterly reports to Facebook’s board (and its newly formed privacy committee) about actions his business took regarding privacy and personal data, did the tech giant have any choice?

1 Julia Carrie Wong, “Facebook’s Zuckerberg Announces Privacy Overhaul: ‘We Don’t Have the Strongest Reputation,’” Guardian, April 30, 2019, https://www.theguardian.com/technology/2019/apr/30/facebook-f8-conference-privacy-mark-zuckerberg (accessed Mar. 10, 2020).

2 “Transcript of Mark Zuckerberg’s Senate Hearing,” Washington Post, April 10, 2018, https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/ (accessed Sept. 15, 2019).

3 https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/.

This public-sourced case was prepared by Tami Kim, Assistant Professor of Business Administration; and Gerry Yemen, Senior Researcher. It was written as a basis for class discussion rather than to illustrate effective or ineffective handling of an administrative situation. Copyright © 2020 by the University of Virginia Darden School Foundation, Charlottesville, VA. All rights reserved. To order copies, send an email to sale[email protected]. No part of this publication may be reproduced, stored in a retrieval system, used in a spreadsheet, or transmitted in any form or by any means—electronic, mechanical, photocopying, recording, or otherwise—without the permission of the Darden School Foundation. Our goal is to publish materials of the highest quality, so please submit any errata to [email protected].

For the exclusive use of j. Oliva, 2023.

This document is authorized for use only by jassiel Oliva in MAN3084 SUMMER B taught by Christos Christou, Florida National University from Jul 2023 to Jan 2024.


Data Collection: Risk and Opportunity

As the world’s largest internet peer-to-peer platform with more than 2.4 billion users,4 Facebook was built upon a simple premise—people wanted to share things with their friends and family (see Exhibit 1). And share they did, in over 100 different languages worldwide. Through online profiles, users posted images and videos, shared information and news, played games with one another, and discovered new products and services. Part of the experience was facilitated by developers of third-party apps, which were allowed to integrate with Facebook. Through all these touch points, Facebook collected and stored 96 data categories, which generated 192 billion data points from users around the world.5 (See Exhibit 2 for the type of data collected and the usage thereof.) Data centers in the United States and Europe stored the data Facebook collected.

The data Facebook shared with partners6 through its analytics services was quite extensive. Indeed, some referred to its practice as leasing: “[Facebook] is most certainly the largest data broker in the history of the data industry.”7 Zuckerberg was quick to point out that Facebook sold ads (which accounted for nearly all of its revenue8), not data. When data was reported to advertisers, Facebook user statistics were commonly aggregated. Personal identifiers like name and address were shared only if Facebook users gave the company permission.9 Advertisers were told which ads led a user to buy something or take an action (or not) around a product or service he or she had viewed.10 Even when a Facebook user logged off the site or app, the firm continued to track his or her internet activity.

“We do that for a number of reasons,” Zuckerberg said, “including security, and including measuring ads to make sure that the ad experiences are the most effective.”11 Users could opt out of this feature. While Zuckerberg acknowledged that users were often uncomfortable with companies’ gathering information about them, they seemed willing to accept it as long as the ads were of interest. “What we found is that even though some people don’t like ads,” Zuckerberg said, “people really don’t like ads that aren’t relevant.”12 Facebook earned income based on the number of user clicks, likes, and shares of customer ads.

In addition to gathering users’ data to target advertising, Facebook shared it with data brokers who collected and sold consumers’ personal information. Data brokers claimed that their purposes for wanting data included marketing, verifying identities, and revealing fraud—all seemingly appropriate. But the practice carried both benefits and risks for consumers. For example, using collected personal data to prevent someone from getting a bank loan with someone else’s identity was a good thing. But someone being denied a bank loan because of a mistaken identity was a bad thing. Both situations could happen using data-brokered information. Concerns around personal privacy and data collection surfaced, and Facebook users started to pay more attention to what was happening.

While Facebook had been considered a favored platform in the first decade of the 2000s, its favor seemed to be declining in the second. Indeed, in 2014, the FTC settled an investigation of Facebook over privacy violations, which forced the company to strengthen efforts to guard users’ information (five years later, the FTC fined Facebook $5 billion for privacy violations). Facebook was not alone. The same year, the Snapchat mobile messaging application settled an FTC privacy case for keeping photo and video messages that were posted through third-party applications—even though Snapchat promised users that their messages disappeared once opened. Likewise, Google had to pay a $22.5 million fine for privacy violations. The Google Chrome and Mozilla Firefox browsers, as well as tech behemoth Amazon, came under scrutiny because browser extensions (called add-ons or plug-ins) were harvesting data on browsing history and page views.13 For instance, a marketing-intelligence service called Nacho Analytics provided personal information such as “usernames, passwords, and GPS coordinates” along with “names of patients, doctors, and even medications to clients using data from plug-ins.”14 The third-party companies running the extensions defended the practice, as their terms of service stated that they might collect personal data. At least Amazon paid users $10 for using the extensions and allowing collection of user data.

4 Facebook annual report, 2019.

5 https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/.

6 Partners included advertisers, aggregators, product/service sellers and vendors, researchers and academics, and law enforcement.

7 “Written Testimony of John Battelle—Cambridge Analytica and Other Facebook Partners: Examining Data Privacy Risks,” US Senate Committee on Commerce, Science, & Transportation, June 17, 2018, https://nsarchive.gwu.edu/news/cyber-vault/2019-02-06/congressional-hearing-documents (accessed Sept. 13, 2019).

8 Facebook annual report, 2019.

9 “Data Policy,” Facebook, https://www.facebook.com/about/privacy/update/printable (accessed Oct. 1, 2019).

10 https://www.facebook.com/about/privacy/update/printable.

11 https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/.

12 https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/.

The tech industry was not the only sector under attack for its practices around consumers’ personal data. Target, the large retailer with an often-envied reputation of being a “cooler” company than other discount retailers, tarnished its standing among many consumers with privacy violations. It came to light that every Target customer was given a guest ID number linked to their credit card, name, and email address. Within that number was everything that person purchased as well as any demographic data that could be gleaned. As consumer profiles grew, shopping behaviors could be predicted. For example, the firm’s digital team ran test data searching for patterns and discovered a connection between its baby registry and the purchase of unscented baby lotion. They also noted that supplements were frequently purchased early on in pregnancy, and that large bags of cotton balls and scent-free soap were common during late stages of pregnancy. Target’s data team ran data around categories of shoppers and items that fit and came up with a “pregnancy prediction” score that was eerily accurate.15 With that knowledge, Target started to send baby-item coupons to customers as their due date approached. This practice earned screeching headlines when a father read a coupon mailer for baby items addressed to his high-school-aged daughter.16 Target’s reputation went from “cool” to “snoop” as its customers objected to the practice. Consumers organized on Facebook and Twitter calling for boycotts (using the hashtags #boycotttarget, #boycotttargetcouponing, and #pregnant). Although Target continued to use predictive data, it changed the coupon practice by creating a coupon booklet to make baby items appear arbitrary. Within a short period of time, consumers relaxed, used Target coupons, and decided they were not being spied on by the company.17

Savvier consumers understood how their personal data was used and shared, and some were okay with the lack of transparency with which it occurred. Other consumers had no knowledge that their personal information was being collected. Some wanted their personal data used only with their consent, and others wanted to be compensated. There were even calls for personal data to be protected as a human right.18 For those who were deeply concerned about privacy but still wanted a social media account, there was MeWe, which posted its “privacy bill of rights” on its homepage and marketed itself as the social media firm that “doesn’t own your personal information and content.”19 In contrast to Facebook’s data policy of 4,000-plus words and 72 links, MeWe’s privacy policy contained 1,000-plus words and a single link to archived policies on its homepage.

13 Geoffrey A. Fowler, “Your Data’s For Sale. I Found It.,” Washington Post, July 19, 2019, https://www.washingtonpost.com/technology/2019/07/18/i-found-your-data-its-sale/?utm_term=.7ff63ac62014 (accessed Jul. 19, 2019).

14 https://www.washingtonpost.com/technology/2019/07/18/i-found-your-data-its-sale/?utm_term=.7ff63ac62014.

15 Kashmir Hill, “How Target Figured out a Teen Girl Was Pregnant before Her Father Did,” Forbes, February 16, 2012, https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#5e5f64f76668 (accessed May 13, 2019).

16 See Gus Lubin, “The Incredible Story of How Target Exposed a Teen Girl’s Pregnancy,” Business Insider, February 16, 2012, https://www.businessinsider.com/the-incredible-story-of-how-target-exposed-a-teen-girls-pregnancy-2012-2 (accessed Oct. 1, 2019); or https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/ for more.

17 https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#5e5f64f76668.

18 “The World’s Most Valuable Resource Is No Longer Oil, but Data,” Economist, May 6, 2017.

19 MeWe, “MeWe’s Privacy Bill of Rights—Check It Out,” https://mewe.com/#bill (accessed Jul. 18, 2019).



Overall, in a Pew Research Center survey, 88% of respondents believed they had little to no control over their purchase histories, and 85% said they had little to no control over their internet history or social media activity.20 The majority (82%) felt they had little to no control over their location data. And they certainly weren’t confident that companies would take responsibility when their data had been misused or compromised.21

Cambridge Analytica

Although Facebook users often agreed to share their own data, many were caught off guard to learn that they had inadvertently shared their contact-list information with the company. This all came to a head in 2015 in a privacy breach involving a political consulting company called Cambridge Analytica. Aleksandr Kogan, a social psychologist and researcher at Cambridge University, designed a survey app called This Is Your Digital Life. The app posed personality questions and invited users to participate through Facebook. The survey app could be logged into using Facebook, which in turn authorized Kogan to access Facebook users’ data (“names, birthdays, gender, location, affinities, and page likes”22). In addition, Kogan was given permission by Facebook survey participants to use their friends’ data if they had “friend permissions” enabled in their Facebook settings.23 The survey app’s terms of service stated that respondents’ data could be sold or transferred (this was not allowed by Facebook, but in this case, it wasn’t prevented). Kogan used participants’ responses to build personality profiles that could be used to predict behavior. Roughly 300,000 Facebook users downloaded the app and took the survey, but because of their privacy settings, “Kogan was able to access some information about tens of millions of their friends.”24

Kogan’s work moved beyond academic research when, for $800,000, he sold the data—which had essentially been mined from 87 million Facebook users without their knowledge—to Cambridge Analytica, a Facebook advertising client.25 The data enabled Cambridge Analytica to identify “undecided” voters, and it then sold this data to political operatives in the United States—to Ted Cruz’s presidential nomination campaign and to the Donald Trump campaign—as well as to pro-Brexit operatives in the United Kingdom to help hone political messaging.26 According to Facebook, the violation of the company’s Platform Policy occurred when Kogan sold the data.27

Once this information was made public in 2015 in the Guardian, Facebook contacted Kogan and Cambridge Analytica. Both verified that the report was accurate and were told to delete the data.28 Facebook also revoked Kogan’s Facebook account. Facebook did not contact users who had been impacted by the breach. Writing on January 18, 2016, Cambridge Analytica confirmed that it had deleted Kogan’s data and all backups thereof.29

20 Farhad Manjoo, “We Hate Data Collection. That Doesn’t Mean We Can Stop It,” New York Times, November 15, 2019, https://www.nytimes.com/2019/11/15/opinion/privacy-facebook-pew-survey.html (accessed Jun. 23, 2020).

21 https://www.nytimes.com/2019/11/15/opinion/privacy-facebook-pew-survey.html.

22 United States District Court Northern District of California, Securities and Exchange Commission v. Facebook, Inc., Case 3:19-cv-04241, July 27, 2019, https://www.sec.gov/litigation/complaints/2019/comp-pr2019-140.pdf (accessed Jun. 23, 2020).

23 Lesley Stahl, “Aleksandr Kogan: The Link between Cambridge Analytica and Facebook,” 60 Minutes, September 2, 2018, https://www.cbsnews.com/news/aleksandr-kogan-the-link-between-cambridge-analytica-and-facebook-60-minutes/ (accessed Jun. 23, 2020).

24 https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/.

25 Hannah Kuchler, “How Facebook Grew Too Big to Handle,” Financial Times, March 28, 2019, https://www.ft.com/content/be723754-501c-11e9-9c76-bf4a0ce37d49 (accessed Jul. 18, 2019).

26 Carole Cadwalladr and Emma Graham-Harrison, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach,” Guardian, March 17, 2018, https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election (accessed Sept. 12, 2019).

27 https://www.sec.gov/litigation/complaints/2019/comp-pr2019-140.pdf.

28 https://www.sec.gov/litigation/complaints/2019/comp-pr2019-140.pdf.

29 “House Energy and Commerce Questions for the Record,” US House of Representatives Energy and Commerce Committee, June 29, 2019, https://docs.house.gov/meetings/IF/IF00/20180411/108090/HHRG-115-IF00-Wstate-ZuckerbergM-20180411.pdf (accessed Mar. 30, 2020).
