
Former Eastonite David Carroll Featured in Netflix Documentary ‘The Great Hack’

If you told the average user that their data had been stolen, chances are they wouldn’t be in too much of a rush to reclaim it.

Fortunately, David Carroll is not your average social media user.

Professor Carroll is an associate professor at the Parsons School of Design in New York City and the former director of the Master of Fine Arts Design and Technology program in Parsons’ School of Art, Media and Technology. He is also the subject of the Netflix original documentary The Great Hack, a film that explores the data scandal involving social media giant Facebook and British political consulting firm Cambridge Analytica.

The documentary underscores the controversial data harvesting conducted by Cambridge Analytica prior to the 2016 U.S. presidential election, as well as the significance of data security in the digital world. The film follows Carroll’s journey as he attempts to reclaim his data through a series of legal battles that ultimately thrust him into the national spotlight.

But long before he was a data activist advocating for online privacy legislation, Carroll was a third grader living in Easton, Conn., attending Samuel Staples Elementary School before advancing to Helen Keller Middle School for the sixth grade.

“I have a lot of memories of those school grounds,” said Carroll, a 1993 graduate of Joel Barlow High School in Redding, Conn. “I remember picking strawberries from Candee Farm, and I knew the Silverman family and their town institution, because I knew their kids growing up.”

Back then, phrases such as “data security,” “targeted social media advertisements,” and “the commercialization of the Internet” weren’t in Carroll’s vocabulary. Today, however, Carroll is viewed as a pioneering force in the realm of data rights and a leading authority on data protection, and his presence in The Great Hack captures his passion for issues that have become increasingly relevant during this ongoing period of rapid digital transformation.

The film offers first-hand accounts from those who were directly involved with Cambridge Analytica and the 2016 Trump campaign. Through these perspectives, newcomers to the data security world can gain a better sense of the issues that affect them as consumers and citizens, issues the scandal brought to the forefront of the technology industry.

To briefly summarize: in the months leading up to the 2016 presidential election, Cambridge Analytica, a U.K.-based political consulting firm, was hired by Donald Trump’s campaign to assist with the implementation of strategic advertising and sales strategies.

During this period, Cambridge Analytica’s team created data points on every American voter after analyzing user data from several mainstream social media platforms, with Facebook being the main source of information. These data points were then used to create content intended to influence a demographic known in the movie as “the Persuadables,” or the people whose minds data scientists at Cambridge Analytica thought they could change.

At a glance, it sounds like a comprehensive and well-thought-out marketing plan. But there was one major problem: The data from Facebook was collected illegally, without the consent of the users themselves.

Data Heist

In an article posted to the company newsroom on April 4, 2018, Facebook’s Chief Technology Officer Mike Schroepfer announced the figure that everyone in the technology world and beyond was waiting to hear: 87 million. Cambridge Analytica had harvested the private information of about 87 million Facebook users, over 70 million of whom were U.S. residents.

Carroll believes that the lack of data privacy and overall data protection in the U.S. stems from an absence of legislation.

“We don’t have a rights-based model,” explained Carroll. “In all other forms of our society we have rights-based models.

“Europe has one, and so it’s easier to determine what is appropriate and what’s not. There are guidelines, a regulatory framework, and expectations based on the rights. In the U.S., we’re stuck between user expectation and business practice.”

Carroll went on to discuss how this feeling of being “stuck” creates a trust gap between social media users and the companies behind their favorite platforms. When he found out that Cambridge Analytica had 5,000 data points on every U.S. voter, Carroll ditched the remaining trust he had in Silicon Valley and lawyered up. He then took his legal battle to Britain, where Cambridge Analytica had processed its user data through its parent company, Strategic Communications Laboratories Group (SCL).

Because of the rights-based model that exists in the U.K., Carroll was able to file a complaint with the Information Commissioner’s Office (ICO), which reports directly to the U.K. Parliament. Eventually, Cambridge Analytica’s work was determined to have violated multiple U.K. privacy laws, but not before its parent company, SCL, had filed for bankruptcy.

When asked about the character of Alexander Nix, the CEO and face of Cambridge Analytica, Carroll was candid in his response. “I know he is willing to lie to lawmakers, so at the minimum he is a scoundrel in that regard,” said Carroll with a smirk. “The House Intelligence Committee just released its transcripts from the Russia investigation, so you can read Nix’s interview. He lied to Republicans and Democrats alike, and he lied to Parliament as well. He’s just an unreliable narrator.”

COVID-19 and Data Security

This lack of trust in data security has come to the forefront in the United States, as concerns about contact tracing applications and personal data have grown during the COVID-19 crisis. Carroll believes it all comes back to the lack of legislation, regulations and other legal safeguards protecting consumers’ and citizens’ data privacy.

In May, Apple and Google updated their respective operating systems to allow Bluetooth-powered contact tracing applications to run on their smartphones. According to Carroll, both the operating systems and the applications have been built in a privacy-driven fashion; users are unable to detect the identities of others through their devices, and Apple has announced that this feature will be disabled once the pandemic is no longer a major crisis.

In theory, this decentralized approach is supposed to protect consumer healthcare data. However, as Carroll was quick to point out, these companies have not been perfect in the execution of their privacy-focused projects, and the mistakes they make along the way are merely chalked up as public relations problems, rather than legal ones.

“This is why journalists have to play this outside role, because we don’t have laws and regulators that normally take care of this stuff,” said Carroll.

Does It Work?

The Cambridge Analytica scandal made major headlines and confirmed the fears of millions of consumers around the world. It made one thing very clear: social media data in the United States is not safely protected.

When it comes to the effectiveness of Cambridge Analytica’s data-fueled campaigns, the results are a little less definitive, according to Carroll. He prefaced his assessment of the advertising campaigns and their success in swaying the presidential election by quoting a famous saying by marketing pioneer John Wanamaker: “Half the money I spend on advertising is wasted. The trouble is, I don’t know which half.” For Carroll, “All technology targeting is dealing with this reality. Some of it doesn’t work.”

Carroll found it particularly interesting in this context that Joe Biden, the presumptive Democratic nominee for the 2020 presidential election, won the nomination with an advertising budget that was just a fraction of Michael Bloomberg’s; Bloomberg spent over $500 million on advertising for his own Democratic campaign. “The question as to whether or not Cambridge Analytica worked is not answerable,” said Carroll. “But did Cambridge Analytica conduct mass data abuse on the American people? Yes. That we know.”

Protecting Our Data

It’s understandable if readers are experiencing information overload at this point. There are many moving parts to this story, and the underlying issues associated with privacy in the age of social media and global networks are complex. If you want to learn more about these issues, including how to better protect your data, but feel overwhelmed or don’t know where to start, Carroll has some comforting and helpful advice.

“Try not to feel guilty that you have little control, or that you don’t know how to protect yourself,” he said. “There are some things you can do to protect your data, and the data literacy of security is important too.

“But it is also important for ordinary citizens to realize they can ‘do democracy’ here. They can write to their state lawmakers. States are passing interesting privacy laws because the federal government isn’t. It’s a place for local politics to engage. These issues affect local communities. The Easton Library could have many resources for citizens on how to ‘do’ data security, because libraries take this very seriously.”

For those interested in learning more, a follow-up interview with David Carroll will feature more advice for everyday citizens on protecting their data in these unprecedented times. Courier readers may also view the video below of David Carroll’s public lecture delivered at Sacred Heart University on February 13, 2020.