David Brody, Professor of Government Robert Martin, and Shoshana Weissmann take center stage as part of Hamilton's Common Ground program.

As algorithms and online platforms come to define our daily lives, how do we navigate the social responsibilities of platforms and our own online freedoms? These were among the questions addressed at the Common Ground panel on March 27. Guest speakers were Shoshana Weissmann, the digital director of the think tank R Street, and David Brody, the managing attorney of the Lawyers’ Committee for Civil Rights Under Law’s Digital Justice Initiative. Professor of Government Robert Martin moderated the conversation.

To begin, both speakers briefly described their positions.

Weissmann addressed the moderator’s dilemma, in which platforms that moderate open themselves up to the risk of being liable for the content posted on their website. She said that websites must then “moderate everything and let little through or moderate nothing and just let whatever happen. Neither is a good way to run the Internet.”

Currently, this dilemma is resolved by Section 230 of the Communications Decency Act of 1996, which Weissmann summarized as “if you do something illegal on a platform, the platform is not liable for it – you are.”

Shoshana Weissmann chats with students during a luncheon. Photo: Nancy L. Ford

She expressed concern that changing Section 230 could “reopen those legal floodgates to sue over everything.”

In his introduction, Brody pointed out that much of our lives, such as applying for jobs, housing, and mortgages, has moved online. “However, the algorithms that intermediate the platforms that we are using have the potential to have discriminatory outcomes,” he said.

“Social media and similar online companies are places of public accommodation in this new economy,” he continued. “And as such they have the obligation to not discriminate in their provision of service.” Brody also raised concerns about online harassment, which disproportionately impacts marginalized groups.

Martin began the discussion by inquiring about the roles of the different players in online speech – the poster, the intermediary, and the audience.

Weissmann replied, “I see it as an application of First Amendment law online. Each person is still responsible for their own speech, even with anonymous users.” She added, “I think people forget that whistleblowers sometimes need to be anonymous at first to be heard. The things whistleblowers say at first can seem like libel.”

David Brody chats with students in Professor of Computer Science Mark Bailey’s Computer Organization class during a lunch and discussion in Buttrick Hall. Photo: Nancy L. Ford

Brody addressed the impact on the audience, saying, “Online threats and harassment intimidates people into silencing their own speech, even if the harassment is not directed at them.

“We’ve developed this norm online now where it’s just expected that harassment is going to occur,” he continued. “If you’re a journalist or high-profile person or just a woman online, at some point someone’s going to direct this at you. You need to learn how to deal with it, which lots of people do by shutting up. That's not what we want in terms of free and open debate.”

Martin acknowledged that the two panelists approached online speech through two different lenses: public accommodation and the First Amendment.

Weissmann explained her lens, saying, “We should sue the guy doing the thing, not the place where they did it. I want to make sure that speech online and through intermediaries is protected and let platforms moderate how they see fit. Moderation is protected by the First Amendment.”

Turning to Brody, Martin asked about effective regulations.

Brody proposed two key components to effective regulation: data privacy legislation and regulation of algorithms.

“Through data privacy legislation, you can change the economic incentives of these platforms. If it is no longer in the platform's economic self-interest to maximize your engagement on the platform at all costs so they can sell or show you more ads, then they no longer necessarily have an incentive to promote the most outrageous sensationalist content,” he explained.

Brody continued, “Algorithms making important economic decisions around housing, employment, credit, insurance, education, and healthcare need to be tested for bias before they're deployed.”
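
As a rough illustration of the kind of pre-deployment bias test Brody describes, here is a minimal sketch that applies the widely cited “four-fifths rule” heuristic to a set of hypothetical approval decisions; the data, groups, and threshold are assumptions made for illustration, not anything presented on the panel.

```python
# Illustrative sketch only: a simple disparate-impact check using the
# "four-fifths rule" heuristic. The decisions and groups below are
# hypothetical, not drawn from any real system discussed at the event.

def selection_rates(decisions, groups):
    """Return the approval rate for each group."""
    rates = {}
    for group in set(groups):
        outcomes = [d for d, g in zip(decisions, groups) if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

def passes_four_fifths(decisions, groups, threshold=0.8):
    """Flag possible disparate impact if any group's approval rate
    falls below 80% of the highest group's rate."""
    rates = selection_rates(decisions, groups)
    highest = max(rates.values())
    ok = all(rate >= threshold * highest for rate in rates.values())
    return ok, rates

# Hypothetical approvals (1 = approved) from an automated screening model.
decisions = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

ok, rates = passes_four_fifths(decisions, groups)
print(rates)                   # e.g. {'A': 0.8, 'B': 0.2}
print("passes 4/5 rule:", ok)  # False -> warrants review before deployment
```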

Weissmann supported a data privacy bill but expressed lingering concerns about regulation. She said, “Sometimes I just worry about overregulation enshrining the current companies because regulation can tend to do that. Not all the time, but there's a reason sometimes industry likes regulation because it keeps itself there and its competitors can't make it up.”

Concluding the discussion, Brody addressed the radicalization that recommendation algorithms can foster. He explained, “Algorithms are doing that because they're trying to maximize engagement and the amount of time people spend on the platform because that means more eyeballs on advertisements which equals money.”
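
To make the dynamic Brody describes concrete, the following is a minimal sketch of a feed ranked solely by a hypothetical predicted-engagement score; the posts and numbers are invented for illustration and do not represent any real platform’s algorithm.

```python
# Illustrative sketch only: a toy feed that ranks purely by predicted
# engagement, showing how sensational content can rise to the top when
# the objective is keeping eyeballs on ads. All values are hypothetical.

posts = [
    {"title": "Local council posts meeting minutes", "predicted_engagement": 0.04},
    {"title": "Outrageous claim goes viral",          "predicted_engagement": 0.31},
    {"title": "Long-form explainer on zoning law",    "predicted_engagement": 0.07},
]

# Sort by predicted engagement (clicks, watch time, replies), highest first.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(f'{post["predicted_engagement"]:.2f}  {post["title"]}')
```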

Students then had the opportunity to ask the panelists questions. Afterwards, student facilitators from the Levitt Center led small groups of students in their own discussions about online speech.

The interests of the students drove the conversations as participants shared their thoughts on a host of issues. Topics included radicalization, methods of regulating algorithms, sensationalism, and the economics related to online speech.

Common Ground is Hamilton’s widely acclaimed multi-format program that helps prepare students for lives of active citizenship. Designed to explore cross-boundary political thought and complex social issues, Common Ground brings highly respected thought leaders to campus for small classroom dialogues and large event discussions. Topics intertwined with the College’s curriculum are chosen to foster critical thinking and holistic examination of difficult and often contentious national and global policy issues.
