Philosophy course explores the ethics of computing

This term, an introductory philosophy course gave students the opportunity to engage in high-level conversations with industry and academic experts on a thorny and timely issue: the ethics of computer science.

“Ethics and Information Technology,” taught by Susan Brison, the Eunice and Julian Cohen Professor for the Study of Ethics and Human Values, explores a range of topics related to technology ethics. The course and a related workshop that Brison co-hosted are part of her ongoing efforts to ensure that as new technologies are created, ethics and social justice are at the heart of the process.

A burgeoning field, tech ethics is “much like bioethics 30 years ago,” says Brison, whose areas of expertise include free speech and social media. “My own view is that it’s really pressing.”

A wide-ranging conversation

In the largely discussion-based classroom, students apply ethical, political, and legal theories to current issues such as free speech, cyberbullying, and digital surveillance.

They also delve into the problem of algorithmic injustice, which Brison defines as a range of harms related to the use of artificial intelligence, including machine learning, to automate decisions in various social spheres.

“Using computer systems that rely on algorithms to, for example, perform online searches, predict crimes, identify the ‘best’ job candidates, and allocate medical care may initially seem like a way to avoid human bias,” she says.

“However, in practice, algorithmic decision-making has often proven to be far from neutral and objective and can, in fact, amplify biases and reinforce stereotypes based on race, gender, and other social categories.”

Left to right, Kevin Lin ’24, Brian Zheng ’24, and Virginia Coffey ’24 discuss a hypothetical app that aggregates user information, which is then sold to third-party advertisers. (Photo by Rob Strong ’04)

The course included two public lectures, co-sponsored by the Leslie Center for the Humanities, the Nelson A. Rockefeller Center for Public Policy, and the Wright Center for the Study of Computation and Just Communities in collaboration with the class.

UCLA Professor Safiya Noble, author of the bestselling book Algorithms of Oppression: How Search Engines Reinforce Racism, and Anita Allen, a professor of law and philosophy at the University of Pennsylvania and an expert in privacy and data protection law, spoke at Filene Auditorium. Noble and Allen joined the class for hour-long conversations after their talks, which took place on July 25 and August 1, respectively.

And students recently heard presentations from attendees of the three-day Ethics and Information Technology Workshop at Dartmouth, co-hosted by Brison and Steven Kelts, a senior lecturer at Princeton University’s University Center for Human Values, with grants from the Ethics Institute at Dartmouth, the Neukom Institute for Computational Science, and the Lu Family Academic Enrichment Fund in Philosophy.

With sessions such as “Affective Computing and Human-AI Interaction,” “Fairness in/and Machine Learning,” and “Shame is the Name of the Game: A Value Sensitive Design Approach for Combating Rude Speech Online,” the workshop, held at the Hanover Inn earlier this month, explored many of the same issues as the course. It attracted more than a dozen ethicists, researchers, and professors from Google, Meta, and universities across the country.

Sophia Rubens ’24, a physics major from Stratham, NH, says meeting the speakers and workshop attendees was “really valuable.”

“Most of us use email, social media, lots of things that have predictive algorithms, predictive advertising,” says Rubens, who chose the course based on its day-to-day relevance. She says it’s been enlightening to see how leaders in the field “communicate with senior officials at these companies.”

On a recent Wednesday afternoon, Kelts and several other workshop participants addressed the class and answered questions in an “ask me anything” session.

The discussion began with a question from Allison Zhuang ’25 that elicited nods from several of her classmates: “Are the people in charge of product development listening to the ethical concerns you raise?”

Geoff Keeling, an AI ethicist at Google, said he has had “only positive engagements” with the bioethics team he works on. When Zhuang asked for an example, Keeling described a model he had built, prompted by his concerns about a product in development, to ensure it worked accurately across different skin types.

The conversation also touched on the challenges of applying ethical standards to computing, such as fostering an interdisciplinary approach among researchers who are not used to thinking outside their fields and keeping pace with new developments.

Often, by the time standards are in place, the technology has changed, Keeling said.

A lasting effect

The effects of the course and workshop are likely to be felt long after the term is over.

Brison says the workshop has generated what she hopes will be continued collaboration between leaders in industry and academia.

Despite some constraints, people working for Google and Meta can speak quite freely about the areas they research and the issues they encounter and would like advice from ethicists on, Brison says. Since they often play a role in shaping their companies’ research programs, “it’s exciting to be able to connect with them from the early stages.”

And Brian Zheng ’24, a government major from Naperville, Illinois, says he expects to apply what he learned in the classroom to a career in the military.

After graduating, he plans to be commissioned as a second lieutenant through the ROTC program, says Zheng, a member of South House. He hopes to work in communications or intelligence, where technology is a factor in “decisions that have a definite impact on the lives of people around the world.”

Kamil Salame ’24, a politics, philosophy, and economics student from Greenwich, Connecticut, says the class changed the way he views technology.

Going forward, he’ll be more aware of how his data is used to identify his interests, passions, and “what I might want to buy,” says School House member Salame. “It made me think critically about how I use technology and how technology uses me.”