Stanford tech ethics course urges students to move responsibly and think about things
In the course CS 181: Computers, Ethics and Public Policy, Stanford students become computer programmers, policymakers and philosophers to examine the ethical and social impacts of technological innovation.
In a fictitious policy memo to Stanford campus leadership, students enrolled in CS 181: Computers, Ethics and Public Policy laid out their recommendations for regulating campus vehicles in a future where autonomous vehicles are prevalent – how fast they should go, how they should handle jaywalkers and what they should do in the event of a crash. Although imaginary, the proposal forced students to consider the complexities of technological innovation.
That assignment was one of several that students tackled as a way of exploring the various dimensions of technology’s impact on people and society from the perspective of a policymaker, computer programmer or philosopher. Students also probed tradeoffs between privacy and security, simulated how political polarization occurs on social media and discovered how bias can emerge in a decision-making algorithm.
“In the course we created, we wanted to take a truly multidisciplinary approach where students could grapple with philosophical and policy issues while also grounding them in code,” said Mehran Sahami, a professor of computer science, a former Google research scientist and one of the course’s co-instructors. The other two instructors were Rob Reich, a professor of political science and director of the McCoy Family Center for Ethics in Society, and Jeremy Weinstein, a political science professor who worked as a national security advisor during the Obama administration.
Can code be fair?
Assignments in the class were inspired by real-world events. For example, students evaluated the case of Eric Loomis, a Wisconsin man whose prison sentence was partially determined by a private company’s computer code that calculated how likely he was to reoffend. But because the code was proprietary, it was not entirely clear how it came to its scoring and whether the algorithm might be biased.
“And even if it hadn’t been [proprietary], algorithms are frequently ‘black boxes,’ producing decisions that can’t be explained even by the engineers who coded the algorithm,” said Reich, the Marc and Laura Andreessen Faculty Co-Director at the Center on Philanthropy and Civil Society.
For Jonathan Lipman, a sophomore majoring in computer science and philosophy, the case raised questions about what happens to trust and transparency in the judicial process when not even a judge or the coder fully knows how a computer came to its score.
“Should we trust these complicated and proprietary black box systems that we sometimes can’t fully understand?” Lipman asked. “Can we guarantee they will drive outcomes we desire in society? Do we even have ways of mathematically formalizing the desirable outcomes we are seeking?”
It turns out, not really. In 2016, the investigative journalism nonprofit ProPublica gathered data to evaluate the algorithm used in the Loomis case, and its analysis found the tool biased against Black defendants, who were disproportionately given higher risk scores than white defendants.
Students investigated the dataset ProPublica used in its analysis (it is publicly available) and from it learned that the data an algorithm is trained on can itself be prejudiced. For example, previous research had found that some demographics are stopped, searched and arrested more often than others – leading to a skew in crime history for those groups, which a model could then replicate or amplify.
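To make that finding concrete, here is a minimal Python sketch – entirely synthetic numbers, not ProPublica’s data and not an assignment from the course – in which two groups offend at the same rate but one is policed more heavily, so the recorded arrest labels a model would be trained on end up skewed.

```python
# Minimal sketch (synthetic data) of how unequal policing skews the
# crime history that a risk model would be trained on.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)                 # two demographic groups
true_offense = rng.random(n) < 0.3            # both offend at the same 30% rate

# Group 1 is stopped and searched more often, so its offenses are recorded
# more often; the training label is the recorded arrest, not the behavior.
detection_rate = np.where(group == 1, 0.9, 0.45)
recorded_arrest = true_offense & (rng.random(n) < detection_rate)

print("true offense rate:   ",
      true_offense[group == 0].mean(), true_offense[group == 1].mean())
print("recorded arrest rate:",
      recorded_arrest[group == 0].mean(), recorded_arrest[group == 1].mean())
# A model fit to the recorded labels would learn to score group 1 as
# higher risk, even though the underlying behavior is identical.
```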
But there’s a technical layer that adds a further dimension to the issue: Fairness can be defined in different ways, Reich said. Some coders may consider it fair to exclude characteristics such as race from a model, while others might think fairness requires making the algorithm more accurate by including those characteristics.
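The tension can be made precise with a toy example. The hypothetical counts below – chosen purely for illustration, not drawn from the course or from ProPublica’s analysis – describe two groups with different base rates of reoffending. A score can have the same precision in both groups (one common definition of fairness) while producing very different false positive rates (another common definition); when base rates differ, the two criteria generally cannot both hold.

```python
# Hypothetical confusion-matrix counts for two groups, chosen so that the
# score has the SAME precision in both groups despite different base rates.
groups = {
    "A": {"tp": 45, "fp": 15, "fn": 15, "tn": 25},  # 60% of group A reoffends
    "B": {"tp": 15, "fp": 5,  "fn": 15, "tn": 65},  # 30% of group B reoffends
}

for name, c in groups.items():
    precision = c["tp"] / (c["tp"] + c["fp"])   # "predictive parity" criterion
    fpr = c["fp"] / (c["fp"] + c["tn"])         # "equal false positive rate" criterion
    print(f"group {name}: precision = {precision:.2f}, false positive rate = {fpr:.2f}")
```

In these made-up numbers, group A ends up with a false positive rate roughly five times that of group B even though the score is equally “accurate” for both – essentially the disagreement at the heart of the ProPublica findings.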
“One worrying area where the issue of bias can come into play is with proxy variables – even though protected characteristics are not explicitly included in the model, other characteristics that are heavily correlated with protected characteristics could be,” Lipman observed. “An example of this might be including ZIP codes from a racially segregated city in a model. Even though the attribute of race isn’t explicitly included, the model could make predictions based on ZIP code, which might effectively mean predictions are made based on race.”
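A similarly synthetic sketch of Lipman’s ZIP code example: race is never handed to the model, but in a segregated city the ZIP code carries nearly the same information, so a “race-blind” model still produces scores that track race. (The data and rates below are invented for illustration; the model uses scikit-learn’s logistic regression.)

```python
# Synthetic illustration of a proxy variable: the model never sees race,
# but ZIP code in a segregated city is almost a stand-in for it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 4_000

race = rng.integers(0, 2, n)                               # protected attribute, withheld from the model
zip_code = np.where(rng.random(n) < 0.95, race, 1 - race)  # segregation: ZIP matches race 95% of the time

# Historical labels that themselves reflect disparate arrest rates by race.
label = rng.random(n) < np.where(race == 1, 0.5, 0.2)

# A "race-blind" model trained only on ZIP code ...
model = LogisticRegression().fit(zip_code.reshape(-1, 1), label)
scores = model.predict_proba(zip_code.reshape(-1, 1))[:, 1]

# ... still produces sharply different average scores by race.
print("mean score, race = 0:", scores[race == 0].mean())
print("mean score, race = 1:", scores[race == 1].mean())
```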
In this assignment and others, students came to appreciate how new technologies are more than just lines of code or pure mathematics, said Hilary Cohen, the head teaching assistant for the course.
“As automated systems increasingly influence our lives, we wanted students to see firsthand how complex it can be to reconcile seemingly incompatible goals – to build accurate tools, ensure fairness, guarantee transparency, preserve privacy – but also how essential it is to try,” Cohen said.
Move responsibly and think about things
What struck some students was how many of today’s concerns are problems that have been debated for centuries.
“One of the most fun parts of the class has been using technical problems as lead-ins to broader societal questions, many of which are age-old questions,” Lipman said. Topics students debated ranged from defining fairness to building a just society to how society should balance public and private spheres, he said.
Underlying all of these issues is a set of competing values that must be weighed against one another, Weinstein said in the class’s concluding lecture.
“Equality, privacy, security, autonomy, freedom, efficiency – who weighs these values and how? This is a critical question of governance, politics and power,” Weinstein said. “As we consider how to govern the technologies of the future, which will inevitably involve tradeoffs, societies will have to decide when, where and by whom these values should be weighed. Whether you work in tech or not, all of you are citizens and thus you will have a voice in these decisions.”
Computers, Ethics and Public Policy will be taught again in winter quarter 2020 under a new course number, CS 182. The course is also cross-listed across a range of departments beyond computer science, including philosophy, political science, ethics in society, public policy and communication.