Faculty Focus: Shibulal Family Associate Professor of Computer Science Sorelle Friedler
Friedler's work, focused on the intersection of technology and society, has taken her from Haverford's campus to the White House.
She has worked at Google and for President Joe Biden in the White House. But Shibulal Family Associate Professor of Computer Science Sorelle Friedler calls Haverford the anchor of her career.
“I think about where technology meets society, and what can go right and what can go wrong,” notes the Swarthmore and University of Maryland alum. “And Haverford students come to these conversations with a certain baseline understanding of ethics and baseline thinking about society that I find is, frankly, much more advanced than some of my colleagues’ students at traditional large research institutions because they’re used to that type of thinking about society.”
So is Friedler. For her, the questions are clear in principle but vast in scope. What can technology do to improve society and advance science, and what do we need to think about to make sure that, as we do, it works well? How do we keep algorithms that are used in hiring from discriminating against people? How do we make sure that these systems have guardrails in place so that they actually work? As tech and our relationship with it become more complex, Friedler finds her work increasingly relevant and her expertise in demand.
“Since about 2014,” says Friedler, “I've been working in this field called fairness, accountability, and transparency in machine learning. And that field is concerned with how to implement technologies in society in a way that is fair, accountable, and transparent. I was a founder of the subfield, and because of that, people in policy-related positions started to invite me to be part of advisory groups. For example, the Pennsylvania court system was thinking about how they could implement a criminal risk assessment, an algorithmic approach to pretrial detention that would allow more people who are considered low risk to be freed without bail. Trying to do that in a way that is actually non-discriminatory and also works within the justice system so that it can be understood and contested — by defense attorneys, principally — is both important and tricky.”
That interest in the human side of all things tech brought her to the attention of the White House.
“I started engaging more and more with folks in the policy sphere who are interested in what these types of requirements and guardrails could look like from a policy perspective. That led to my one-and-a-half-year appointment as Assistant Director of Data and Democracy inside the White House Office of Science and Technology Policy. It's basically the office of the science advisor to the president, the people who inform the executive branch on scientific ideas and help to, in my case, craft policy around technology.”
That concern with policy-as-guardrails also informs her thoughts about artificial intelligence at a time when all the world seems to be waking up to the power — some might say the threat — of AI.
“A lot of the discussion about AI has been in the form of, ‘We need to be scared about the coming killer robots.’ And I think that that concern is rooted in a chain of low-probability events that project forward into the distant future. I think we should be much more concerned about what’s already happening, and has been for a decade. Things like discriminatory hiring algorithms, or people getting cut off from their benefits because a fraud detection algorithm incorrectly marked their benefits application as fraudulent. This is not some future dystopia; these are things that are already costing society through their impact on real people's lives, today.
“So I'm far more concerned with putting in place guardrails for those systems. And I think that if we can get those guardrails right, then that also drastically decreases any possibility that these systems run away from themselves. I think that a lot of the killer robot scenarios assume that nobody does anything to try to put guardrails on the use of these systems for a long time.”
Since her arrival in 2012, Friedler has been instrumental in stewarding the explosive growth of interest in computer science at Haverford. “I think the first year that I was here, there were four graduating computer science majors. And now we have on the order of 30 or 40 graduating seniors every year. So that’s been a large change. Obviously, that impacts us from a staffing standpoint, but over time we have also implemented changes to the curriculum. When I joined, the curriculum was largely focused on the subfields of theory and systems, which are the sort of historical core and origins of the field that developed partway between math and engineering. But we found it necessary to add a third track focused on applications, one that allows students to integrate their study of machine learning and computational linguistics into the core of the major. I think it is a more modern take on what computer science can look like.”
The growth in student participation has also meant a greater diversity in learning goals and outcomes. Before Friedler’s arrival, many of the students who pursued computer science were interested in grad school. “That's to be expected if you only have a handful of students every year: you’re going to get the ones who are really focused on this for their long-term career, imagining it in a particularly academic way. But now our students are much more heavily focused on going into industry, which is also much more the norm for the field. Meanwhile, the student body has also diversified. In the early 2000s, no women graduated with a computer science major at Haverford. And I would not say that we are at parity now, but we're certainly a long way from the way things used to be.”
Friedler is also interested in developing greater interdisciplinarity across the curriculum so that students who might not pursue the computer science major nonetheless benefit from the work that’s going on in, and with, the department.
“We need to make sure that all students have the opportunity to think about the way that technology changes society and impacts their lives. That all students have the chance to take a class where they learn some basic programming skills, so that they can really understand what that looks like, what it is genuinely possible for it to do, and what it is not possible for it to do. And that all students have the chance to do a data-driven examination within a discipline they care about, which is now a key part of being a professional in a lot of fields, while creating a more data-literate society.”
Given such ambitions for her students and the department, it's no surprise to hear that Friedler is excited about Haverford's Strategic Plan and its call for greater ethical engagement on both the curricular and co-curricular levels. “It's natural for Haverford, given our mission and values,” she says. “My hope is that we can find a way to integrate the conversations about ethics with the work we are doing — and can do — with data. I see them as integrally related, and look forward to sustained, expansive dialogue in the years and decades to come.”