Q&A with Latanya Sweeney: How chance encounters drove a career examining data privacy, algorithmic bias, and the ways technology affects democracy | Harvard Kennedy School

Latanya Sweeney is a computer scientist whose work examines the unforeseen consequences of technology on society and what to do about them. She holds a joint appointment at Harvard Kennedy School and Harvard’s Faculty of Arts and Sciences. Sweeney directs the Data Privacy Lab, which she founded and brought to Harvard from Carnegie Mellon University, and is launching a new Public Interest Technology Lab at the Kennedy School with active participation from several schools across the University. Sweeney, who has also served as chief technology officer at the U.S. Federal Trade Commission, was the first African American woman to receive a PhD in computer science from the Massachusetts Institute of Technology (MIT).


Q: How does your research and teaching connect with the solution of pressing problems in the world today?

Well, I focus on unforeseen consequences of technology. I teach a series of classes that some people call “save the world classes.” That is, we teach people how to look for the unforeseen consequences and what kind of things they can do in order to make a difference or to quantify or better understand the problem. Students have literally saved the world, which is why they're called “save the world classes.” They’ve had a big impact on technology business practices, on legislation, on regulation. They have been a tremendous success. Moving my work to the Kennedy School is like putting it on steroids because the breadth and depth of the encounters have increased significantly. And this is coming at a time when the clashes between technology and society are changing. Earlier in my career, they were all about privacy, and I ended up pioneering the field of data privacy. The second wave was around algorithmic fairness, and I was one of the first to do experiments in that space. But in more recent years my work has been about democracy itself, the very cornerstones of the things that we take for granted. How does the design of technology redefine or enable—or fail to enable—democracy? What do we do about it? How do we understand it sufficiently to make changes?


Q: What are you teaching at the Kennedy School?

I am not teaching a class right now because I have been working as a co-lead at the University on coronavirus testing and tracing. That was literally an all-day affair throughout the summer, every day. That’s the work I’ve been doing, and I have been continuing this work in the fall.

In the spring, I will teach a class that asks: How do we look at technology? How do we think about its unforeseen consequences? What are the experiments or ways that we could expose these? And if we were to expose consequences, how do we predict what kind of change would result? The idea is that students come out of the class with a much better ability to look at technology. And then they can see vulnerabilities and concerns, giving them a way to think about how to make a difference.

“It’s all been about personal experience. We think that we plot a career from beginning to end; it doesn't really work out that way.”

Latanya Sweeney

Q: How have experiences in your own life influenced your academic direction?

It’s all been about personal experience. We think that we plot a career from beginning to end; it doesn't really work out that way. When I was a child, all I wanted to do was be a mathematician. And then I took a computer course, and all I wanted to be was a computer scientist. And then I became a computer scientist. However, while I was a graduate student in computer science at MIT, an ethicist came by and told me computers are evil. I was going to correct her thinking, but she went on to talk about how the social contracts were being broken by technology and data sharing. And that led me to do those fundamental experiments that became the foundation of data privacy—to show how easy it is to identify someone—and launched a whole field. 

Then, as life went on, I got an appointment here at Harvard, in the Faculty of Arts and Sciences, in the government department. Within the first month of being there, a reporter comes to my office. I was looking up a paper, so I typed my name into Google, and an ad popped up implying I had an arrest record. So, I tell the reporter, “Here's the paper you wanted to see.” He says, “Forget the paper; tell me about when you were arrested.” And I said, “Well, I wasn't arrested.” So, then he and I spent a couple of hours trying to figure out what makes these ads come up. I spent money to go on that website to show him that not only did I not have an arrest record in their system, but no one with my name had an arrest record. We began playing with it, and we left with the observation that ads implying you had an arrest record came up more often if your name was more commonly given to Black babies than to white babies. The idea is that some names are race specific, Latanya being an example. At first, I didn’t believe it, so I typed Latanya into the Google search bar on images, and up pop all these Black women. When you type in the name Tanya, however, up pop white women. Then you realize that there really are these racially charged names or racially identified names.

So, I ended up spending a month or two doing a full-blown experiment to see when these ads pop up, and I found out that, sure enough, it wasn't about whether there was an arrest record for you. It was discrimination: 80 percent of the time, if your name was given more often to a Black baby than a white baby, you would get an ad implying you had an arrest record, even if you did not. So, this became the start of work to show a violation of civil rights. 

These opportunities at first look anecdotal, but running into an ethicist in a lab or a reporter in my office turned out to be pivotal points in my career.


Q: How are you connecting with the HKS community while we remain remote?

Things are different, and you spend so much time on Zoom that it's become a way of relating to people. You know, if you think about it, I can see your face on Zoom, but if we were in person, we'd have on masks, and we'd have to be socially distant, so I wouldn't be able to see your expressions as much. I miss the opportunities for conversation and so forth. It does take more deliberate action on one's part to have meetings and conversations, but I'm committed to having more of those conversations and reaching out to make it happen.

Banner image by Kacper Pempel; faculty portrait by Rose Lincoln
