“POLICY MOVES in a function of years,” Latanya Sweeney explains. “Technology moves in a function of months. There is an incredible mismatch in the rate of change.”
This mismatch has driven Sweeney’s work for more than two decades as she has explored what she calls “clashes” between technology and society, leading her to major discoveries in the fields of privacy and algorithmic fairness.
A computer scientist and the Daniel Paul Professor of the Practice of Government and Technology at Harvard Kennedy School, Sweeney is committed to public interest technology, a field built on the understanding that technology can cause social harms as well as serve the common good.
Sweeney established and leads the Public Interest Technology Lab, housed at the Kennedy School. The lab enables students and scholars to conduct their own experiments to address collisions between technology and society, which range from the bias baked into algorithms to vulnerabilities in voting technology to the immense power that tech platforms wield. “We go where the problem is,” she says.
Going Where the Problem Is
This mindset developed from Sweeney’s experiences in graduate school. She was achieving her lifelong dream of becoming a computer scientist at the Massachusetts Institute of Technology when a colleague told her that computers are evil. Sweeney disagreed, but as she learned how data was being shared through technology in ways that put people’s privacy at risk, she realized that technology and society could, indeed, be at odds. So, she started investigating and experimenting, eventually writing her dissertation on data privacy protection. In 2001, she became the first African American woman to receive a PhD in computer science from MIT.
Continuing her graduate school focus on data privacy, Sweeney became a pioneer in the field, with her discoveries leading to changes in law and policy. For example, she conducted experiments that showed vulnerabilities in supposedly anonymized health data. She revealed that just a few pieces of publicly available information—date of birth, gender, and ZIP code—were enough to match 87% of the U.S. population to their health records. In response, HIPAA—the Health Insurance Portability and Accountability Act, a major federal health privacy law—was revised to better protect individuals’ privacy.
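The mechanics of that re-identification are simple enough to sketch in a few lines. The example below is purely illustrative (the file names, column names, and data are invented), but the core move, joining an “anonymized” dataset to a public one on shared quasi-identifiers, is the one her experiments relied on.

```python
# Illustrative linkage attack; all file and column names are hypothetical.
import pandas as pd

# "Anonymized" health records: names removed, quasi-identifiers kept.
health = pd.read_csv("hospital_discharges.csv")  # columns: dob, sex, zip, diagnosis

# A public voter roll: names alongside the same quasi-identifiers.
voters = pd.read_csv("voter_roll.csv")           # columns: name, dob, sex, zip

# Re-identification is just an inner join. Because date of birth, sex,
# and ZIP code together are unique for most Americans, each match
# attaches a name to a supposedly anonymous medical record.
matched = health.merge(voters, on=["dob", "sex", "zip"])
print(matched[["name", "diagnosis"]].head())
```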
Sweeney had another breakthrough when she discovered just how biased computer algorithms used by ad providers can be. She learned from a journalist that an ad for arrest records popped up when her name was searched online. Sweeney was alarmed. She had never been arrested. She decided to figure out what was going on. Her ensuing research, published in a 2013 paper, revealed racial discrimination built into algorithms. Ads for arrest records popped up more frequently when someone searched a stereotypically Black name rather than a stereotypically white one—“Latanya” rather than “Tanya,” for example—regardless of whether the person searched for really had an arrest record. “I showed how Google’s ad network was delivering ads that were actually in violation of the Civil Rights Act,” Sweeney says.
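The statistical heart of an experiment like that one is also easy to sketch. In the hypothetical example below, the counts are placeholders rather than her published data: searches on names from two groups are tallied by whether an arrest-record ad appeared, and a chi-squared test asks whether the gap could plausibly be chance.

```python
# Placeholder counts, not Sweeney's published results.
from scipy.stats import chi2_contingency

observed = [
    [60, 40],  # Black-identifying names: [ad shown, ad not shown] (hypothetical)
    [25, 75],  # white-identifying names: [ad shown, ad not shown] (hypothetical)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4g}")
# A small p-value means the delivery gap between the two name groups
# is unlikely to be explained by random variation alone.
```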
A Laboratory at a Policy School
Although Sweeney has made significant contributions to the public good in her research on technology, a large part of her work as an academic has been teaching students and building networks of scholars.
While she was a professor at Carnegie Mellon University, Sweeney started the Data Privacy Lab (which is now part of the Public Interest Technology Lab) and brought it with her to Harvard in 2011 when she joined the University’s Department of Government. After a leave to serve as chief technologist at the U.S. Federal Trade Commission, she returned to Harvard and eventually took a joint position at the Kennedy School. (Since 2016, she has also been a faculty dean at Currier House, where she lives with her family among 300 Harvard undergraduates and enjoys mentoring and interacting with them.)
In 2021, Sweeney launched the Public Interest Technology Lab with three primary goals: developing public interest technologies, conducting research, and connecting students and scholars to share knowledge across universities. She has drawn faculty affiliates to the lab from the Kennedy School and across Harvard.
The idea, she says, “was to take all these experiments that have worked”—the tinkering and exploring that have led Sweeney to discover privacy vulnerabilities and algorithmic bias, for example—“and put them on steroids.” Although the Tech Lab is fairly new, it traces the arc of Sweeney’s evolving interests in an evolving field and aims to address the growing conflicts between technology and society.
To deal with these increasingly frequent clashes, Sweeney realized, she needed to reach more people. So, she designed the lab as a hub for students and scholars to learn, experiment, and connect—to train others in order to extend the work far beyond Harvard. Sweeney, who holds faculty appointments at both the Kennedy School and the Faculty of Arts and Sciences, teaches what she calls “save the world” classes that embody the ethos of the lab.
“I teach students how to spot unforeseen consequences and how to do these simple experiments,” she says. She emphasizes the simplicity: it is “not rocket science.” But these modest investigations into technology can make a difference and unearth threats to privacy, to fairness, and to democracy. “The issue is: what do we do about these clashes and how do we make the public aware?” As her own work has revealed, asking the right questions and investigating can change laws and business practices at large tech companies.
Sweeney “is going to save the world from technology run amok,” says Joan Donovan, research director of the Shorenstein Center and an adjunct lecturer affiliated with the Tech Lab. “She not only teaches ethics and values in tech but illustrates for students how to build technology that has social significance beyond monetization.” Donovan, who is an expert in media manipulation and disinformation campaigns, explains, “I have been able to sit in on several of her classes, where the students debate the real stakes of technological design, not just from a critical perspective. They have to prototype how to make things work without ignoring the inconvenient truth. Technology is both process and product, helpful and harmful, and most of all, intentional design can mitigate risks.”
Kathy Pham, a senior fellow at the Shorenstein Center and an affiliate of the Public Interest Technology Lab, also emphasizes how uniquely placed Sweeney is to lead the lab, with her background as a computer scientist and former chief technologist at the Federal Trade Commission. “When I think of Dr. Sweeney, I think of the embodiment of public interest tech,” Pham says. As a computer scientist herself—who was named deputy chief technologist of the Federal Trade Commission in 2021—Pham is inspired by Sweeney, who, she says, has been a pioneer in the field for many years, “rethinking how technology affects democracy.” Pham explains, “She has done all these roles. She brings a depth of experience that is unique.”
Drawing on her deep and varied background and knowledge, Sweeney has positioned the lab as a hub to connect with others doing public interest technology projects. For example, it is part of the Public Interest Technology University Network, which consists of roughly 50 universities where students and faculty members do this type of work. The lab also sponsors the Technology Science Research Network, made up of scholars from across the country who study technology-society issues, host events, train students, and publish research. The network’s scholars contribute to the journal Technology Science, of which Sweeney is the founding editor-in-chief.
Through the Public Interest Technology Lab, students, staff, and faculty have developed a range of tools and platforms that solve public problems—from enhancing voting security to helping people without internet access schedule vaccinations to understanding how large social media companies shape conversations.
Credit Monitoring for Elections
One tool that Sweeney is excited about is VoteFlare. The idea emerged in 2016 in one of her “save the world” classes when students asked how they could empower voters across the political spectrum. Sweeney, along with Harvard PhD student Jinyan Zang and Data Privacy Lab researcher Ji Su Yoo, discovered how easy it was for voters’ registration information to be changed by fraudsters, potentially disrupting elections. In a 2017 paper published in Technology Science, they shared their findings: In 2016, websites for 35 states and Washington, D.C., were vulnerable to voter identity theft, whereby imposters could change voter registration information. All an imposter needed was a few key pieces of information, most of which are publicly searchable or obtainable through data brokers or darknet markets.
Sweeney and her students wondered how they could fix that problem. The answer they developed was VoteFlare, a tool that Sweeney compares to credit monitoring for voting information. People who sign up for the VoteFlare app are notified by a “flare” in the form of a text, a phone call, or an email if their information is changed online, so they can take corrective action if warranted.
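VoteFlare’s code is not public, but the monitoring loop the description implies is conceptually simple. The sketch below is a hypothetical reconstruction: check_registration() stands in for however a given state exposes voter lookups, and the notify callback stands in for the text, call, or email flare.

```python
# Hypothetical sketch of "credit monitoring for voting information."
# check_registration() and notify are stand-ins, not a real VoteFlare API.
import hashlib
import time

def check_registration(voter_id: str) -> str:
    """Stand-in for querying a state's online voter-lookup service."""
    raise NotImplementedError("each state exposes registration data differently")

def monitor(voter_id: str, notify, interval_s: int = 86_400) -> None:
    """Poll the registration record once a day; send a 'flare' on any change."""
    last_digest = None
    while True:
        record = check_registration(voter_id)
        digest = hashlib.sha256(record.encode()).hexdigest()
        if last_digest is not None and digest != last_digest:
            notify(f"Voter registration for {voter_id} changed; verify it now.")
        last_digest = digest
        time.sleep(interval_s)
```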
VoteFlare was successfully piloted during the 2020 Georgia runoff election and has also been used in the 2022 Texas primary. The team plans to roll it out across the country before the general election, particularly in areas where voter suppression is a risk. Josh Visnaw, the project manager for VoteFlare, explains that the tool “allows individuals to fix issues before it is too late, ensuring that they’re able to participate in the democratic process.” He says, “We envision this technology complementing the good-faith efforts of election officials and civil society groups across the country, as we approach the 2022 midterm elections.”
Inside Facebook’s Decision-Making
Another project Sweeney is excited about is fbarchive.org, a new online archive of 20,000 internal Facebook documents that the whistleblower Frances Haugen, a former Facebook employee, released in 2021. Sweeney and her team received the documents, verified their authenticity, and are digitizing and curating them so that they will be available to researchers, journalists, and, to some extent, the public. For this project, the lab is collaborating with the leadership and core team of the Shorenstein Center.
Sweeney explains two basic challenges to creating the archive: one is privacy, and the other is usability. “Names and identities have to be scrubbed,” she says, because the documents—which are photographs of internal Facebook employee messages—include not only Facebook’s leaders but also ordinary employees, who might be harmed by exposure, and Facebook users, whose conversations are sometimes included in internal employee discussions.
As for usability, Sweeney says, “What Frances Haugen did was call up a document on a screen and she would take a picture with her mobile phone. So, you end up with 20,000 images from a phone. Sometimes she was not the same distance from the screen. The orientation’s different. The lighting’s different.” All those factors pose difficulties. But the usability issue goes beyond the challenges of working with inconsistent photographs. The researchers also need to decipher the content. “There’s a huge amount of inside-Facebook talk,” Sweeney says. “So, we had to unpack acronyms and so forth and put a glossary together.”
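The lab’s actual pipeline is not public, but the kind of cleanup the team describes can be sketched with standard open-source tools. Everything below is an assumption: Pillow for orientation and lighting fixes, pytesseract for text extraction, and an invented glossary entry standing in for the acronym expansions.

```python
# Assumed tooling: Pillow for image cleanup, pytesseract for OCR.
# The glossary entry is invented; the lab's real glossary is its own work.
from PIL import Image, ImageOps
import pytesseract

GLOSSARY = {"XYZ": "hypothetical internal team name"}  # placeholder entry

def extract_text(path: str) -> str:
    img = Image.open(path)
    img = ImageOps.exif_transpose(img)  # undo phone-camera rotation
    img = ImageOps.grayscale(img)       # flatten lighting differences
    img = ImageOps.autocontrast(img)    # normalize exposure across photos
    text = pytesseract.image_to_string(img)
    # Expand inside-Facebook acronyms so outside readers can follow.
    for acronym, meaning in GLOSSARY.items():
        text = text.replace(acronym, f"{acronym} ({meaning})")
    return text
```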
The fbarchive.org platform, Sweeney says, will provide insight not only into what goes on at Facebook but also into the power and sway of the company’s decisions. Using a machine learning algorithm, her team is organizing the documents into clusters of topics to make them easier to explore and curate. “The documents are fascinating,” Sweeney says. “They cover almost every contemporary issue in society around the world.” No matter the topic, “somehow Facebook is engaged in it, involved in it, magnifying it, causing it.” She notes, too, that the documents can reveal Facebook’s behind-the-scenes policy decisions: “What are the knobs they could use to change things, and do they tend to use them or not?”
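The article does not specify which algorithm the team uses, so the sketch below is one plausible stand-in: TF-IDF features with k-means from scikit-learn, run on a tiny placeholder corpus in place of the OCR’d archive text.

```python
# Hypothetical topic clustering; the corpus is a placeholder standing in
# for the extracted text of the archive documents.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "feed ranking experiment boosted engagement metrics",
    "integrity team flagged harmful content for moderation",
    "new ranking model increased engagement in the feed",
    "moderation policy enforcement for harmful content",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)  # one topic-cluster label per document

# Name each cluster by its highest-weight terms.
terms = vectorizer.get_feature_names_out()
for i, center in enumerate(kmeans.cluster_centers_):
    top = center.argsort()[-3:][::-1]
    print(f"cluster {i}:", [terms[t] for t in top])
```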
A Dizzying Industrial Revolution
We are, Sweeney says, in the middle of a revolution. Unlike earlier technological revolutions—for example, the mass development and adoption of cars—digital tools are changing so rapidly that society and policy struggle to keep up. “Technology today doesn’t require slowing down,” she says. “It’s immediate. You have an idea: Let’s put up a website. People are mesmerized by the shiny new thing they can do.” The Public Interest Technology Lab, however, is there to respond and to look out for those unforeseen consequences of fast-moving technology.
“The number of challenges—the number of clashes—dwarfs what one person or small group could ever do in response,” she says. But Sweeney is hopeful that the Public Interest Technology Lab—through its experiments, teaching, and far-reaching networks with other scholars and universities—can help policy catch up and create avenues for technology to help, rather than harm, society.