January 2021. GrowthPolicy interviewed Bruce Schneier, Lecturer in Public Policy at Harvard Kennedy School and Fellow at the Berkman Klein Center for Internet & Society at Harvard University, on technology security, social media, and regulation.
Links: Faculty page | Personal website
Selected books: We Have Root (Wiley, 2019) | Click Here to Kill Everybody (W. W. Norton, 2018) | Data and Goliath (W. W. Norton, 2016)
GrowthPolicy. In a recent opinion piece, you write: “American democracy is an information system, in which the information isn’t bits and bytes but citizens’ beliefs. […] When you really need to worry is when insiders go bad. And that is precisely what is happening in the wake of the 2020 presidential election.” What advice would you offer policy makers seeking to safeguard future elections from disinformation campaigns undertaken by bad inside actors?
Bruce Schneier: We need to break up the tech monopolies. Companies like Amazon, Facebook, and Google wield enormous power in the market, and by extension in politics. Decentralization brings security, and the world would be much safer if there were twenty smaller Amazons and Facebooks and Googles than one of each. So we need both smaller companies and the ability to move, delete, combine, and reuse data from a variety of companies. Enforcing existing antitrust laws will make an enormous difference in how these companies affect society. And in areas where decentralization doesn’t make sense—when we have natural monopolies—we need to treat them like the utilities they are.
We also need to regulate social media. Traditionally, there has been a differentiation between carriers like the phone company—who are not responsible in any way for what people say on their telephone calls—and media companies like newspapers, who curate and are responsible for their content. Social media companies like Facebook do both, and like to play both ends, depending on where they get a regulatory advantage. They claim not to be responsible for what anyone posts, and also claim the ability to curate everyone’s newsfeed. I would like companies to have to choose a business model, and have regulations flow from that.
GrowthPolicy. In your book Click Here to Kill Everybody, you propose the creation of a U.S. federal agency to oversee information security and privacy laws for the entire country. But, as you point out, there is an essential impasse, because Silicon Valley will always be regulation averse, especially given the profits to be harvested from “surveillance capitalism.” As recent congressional hearings with technology companies reveal, politicians often seem either confused about or unable to use these technologies. In such an environment, how should policy makers design a federal agency whose exclusive focus is information security?
Bruce Schneier: I’m not sure information security is the correct focus. We need a federal agency to oversee data, algorithms, AI, and robotics. Understanding these technologies for the purpose of governance is a major undertaking, and we need to house that understanding inside a single government agency. At the same time, computing is essential to many technologies that have their own regulatory structure—cars, airplanes, medical devices—and that field-specific regulatory expertise needs to remain in those agencies. To start, we should establish a federal agency that can coordinate between those other agencies. Cars are not the same things as pacemakers, but they use some of the same computer hardware and software. So there will need to be some commonality in regulation.
GrowthPolicy. Your book Beyond Fear was the first to introduce the concept of “security theater.” In your opinion, why do we still see ongoing hacks and breaches, both of websites and of high-profile individuals, even after three decades of the World Wide Web’s existence? To apply your term “security theater,” are these instances of entirely performative security measures with no real efficacy? In other words, what should policy makers learn from the term “security theater”?
Bruce Schneier: Security is both a feeling and a reality, and they’re different. You can feel secure even though you’re not, and you can be secure even though you don’t feel it. There is value in each: people need to feel safe, and they need to actually be safe. This means we need to pay attention to both. The immediate example is the COVID-19 vaccine. It’s not enough for it to be safe. People also need to believe that it’s safe.
A completely separate question is why we’re still seeing hacks and breaches. There the answer is that we’re simply not taking security seriously. It’s a traditional market failure: the market incents companies to save money by skimping on security in the hopes they’ll not have a problem. And like any other market failure, government needs to step in with standards and regulations. It’s how we finally got things like safe pharmaceuticals and fireproof pajamas.
GrowthPolicy: You are one of the world’s top technology theorists, with expertise in both practice and academia. What is one future technology-related issue you see on the horizon that worries you and keeps you up at night—perhaps one related to privacy, security, trust, or “digital feudalism” by a small cohort of tech companies—to which you believe policy makers should be paying greater attention before it is too late?
Bruce Schneier: I think artificial intelligence and robotics are going to change society in many ways that we can’t even begin to predict. From my perspective in security, I worry about the effects of hacking AI and robot systems. I worry about AIs hacking conventional systems. And I worry about AIs as hackers, inventing new ways to attack systems. All of these will change how security works in new and unpredictable ways. I can’t even tell you if near-term AI technologies will benefit attackers or defenders more.
This is what my current book project focuses on, and it’s not an easy one to write.
GrowthPolicy: Where will the jobs of the future come from?
Bruce Schneier: The jobs of the future will come from the technologies of the future. I don’t mean to be trite. Technology is changing society in ways we can’t predict, and future jobs will come from that unpredictable space.
GrowthPolicy: What should we do about income inequality?
Bruce Schneier: Fix it. This is actually a security problem. In society, there has been a tacit agreement between the rich and the poor. The rich don’t make the poor’s lives too miserable, and the poor don’t revolt and overthrow society. We are approaching inequality levels that threaten this agreement. The hard question is how, and that’s so outside my expertise that I hesitate to even comment.