MIT Technology Review published an interview with Gil Herrera, the new head of the NSA’s Research Directorate. There’s a lot of talk about quantum computing, monitoring 5G networks, and the problems of big data:
The math department, often in conjunction with the computer science department, helps tackle one of NSA’s most interesting problems: big data. Despite public reckoning over mass surveillance, NSA famously faces the challenge of collecting such extreme quantities of data that, on top of legal and ethical problems, it can be nearly impossible to sift through all of it to find everything of value. NSA views the kind of “vast access and collection” that it talks about internally as both an achievement and its own set of problems. The field of data science aims to solve them.
“Everyone thinks their data is the messiest in the world, and mine maybe is because it’s taken from people who don’t want us to have it, frankly,” said Herrera’s immediate predecessor at the NSA, the computer scientist Deborah Frincke, during a 2017 talk at Stanford. “The adversary does not speak clearly in English with nice statements into a mic and, if we can’t understand it, send us a clearer statement.”
Making sense of vast stores of unclear, often stolen data in hundreds of languages and even more technical formats remains one of the directorate’s enduring tasks.