Hacking Defense and Diplomacy in Silicon Valley
Scott Hartley is a venture capitalist and term member at the Council on Foreign Relations. His new book, The Fuzzy and the Techie (Houghton Mifflin), comes out in April. You can follow him at @scottehartley.
With technology empowering non-traditional actors who create asymmetric threats, there are real questions as to how the Department of Defense (DOD) can best ensure security and preserve privacy. For many years, DOD has worked to attract top technologists from Silicon Valley. I was part of one White House initiative called the Presidential Innovation Fellows, where technologists and venture capitalists go to Washington for six- to twelve-month tours of duty, working with chief technology officers at various agencies.
But in 2015, then-Secretary of Defense Ashton Carter flipped the model on its head and moved part of the defense community out of Northern Virginia, three thousand miles west to Silicon Valley. He helped create the Defense Innovation Unit Experimental (DIUx).
In Silicon Valley, venture capitalist Marc Andreessen has for many years talked about how “software is eating the world.” He argues that technology has come to affect every corner of our world. In moving elements of DOD west and embracing the technology community in its own environment, DIUx has been a catalyst in bringing some of the most profound, intractable problems to a valley full of people looking to chase big ideas.
One of the most promising results of this westward push has been the establishment of a class called Hacking 4 Defense (H4D) at Stanford University, brought to life by two U.S. Army colonels, Joe Felter and Pete Newell, who help translate the experiences and needs of the defense community for Silicon Valley. They also recruited serial entrepreneur and professor Steve Blank to develop the H4D course, and today the trio teaches an entrepreneurship class aimed at solving some of DOD's biggest problems.
In H4D classes, active military units and intelligence officers submit problems they face to students, who launch minimum viable products and iterate with those units to refine them, an approach known as the "lean startup." As part of this process, students must develop an authentic understanding of the challenges military and intelligence personnel face. They visit training camps, wear scuba dry suits to understand the needs of Navy SEALs, and go on base with the Air Force to learn the bomb-suit requirements of explosive ordnance disposal units. The class is open to students from all backgrounds and departments, and features political scientists and engineers working side by side.
H4D helps bring together what at Stanford are known as "fuzzies" and "techies," namely those who study the arts, humanities, and social sciences, and those who study computer science or engineering, respectively. The pairing creates a deeply collaborative environment, joining the context around the problems to be solved with the code to solve them.
One challenge, posted by the U.S. Army Cyber Command, was to "determine how to use emerging data mining, machine learning, and data science capabilities to understand, disrupt, and counter our adversaries' use of social media." The description highlighted that "current tools do not provide users with a way to understand the meaning within adversary social media content . . . Current tools and methods monitor social media streams and can provide quantitative measures (volume, relevance, search). These tools fall short by failing to capture the content from the sites where most of the actual meaning is transmitted." In other words, the U.S. Army Cyber Command sought a hybrid solution, one part human and one part technical, much like the composition of the class itself. In a world of compartmentalized expertise, the Army recognized the value of a solution that was not technology-only, but rather required both social science and engineering. The solution had to effectively degrade adversarial social media use, but do so within the guidelines of U.S. law and policy.
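To make the gap concrete, here is a minimal sketch of what such a hybrid analysis might look like. Everything in it is invented for illustration: the sample posts, the theme lexicon, and the function names are hypothetical, and a real system would replace the keyword lists with models informed by regional and linguistic expertise. The point is the pairing: a quantitative layer of the kind existing tools already provide, plus a content layer that tries to read what the posts are actually about.

```python
from collections import Counter
import re

# Hypothetical sample posts, invented purely for illustration.
POSTS = [
    "join the cause, brothers, the time is now",
    "new video from the front, share widely",
    "logistics update: supplies moving tonight",
    "join us, the cause needs you",
]

def volume_metrics(posts):
    """Quantitative layer: the stream statistics existing tools measure."""
    tokens = [t for p in posts for t in re.findall(r"[a-z']+", p.lower())]
    return {
        "post_count": len(posts),
        "top_terms": Counter(tokens).most_common(3),
    }

# Content layer: a toy theme lexicon a social scientist might help build.
# These themes and keywords are illustrative assumptions, not a real taxonomy.
THEMES = {
    "recruitment": {"join", "cause", "brothers"},
    "propaganda": {"video", "share"},
    "operations": {"logistics", "supplies", "moving"},
}

def theme_counts(posts):
    """Count how many posts touch each (hypothetical) narrative theme."""
    counts = Counter()
    for p in posts:
        words = set(re.findall(r"[a-z']+", p.lower()))
        for theme, lexicon in THEMES.items():
            if words & lexicon:
                counts[theme] += 1
    return dict(counts)
```

The quantitative layer alone would report only volume and frequent terms; the theme layer, however crude, is where the "fuzzy" contribution enters, since deciding what counts as recruitment language is a social-science question before it is an engineering one.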
In 2016, Blank and Felter also helped create “Hacking 4 Diplomacy,” a course aimed at building technology solutions for the State Department. The class offers students opportunities, for example, to work on building tools to combat violent extremists such as the self-declared Islamic State group. Solutions will again likely combine the power of machine-learning algorithms with data pulled from a number of online sources, and with the fuzzy understanding of the culture and psychology of adversaries.
For understandable reasons, these courses have garnered attention. In the fall of 2016, Stanford hosted 75 attendees at an educators' course focused on training the trainers. This year, 13 other universities, including Georgia Tech, the University of Pittsburgh, the University of Southern California, and Georgetown, will begin teaching their own H4D courses, with additional problem sets provided by the diplomatic, defense, and intelligence communities.
Rather than lionize technology as a standalone solution, policymakers ought to recognize the value of context that can be provided by our men and women in uniform, and pair their understanding of the toughest national security challenges we face with some of the nation's brightest minds in technology. This pairing of experts across domains yields results.