Women and Women's Rights
Gender Bias Inside the Digital Revolution: Human Rights and Women's Rights Online
Podcast
As the world moves deeper into the digital age, international organizations, businesses, and governments grapple with questions about how technology challenges and reinforces biases. Dr. Safiya Noble, assistant professor at the University of Southern California Annenberg School of Communication and author of Algorithms of Oppression: How Search Engines Reinforce Racism, discussed her work to define digital technology as a human rights issue in the context of the United Nations' investigative report on internet access, as well as her work on algorithmic bias.

POWELL: So let's go ahead and get started. Folks are welcome to bring your coffee and food in here. I want to say welcome. And—if folks can get settled. Welcome and thanks to both old and new friends. I know there are a number of advisory committee members around the table and supporters in various ways. So it's always a thrill to see all of you. It is with great pleasure that I have the opportunity to introduce Safiya Noble who, as you can see from the bio, is a professor at both the University of Southern California Annenberg School of Communication and UCLA. And she told me she just got tenure yesterday at UCLA, so—(cheers, applause)—we have to—if we had champagne—I wish we had champagne to toast her. But equally exciting is that she has also just accepted a job at Oxford University, where she's going to be joining their faculty this summer. So this is a woman on the move.

NOBLE: It's an embarrassment of riches. That's all I can say. (Laughter.)

POWELL: Really incredible. And she was just at Harvard yesterday, speaking with them. So we're so lucky to have captured her for this brief moment. I've become really fascinated with women in technology, and I've produced a CFR report on women in tech. And everyone, when I started down this path, kept telling me about this book, Algorithms of Oppression.
And so I knew right away that this was a very important voice and that we had to have her at this table. When I worked in the Obama administration, Syria and Egypt shut down the internet when folks were out in Tahrir Square demanding democracy in those countries. At the same time, protesters were using social media as tools for demanding democracy and human rights. And so technology comes as a double-edged sword. As we move into this digital age, it can be both freedom-enhancing and freedom-limiting. And it's that quandary that really brings me to this table and makes me want to learn more. So with that, I'm going to turn it over to Professor Noble. She'll speak for about ten minutes to lay out some opening remarks, and then we'll open it up, because we have a lot of experts and smart people around the table, so that we can have a more informal discussion.

NOBLE: Thank you very much. I really have never had the opportunity to speak to such an impressive and slightly intimidating crowd of people. So thank you for the opportunity. Thank you, Professor Powell, for the invitation. I thought maybe I would talk a little bit about the book, because I'm sure not everyone—I mean, most of you probably just heard of it a couple of minutes ago. So I could give you some of the highlights and what led me to this research. But let me just say that, in the spirit of the type of work that you all do, what I'm most interested in in this work is trying to surface not just the affordances of the web and digital technologies, but also some of the consequences for people who are most vulnerable—people who may not be able to use the internet in some type of liberatory fashion, but in fact may find themselves the victims of the kinds of, you know, dangerous and misinformed processes that happen on the web.
So I bring this in the spirit of thinking about what we all might be able to do in terms of policy or other types of interventions. So let me set the stage. The book really was the genesis of a project I was doing in graduate school. When I came to graduate school to get a Ph.D., just a few years ago, I had spent fifteen years in marketing and advertising. At the time I was leaving, I had been working with very large brands here in the United States—brands who were interested in, quite frankly, gaming search engines and getting on the first page, and who were spending a lot of money with the ad agencies that I worked with. I was surprised to enter graduate school at the University of Illinois and hear so many people talking about Google like it was the new public library, or some type of new trusted information frontier, at a moment when I had just left industry and knew that we were spending a significant amount of money to use it more like a public relations engine, let's say. So that piqued my interest—that the academy was in one place, so to speak, around what was happening. And of course, this wasn't just Google. This was Yahoo and others. I mean, some of us have been on the web—probably most of us in this room—since long before the search engine came around. And so I was thinking about this dynamic between what was happening in industry and what was happening in the academy, but then also what the public was doing with search engines. Pew was releasing great research about search engine use. And what they were finding was that more than seventy percent of the public was relying upon search engines as a trusted, verified, credible resource. They have two very important studies on search engine use that you can find as part of the Pew Internet & American Life series of work that they do. So this led me to start to coordinate and develop a study on a variety of different identities.
I was thinking about how people and communities get represented in something like a search engine. At that time, I was living in the Midwest, in a not incredibly diverse town, and thinking about the ways that people who might have very limited contact with people different from them might in fact be highly reliant upon something they trusted to provide them credible, vetted information about other people. That first study really looked at a variety of racialized and gendered identities, starting with black girls. My niece is here. She's a graduate student at NYU. And she was one of the people I was thinking about when doing this search on black girls. At the time, she was, you know, young, a tween. And I was stunned to find that the first page in all the major search engines brought back pornography as the primary representation of black girls. And so, you know, we started to expand. And when I saw that—and I won't. This isn't the place. Normally I would put a slide up and you would see the sites. But I just feel embarrassed, quite frankly, to say the names of these porn sites. But they're really pornified, if you will. (Laughter.) And they're really sugary, and other kinds of adjectives that start with P. So this led me to looking at Latina girls, Asian girls—a lot of different ethnic minority identities—and seeing that overwhelmingly the representations of these girls were pornography. It opened up a bigger series of questions, which is: What does it mean when we have a larger, broader public discourse, particularly coming out of Silicon Valley, that the internet simply reflects back what the majority of people are doing? However, when you looked at these kinds of searches, you didn't have to add the word "porn." You didn't have to add the word "sex." These girls of color in the United States were synonymous with pornography.
And of course, they weren't even girls. These were women who were represented on these sites. So at a fundamental, you know, Sexism 101 level, women were coded as girls. And I started writing and speaking about that. And you know, I will say that I've been in what feels like a quiet, unspoken relationship with Google in particular, who I talk about quite a bit, because over the years they have adjusted some of the results. And I've seen their response to critics. And I think there are some positive steps happening in that direction. But more systematically, when you look at misrepresentative information, this is a very dynamic and difficult set of issues to get our arms around. And I think it's the kind of thing we need to be thinking about, particularly when people are a numerical minority in the community, in the country. They would never have the capital to SEO their way out of it—by aligning different content with the keywords, given the incredible amount of money that takes in these advertising-driven platforms. But they also would never have the majority in terms of the kinds of people who are searching on these identities. So that became a bit of the foundation for the book. But it was really the opening into thinking about what it means for us to have this kind of tension between private corporations being positioned as public goods, as public information resources. And this is a really big tension. Of course, I come out of the field of library and information science. And so I'm thinking about the role that librarians and other information professionals have played—teachers, professors, and so forth—in helping us make sense of knowledge and knowledge management in the world.
And yet, I see this paradigm shift now to where—and of course, all of us in the room who teach students know that students now say they could never write a research paper, for example, without a Google search or some other search engine. And it doesn't even occur to them to use the library, except as a study space. (Laughter.) But that's a different talk for another time. (Laughter.) So I started thinking about, again, what are these tensions between advertising platforms and other kinds of public-interest information resources, and how might we shift the conversation? And this is really what the book is about. It's not really about trying to do away with the current set of companies. Although, I will say that I think the more diversity we have in information platforms, the better that is for democracy. And I'm not the first person to say that. Many people know that the more journalism we have, the more media outlets we have, the more schools and universities and other such public institutions we have, the more we bolster democracy. And I think this is important when we think about one monopoly leader being trusted and, in many ways, displacing other kinds of public institutions, like public libraries or public-interest media. So this leads us, of course, to the inevitable conversations that I think start to creep into the fore, which is: what is this space of information, and who are the people involved in it? And, you know, I feel very fortunate to have my colleague Sarah Roberts here, who I was lucky enough to convince to come to New York with me from Harvard yesterday. She is a world expert on commercial content moderators.
These are people—now we're talking about more than one hundred thousand people around the world, in global sites—who are deciding, for example: is the napalm girl, and these photos of war, child exploitation material, or is it documentation and evidence of war and human rights abuse? These are the kinds of tensions that we now start to see, and that Sarah has so well documented in her work and in her forthcoming book. So I think that, you know, what we know is that the dominant discourse has been that large digital media platforms are simply conduits—free-speech zones where anything goes. Except that those of us who study these platforms know there's actually a global labor force involved in deciding whether things come down or whether things stay up. We know that there are a host of programmers and other managers and executives who make policy and decisions about content. And those are values-based. But there's no transparency about what those values are, in many cases. And these are the kinds of conversations that scholars like me are trying to reveal with our work. More so to shift the conversation and say: Well, if there are values at play that we know are present, why aren't they transparent, and what's at stake when they're not? And this is where I find we start to move into the realm of concerns about civil and human rights, and what the role of industry is in relationship to either fomenting something as basic as, should pornography be the dominant representative information about children of color, or girls of color—or should platforms be mindful of, or even regulated around, their participation or the way that they're used in ethnic cleansing projects, like the Rohingya in Myanmar, for example.
Or, you know, the disinformation campaigns, for example, around new civil rights movements, whether it be the Black Lives Matter movement. We may look back in twenty or thirty years at the campaigns of propaganda and disinformation that have moved through these spaces to suppress civil rights movements in the way that we look back on the civil rights movement, quite frankly. You know, at the time that the civil rights movement was underway, it was a vilified movement. Everybody marched with King, apparently, in my family, but we know that's actually not true. (Laughter.) So we have a retrospective about, you know, our deep commitments. But at the time, we had very similar kinds of campaigns of propaganda and disinformation against, you know, empowering and giving rights to African-Americans, as there were against indigenous people, Latinos, and even women. So I think this moves us into the space of thinking about what's happening at a global scale. We know, for example, that the EU has been much more aggressive in terms of policy and in thinking about what their conceptions of values are around speech and, we might call it, content. But that's a very generic word for a lot of very different kinds of information that might be moving through a platform. And, you know, I think that some of our conceptions in the U.S. are also often exported to other parts of the world, as if they're acceptable.
And, you know, I've often found it interesting to see social media and other types of digital media platforms argue that they're not responsible for the content that moves through their platforms—except that Sarah's work reveals there are hundreds of thousands of people, in many cases working for subcontractors or as direct employees of these companies, who are curating and pulling down content. Or that hate speech can't be managed or even acknowledged in the United States because it's free speech, and platforms might argue there's nothing they can do about it—except in Germany and France, where it's illegal to traffic in antisemitism, for example, and that content does come down. So this is why it's important for us to understand the larger international scope of concerns. And, you know, I also think, in talking with people in industry still to this day, that the incredible amount of financial and legal risk these companies are facing—in trying to think through how content moves through their platforms, and where the line of demarcation is in terms of their responsibility or culpability—is on their minds. And so we might also be able, in the public interest, to play a role in conversation and dialogue with companies about how to solve some of these problems. And I certainly have been available for that. So those are the broad strokes of things that I think will maybe get us at least to a provocation and, you know, a set of questions and conversations. You know, I've been talking about the harms. And of course, I'm not just an information science scholar. I'm also a communications scholar. And I understand, for example, the incredible harm that comes to the public when we ingest and have contact with the stereotypical, racist, and sexist kinds of content that circulate in our societies.
So, you know, it's not of little consequence to have this kind of content and disinformation moving around. And, you know, when I was talking about this years ago, when I first started this work in 2010, people felt very sympathetic and sad for the black girls, and for the girls of color. But then, when the same kinds of mechanisms threw a presidential campaign in 2016, everybody started paying attention. So, you know, I feel that it cost us quite a bit, quite frankly, in the political arena to have to see some of these mechanisms unfold. But I think it's a really important moment now. And I don't think we can leave these conversations to hoping that other people are having them. So I'm really grateful that you all are open to exploring these ideas, at least at this roundtable. And maybe I'll leave it there.

POWELL: Thank you so much. So I'm going to open it up and just ask you to put your cards sideways. I have questions of my own, but I want to first go to folks around the table, and then I'll jump in if there's time. Let's start with Mark Shulman.

Q: Oh, thanks, Catherine. What a fascinating talk. I was unfamiliar with much of what you were saying, and I'm really grateful to you for revealing all this to me. In particular, you talked about the lack of transparency in the decisions made by the people running the search engines—that those non-transparent decisions reflect a lot of values, and that people are not taking responsibility for how their values play out to silence, characterize, or mischaracterize people. But I don't know, assuming they ought to be more transparent, what sort of responsibility ought there to be for deciding how to characterize and prioritize content? I couldn't tell if you want the German model, where there are criminal sanctions. Of course, there are some First Amendment issues with that in this country. Do you want civil liability to attach to slander or libel?
Or what sort of locus is appropriate for deciding on the algorithms and the responsibility for the aligning of content? And then, who decides? Because I look at the FCC today, or the federal government today, and I think that if we charged them with the responsibility for deciding what's hateful speech, you might find a lot of transgender kids being shut out of the internet.

NOBLE: Yeah. It's a great question. And I don't think it has a simple answer. So let me just say that first. I mean, there are a few dimensions in terms of the consequences here, and data profiling in particular is another large dimension of what happens in the search engine: all the past things we've done, the way in which our information and our participation in digital systems is brokered and sold across a whole host of other companies. I mean, it was rather amazing to me when the Cambridge Analytica story broke and everyone was shocked. And I was like, that's the business model of the internet. (Laughs.) It is buying and trading on our data, and the intense datafication of our lives. So I know for sure we are already seeing evidence of things like technological redlining, which is something I talk about in the book—this idea that certain opportunities may be foreclosed, and higher prices might be attached to certain kinds of people, right? Maybe not-so-good insurance rates, or higher premiums on a mortgage, or maybe not even being able to get a mortgage because of, again, new experiments that are happening around social credit and so forth. And these are being experimented with in other parts of the world. But I've also seen a lot of reports of people asking what their data profile is, and how they might ever be able to intervene upon it. And I think this is where search and social media companies play a huge role, because they are very large aggregators of online participation and of what people are doing.
So I think we don't have legislation that allows people, in the United States at least, to think about controlling their data profiles. We don't have legislation like the right to be forgotten, for example, in the U.S. And I think that the degree to which people can come to know how their data profile is created, and also be able to intervene upon it, is important. I once heard the director of the CIA say at a meeting that, as far as the state was concerned, people are their data profile. And I would argue a lot of people don't know what their data profile is. So I think that, of course, has all kinds of other implications around, you know, the freedom to read anything one might want to read in the world, right? The freedom to explore and know things. I mean, if I'm trying to read up on al-Qaida, does that mean I'm part of al-Qaida, right? So these are complex issues that I don't think we have an appropriate framework yet to make sense of. Certainly, the imperative for industry is profit-making. And so I guess the question is, you know, profits up to what point? Certainly every industry is interested in those imperatives and, quite frankly, is held to account for them. We have a paradigm in some industries about consumer harm that we don't necessarily have in the tech industry. So, for example, we would never allow pharmaceutical companies to make drugs and just pass them out on the street to see what happens—like, I hope it works out, right? (Laughter.) I mean—OK. It's inconceivable on a certain level. I know, now I'm—yeah. Don't put that on the internet. I don't know. Automotive industries. You know, other industries where we can see: well, what is the potential impact of consumer harm? And of course, there are great scholars doing work around the harm.
Virginia Eubanks has a new book out called Automating Inequality, where she looks at things like people losing their children in the foster care system because of erroneous databases, and the removal of those kinds of decisions from human beings, like social workers, into automated systems. And there are just multiple examples. So I think we really don't have the conversations in place that need to be had about who is liable. You know, and as the state and government become deeply invested in these systems too, it will be harder and harder to extract or to do the sense-making. You know, one of the things I often repeat: Cathy O'Neil wrote this great book called Weapons of Math Destruction. And one of the things she said in it is, you know, you can't take an algorithm to court. And this is part of, I think, the tension we're in right now: many of these tech media platforms really argue that they are simply the pipes, and that they are not responsible for the content that moves through them. And so this could be argued—some of us argue about where the culpability lies. And I don't think that's all been entirely sorted out. So I guess, you know, those would be the broad strokes. I mean, certainly I think we need—I can't even say "much more aggressive." We just need to be thinking in a more serious way about policy. I mean, when I watched the congressional hearings when Mark Zuckerberg was brought in, you know, to account for what was happening on Facebook, the answer was: AI will fix it eventually. I mean, for people like us, it's like—we know AI is still trying to figure out, is this table a table? (Laughter.) Are you kidding? Is a cat a cat? That's what's going on in the state of the art of AI right now, despite what anyone will mislead you to believe. So I don't think these problems will be solved by AI.
I think they also get solved, again, in the realm of these content moderators and the people who make the policy. You know, Sarah wrote a chapter in a book that I published a couple of years ago. And one thing one of her informants said—I should let you speak to this—but, you know, he said: It was interesting to me. I worked for a large, you know, mega-tech company. And I didn't understand how things like drug-related murders in Juárez, Mexico, had to come down from the platform, but beheadings in Baghdad got to stay up. It seemed weirdly like there was a relationship with U.S. foreign policy, but I couldn't really say, because I'm too low-level a worker to know. But it just seemed like violence in some parts of the world was a go and in other parts was a no-go. And then he would say, you know: I would ask my managers, why was it that we would take down animal mutilation videos, but blackface was in? Who decided? Upon whose values was it OK for that—this gross, stereotypical kind of performance—to be in, and other things to be out? So these decisions are being made all the time. And think about the volume—I mean, YouTube alone. The vice president for YouTube was on "Good Morning America" a few months ago. And he gave a recent statistic: four hundred hours of content per minute are being uploaded just to YouTube's platform, twenty-four by seven. So I think that, again, when you think about the type of people who have to flag that, or have to do sense-making of that, what we know is that that's a volume that can't actually even be tended to. So content stays up until the public, or others, flag it to come down.
And I think, again, we have a lot of work to do in sorting out these lines of demarcation around responsibility.

POWELL: So I have five people in the queue. And I'm glad I've deferred my question. I have, like, a million questions at this point. But let me go to my neighbor here.

Q: Hi. Nan Keohane, Princeton University. It's a fascinating presentation. I realize that a lot of other people want to ask questions, so I'll ask a question and ask you to give a brief answer to help us think about it further. My question is about path dependency and how these things get begun. So you said, you know, when people found pornography when they looked for black girls, it's because that's what they wanted to find? You sort of implied that. Does that mean that in an early period, when only a small number of people were using this search engine, a few people somehow indicated that they wanted to find pornography, and then that became the norm? I know algorithms are part of the answer, but can you tell me conceptually how that happens?

NOBLE: Yes. It's capital. The porn industry has more money than everybody.

Q: So it's done through advertising, or?

NOBLE: Well, Google search is an advertising platform. That's what it is. And not just Google—I mean, Facebook is also optimizing content in relationship to advertising.

Q: So it's not as though it was the result of mass preferences that led to that. It's that that's what was provided by the capital-supported documents?

NOBLE: The way I see Google characterize what happens in its search, when it reports out, is that there are over two hundred factors that go into the decision-making metrics they use about what to prioritize and what to sort out. You know, they don't even index the entire web. Last I saw, they index about forty-five percent of the web—it might be more than that.
So those who can pay to optimize, and pay more to have their content connected to the keywords—"black girls," "Latina girls," "Filipina girls," and so forth—can outspend me or anyone else interested in those keywords. And, you know, the way AdWords works for Google, it's a twenty-four by seven live auction. And people just bid to optimize their content. So industries that have a lot of money always end up on the first page. And, of course, geolocation is important. You're not necessarily going to find content from Bangladesh; you're going to get U.S.-based content. So, you know, geography matters. Also what other people have been searching, because Google is trying to shortcut and optimize in those ways. So there's a confluence of factors.

Q: Thank you.

POWELL: And, by the way, I meant to mention—you mentioned Cathy O'Neil, the author of Weapons of Math Destruction. She came to speak here this past spring, which was fantastic.

NOBLE: Great. Great.

POWELL: Let's go to Michael Walsh. Do you still—you took your card down.

Q: Oh, no. A lot of it was answered. But, having started living in a conformed culture at the age of twenty, I'm just curious: does what you're describing in terms of black girls, or Latina girls, or whatever, also happen in other cultures? And if so, what is being done to balance out your research, or how do we find new ways of helping to open that up so that it becomes a global issue, and not just your own very specific research and technology? Because this is all brand new. And given what we saw from the past election, I'm just curious: how do we educate the public to participate in this research that you say has to be done?

NOBLE: Yeah. I mean, most of my work is based in the U.S., for sure. So that's the site I'm most familiar with.
But I will tell you, for example, Elad Segev wrote a really important book about Google and the digital divide years ago, as his dissertation. And he certainly tracked the misrepresentative kinds of information and the way in which groups who were in a minority in other countries were also held hostage in some ways—or had a harder time breaking through, let me put it that way, with their content. And of course, these things are certainly important. I mean, the cooptation of certain kinds of keywords, for example, still often happens in the U.S. And many people have written, for example, about how Jewish people around the world have been maligned, or how their representations have been connected to Holocaust denial information or, you know, really white nationalist kinds of messages. So we see that phenomenon happen outside of the United States, for sure. But, again, the interventions are different in different parts of the world, because the policy requirements are different.

Q: But how do we—how do we—

NOBLE: I don't know how we're going to do it. But, you know, I feel like this is a really important step—that we're having larger conversations. And so—

POWELL: So, in fact, just before this session we were talking about this. I'm teaching the Rwandan genocide in my human rights class now, and I just showed the first part of the Frontline documentary on the genocide, which features the use of hate radio. And I'm just thinking now: if in 1994 these kinds of platforms were where they are today, how much more amplified that message of hate would be.

NOBLE: Let me just add, though, something that I think is a little bit different from radio: radio is also about, like, the use of spectrum. But when we start talking about platforms, those are something different from radio in terms of the decision-making about what can be seen and what cannot be seen.
And so I would just make that distinction a little bit in terms of, again, a different kind of curatorial process that’s happening. Certainly I would—we can’t overlook that in places like Rwanda part of what’s also fueling genocide there are the mineral wars and, you know, the selling and the trade of arms for minerals. And of course, many of the ways we don’t talk about technology in the West is we don’t think about, for example, the incredible extractive industries that are happening in the Congo, for example, and, again, that—you know, where, you know, the United Nations has said that the Congo is, you know, the greatest site of sexual violence in the world. So kind of the concerns at my work kind of extend to thinking about the kind of material dimensions of this. Like, you know, the component parts, where are they sourced from, and what are the politics of that? And certainly, you know, if we had to design this and say: No one dies, now go design. That would be really different than design it with the cheapest possible, you know, minerals and commodities possible. And I think that’s something to put into play, especially when we talk about Rwanda. And certainly, you know, I spent the in Ghana, in Accra, studying e-waste and what happens with our devices when we’re done with them, and how they get loaded up into barges. And people think they just go to the recycling center. But they actually go on barges, and they’re shipped to China, the west coast of Africa, and other places. Where, you know, huge new toxic e-waste cities are emerging, and people are being poisoned, and their lives are being cut short. And if we think that that’s not tied to refugee crises and the displacement of people when their lands are poisoned as we kind of rush in our thirst for the digital, I think we—again, we’re just being incredibly short-sighted. POWELL: Let’s go to Sylvia Ann Hewlett. Q: Thank you. POWELL: It’s already on. Q: It’s on? OK. (Laughs.) Incredibly urgent work. 
Thank you. NOBLE: Thank you. Q: A small comment and then a question. We know that AI—sexism or racism are being baked into that as we speak. You know, Joy Buolamwini’s work, you know, she finds her face not recognized because the folks that designed the—or coded the algorithms didn’t have a wide enough range of skin tones to somehow pick her up. Now, companies like Microsoft, where I was recently, is working on the premise that if you get more diversity around decision-making tables, and around design teams, right, you finally will be more responsive to a bigger range of end-users. And perhaps the content and the degree of empathy for new marketplaces because it is true, you know, the biggest growth market in the world is not China, it’s women. You know, it’s a huge growth market. So the potential of these diverse groups to actually exercise market power, if only they were represented around decision-making tables in the tech industry, is that some kind of piece of hope going forward? Because right now we know that design is unduly, you know, kind of dominated by young white guys, right? Which is making the problem of how it’s all determining the future that much more poisonous. NOBLE: Yeah. POWELL: Before you jump in, can we take a second question? Because, like, now there’s so many cards up, I want to get as many people in. NOBLE: Yeah. OK. Yeah. POWELL: Linda, you took down your card. Q: I was just going to raise the question of how do you get people to even care? It seems that everybody that I know doesn’t—you know, I’m not on Facebook or anything because I don’t want people knowing this data. And I’m considered an absolute elephant in the room. So when people— NOBLE: Or the smartest person. Q: So when people are not—are not alarmed by this, how do we begin to get any traction? NOBLE: Sure. OK. These are both great questions. And I think they go together well. 
So one of the things that often is argued is that if we had more diversity in Silicon Valley or Silicon corridors— Q: At the top. NOBLE: Exactly. In management, but also among programmers, right? The kind of whole ecosystem. That we, you know, we could solve these problems. I guess where I part—you know, I think Joy Buolamwini’s work at MIT is trying to raise these concerns. You know, there’s also the—and this is because she doesn’t want false-positives, right, where people are recognized, but it’s not them. And of course, you might remember this study that the ACLU did, pointing kind of the market AI facial recognition technologies at the Congressional Black Caucus and half of them being—or, a third of them being flagged as felons and criminals because the AI is so unsophisticated. And that is what I mean when I say it’s still trying to figure out is a table a table. So this idea that somehow if we diversify the data sets, diversify the training data, diversify the design work and the management, that that will solve it. And I think I just push back on this a little bit to say: Certainly we need to do—there’s nothing wrong with doing that. However, what we don’t have are people who are critical thinkers around these issues that are at the table designing. We don’t have people with Ph.D.’s, for example, in black studies, or ethnic studies, or gender studies, right, with the same level of legitimacy as a, you know, person with a bachelor’s degree in computer science who’s making these technologies and deploying them on the public, all right? So this is one dimension of kind of lack of expertise. We also don’t find that these technologies are being invented and pointed at maybe—well, the question then is who are these technologies designed for? Who are they pointed out? So one of the things that’s very, you know, prominent now, L.A. has been an epicenter for predictive policing technologies, also using facial recognition and other kinds of technologies. 
And, you know, those policing, for example, technologies, or Microsoft, or Amazon’s AI around facial recognition, it’s pointed at the borders. It’s pointed—who is it pointed at? It’s pointed at low-income communities. It’s pointed at vulnerable people. These are people I consider kind of the data-disposable, who are experimented upon to perfect these technologies. You know, no shade, but they don’t get pointed at Wall Street, let’s say, to say who might be the next person in here who’s going to defraud the economy, right? Like, these are, like—you know, I mean, I’m being a cheeky, but I—you know, the questions have to be asked around, like, what are these technologies? Who are they working in service? And who are they being exercised upon? And this, of course, goes to, you know, what we often find, is that the most powerful and influential are not engaging with these technologies at all, and their children are not being allowed to be raised on these technologies, right? It’s actually poor and working-class people, middle-class people. So I think, you know, we have to think about kind of, again, like, what are the increasing mechanisms of control that are also built into the design, again, of, like, why we—why do we want these to exist? What is their purpose? And I think those are important questions too. POWELL: Let’s go to Katharine and then Kenneth. We can group the two of them. Q: OK. I got to do my microphone here. Can everybody hear me? Oh. Really fascinating. I was actually the first news editor of the Huffington Post, and did all the moderation, and got calls from Dick Cheney’s office, and Elizabeth Edwards, even, to take comments down. So I have stories. NOBLE: I have somebody who might want to interview you. Q: Yeah, yeah, yeah. Lots of stories. (Laughter.) I also was the head of digital at The Washington Post. So lots of stories. So I actually do—I have a platform now that does all the large-scale gender diversity recruiting for corporations. 
Microsoft’s a big client. And to your—to what you’ve just said, it was super interesting. I have two—this is a question. We are getting—from a lot of these corporations they’re saying: We need more African-American women. We don’t need more white women. So we’re saying OK. But then our problem is, we need women to self-select with their race. And so as you’re pointing out, that the more data that people put on the more it can be used the wrong way. So I’m personally in this quandary right now, that, you know, we’re literally losing sales deals right now because we can’t get enough women to say, you know, how they identify. So that’s the opposite of this. And the GDPR stuff has also been very hampering on that side. So, you know, how do you sort of get the good players—and I’ve learned from years of being in digital, and some bad things. So there’s that. And then to that point too, I’m starting to see a lot with venture capitalists coming in and investing in verticalized communities because they’re sick of LinkedIn. They’re sick of Facebook. They’re sick of Glassdoor. They want these communities that are created just for people who self-identify. So there’s a lot of money that starts—that seems to start—that’s going into just women-only communities, where you don’t have the rules of Facebook. And a lot of money I’m seeing, even like with construction worker communities. And the question I have for you is it’s making me nervous, because I just saw that Alex what’s-his-name, the Infowars— NOBLE: Jones. Q: Jones, yeah. That devil guy. He—(laughs)—he just created his own—he’s off Facebook. But I guess he’s coming back on there. But the rise of these very scary political verticalized communities, even outside the bounds of these companies that seem to have some responsibility, what do we do about them? POWELL: And, sorry, if we can put Kenneth with Katherine. NOBLE: You bet. POWELL: Perfect. 
Q: You don’t seem to want to suggest specific enforcement mechanisms to bring about the transparency of values that you suggest. What are some of the specific suggestions by people within the field as to how to bring about or to enforce the values that you’re suggesting? NOBLE: OK. Let me start with that one, and then we’ll come back, OK? Q: (Laughs.) I gave you a lot. NOBLE: It’s not that I don’t want to suggest enforcement mechanisms. I mean, we have ideas and there are people who are already talking about the kinds of convenings that need to happen in more democratic ways, I think, to generate policy. Certainly we don’t have, for example, you know, at the level of, let’s say, the United Nations or some other appropriate international body enough of a framework—a regulatory framework, for example, for people to be protected from these technologies, right? How does one opt out of being facially recognized, for example, in the world? That doesn’t exist as a—you know, as a rule of practice, or of law. So I think that we certainly have a number of areas where we’re concerned about everything from I think, you know, worker and labor protections that should be afforded to people who do a lot of the kind of dangerous work that is part of this ecosystem that I’m talking about, to being protected, to having personal privacy rights that are, again, fairly non-existent in the U.S. context. So I think there are actually a lot of axes where we can intervene. I mean, one of the things that I also argue for in my own work is that we can’t simultaneously on one hand regulate the tech sector and then gut every possible public interest institution that would stand up as an alternative too. 
So we can’t gut public media, public libraries, public education, public universities, and so forth, which really stand up in a democracy to be kind of places where knowledge can proliferate too, and not—again, not allow for these kinds of, you know, less evidence-oriented, if you will, or disinformation-oriented spaces to dominate as kind of the public good around education and information. So we have lots of things and places where we can invest, and we can intervene. I will just say, you know, there—this idea of, you know, vertical markets is not new. I mean, some of us remember iVillage, BlackPlanet, I mean, MiGente— Q: Gay.com. NOBLE: Gay.com, PlanetOut. I mean, these are—we’re old. Been on the internet for a long time. So we remember those. And those were also extractive communities. Great research that’s been on those. Those are really about targeted marketing and being able to have a target market. So of course, some types of information flourish in them, but also it’s about aligning, you know, consumer purchasing power with certain types of ideas and identities. So I think we can look to some of the past work and say, like, you know, what’s the staying power of them? The issue isn’t so much, you know, that—you know, ethnic-based identity, you know, communities, whether it’s, like, women or other kind of marginalized communities, you know, that they have a space to be and generate community. I think the issue is that, you know, and the work of people like Jessie Daniels who’s at Hunter College here in New York. She wrote a great book called Cyber Racism. And in that book, she talks about it’s that racist speech has more speech than anyone else. So that it’s the distribution of visibility, quite frankly, that I think is also part of what we’re talking about. 
And we see that there are, you know, not just individual bad actors, but there is a—there is a—you know, click—the clickbait of racism and sexism and, in some cases, even going as far as genocide in certain parts of the world is incredibly profitable. It moves through these platforms. And every time it moves the companies make money, irrespective of their own opinion about it, right? And so those are things. At what point can profit—I mean, you know, we can look bac to history. You know, people have written—what’s the name of that book about IBM, the origins of IBM? You know, IBM perfected its punch card technology on the Holocaust. But no one remembers these histories and stories. We’ll have a retrospective about this moment. And I think we need to use our knowledge of history of these systems to think through it. POWELL: OK. So since you mentioned public media, we of course have to go to Fabiola, and then also Ryan. Q: Thanks. I actually—my background is I was part of the founding team at Yahoo. So—and I actually was the head of the team that built and led Yahoo Europe. And so we led—Sarah will be very familiar with the very famous French case around taking down Nazi items. And I share this—and I share this—I haven’t read your book, so I— Q: It’s not out. It’s not your fault. It’s coming. It’s coming. (Laughter.) Q: Oh, it’s not out yet. OK. (Laughs.) So what I found very interesting at the time is because it was the early days of the web in the late ’90s. What we set as the standard in Europe was—I had a team across Europe. And we spent a lot of time talking about—the way we managed search at that point in time was we actually had surfers, individuals who actually received the sites and then would categorize them. And so in and amongst my team at the time we decided—and I remember thinking at the time thinking, oh my God, we’re playing God. 
We decided collectively—I had a very diverse team—decided collectively what we were going to accept and what we were not going to accept. And the challenge became when the volume became so large, it became very difficult to do that. So we did our best. And we kept on racing, right, to take things down, like antisemitic items, right, to try to take down child pornography. We were basically applying our own sets of values at the time. What struck me, and the lesson that I learned, was I was able to control that in my domain, because this was the team that I had built and led. But I was incapable of convincing my counterparts—with whom I had also built, right, I was part of the original thirty-two in Silicon Valley—the importance of some of these issues. And the real challenge was that we had a lot of traffic going to dot-com versus going to our local European and individual—you know, eight country sites at the time. And so to make a long story short, what it taught me was it helps to have the regulation because the French case, right, brought things to the fore, right, and allowed them to start focusing and realizing, oh. But they still were able to say, oh, that’s something that just happens in those pesky Europeans, right? In those pesky European countries. POWELL: Who don’t like free speech. Q: Right. Who don’t like free speech. (Laughs.) But it didn’t change things here in the U.S. Now, what I’ve seen subsequently in the past 18 years is that I see Silicon Valley has become more and more dominated by mass-scaling developers, I guess is the best way to put it, right? And there has been shift. It’s been well written about and you can see it in some of the stats. 
And what concerns me is you have this issue here in the U.S., but similarly, to the gentleman’s question before, even though you have regulation in Europe, the right to be forgotten is a very good example, I have friends who’ve recently tried to be forgotten, had initially been told they will be forgotten, and then Google has so much money behind them and so much lobbying money behind them, that they’re able to make them not forgotten. (Laughs.) Right? So I guess, you know, you answered the question that said that it was quite a complex set of things that need to be done. And I’d like to hear—in order to address these issues—I’d like to hear what you think of upstarts like Inrupt and Tim Berners-Lee trying to decentralize the web. Because I think this is not—this is not an issue that you’re going to solve in one way, right? This is an issue you’re going to have to build awareness, right? There’s going to have to be at some level some regulation. But then because these organizations have become so powerful on a global level, the next piece that you’re also going to have to do is figure out a way to take the web that you describe, so that is so highly concentrated in the hands of very few, right, and unpack that. I’d like to hear what you think. POWELL: OK. I think Ryan and then you can close with your grand thoughts. NOBLE: OK, great. Do you want to have a comment? I mean, because it is a little bit directly related to content moderation. Q: Well, I just wanted to say—and I think it ties in with some of these other provocative questions—that, you know, really what we’re trying to ascertain in so many ways is how do we fix the problem. And we have a number of solutions. We’ve talked about legal interventions. We’ve talked about policy and regulation at the nation-state level. We’ve talked about diversifying the workforce. We’ve talked about greater transparency. So I think the answer among those is yes to all, right? But some of them we must beware of. 
What does it mean to have nation—localized nation regulation in an autocratic state, OK? What does it mean to diversify the workforce pipeline when we’re not attending to gross social inequity in the local community and in our material world? So there’s a disconnect. And I often think that part of the problem that we’re dealing with is that Silicon Valley in particular—which as I think you astutely point out and know probably better than anyone in the room is dominated by particular ideologies of liberation through technology. It believes that social problems can be solved with technology without really attending to the dimensions and the extent to which those problems are fomented by the very technology that we’re talking about. So putting some black women as coders on the team, I’m here for that. Please do. And let’s do it. But let’s also think about this pipeline issue when we’re talking about extraction, or when we’re talking about shipping labor to the Philippines, as we ship trash to the Philippines. And the one other thing I wanted to say is that, you know, the other bit of rhetoric about these technologies simply mirroring or reflecting human nature, human expression, yes. Yes, and they do things in an unprecedented way. They do them at scope, scale, volume, speed, and at a level of connectivity that we’ve not really attended to or seen before. And so that is actually not status quo. (Laughs.) That actually does something with all of these—if we started a baseline of a lack of equity and a lack of justice and of strife and of, you know, communities being marginalized and oppressed, and we pipe that through systems that don’t attend to those fundamental things but circulate them at speed, we have a problem on the output side. But we have to start at the front end. And that’s all I’ll say. Thank you. POWELL: I think we can give thirty seconds to Ryan and then thirty seconds to you to respond, yes. Q: Thank you very much. Ryan Kaminski with the U.N. 
Foundation. It’s been a terrific discussion. On this issue of education and awareness, you were talking about the hearings on Capitol Hill where, you know, WhatsApp was being compared with emailing. You know, it reminds me that the first resolution at the U.N. Human Rights Council on internet freedom was passed after the U.S. ambassador there brought diplomats from Geneva to Silicon Valley to talk about the internet and have, like, you know, a candid discussion. So my question is, you know, what platform—if that’s a helpful model—would be useful for that? Is it the companies? Is it scholars? Is the U.N.? Is it another platform? And what group of people would be best to have that kind of leapfrog seminar to learn about these issues and gain a better understanding? Thank you very, very much. NOBLE: Thank you so much. I don’t think that the foxes can guard the henhouse. So I think that there are a number of scholars, quite frankly there have been mostly women who—scholars, who right now have been at the forefront in my opinion of having the complex conversations about discrimination, ethics, and so forth, and technology, their impact on social inequality, and so forth. And I would be happy to provide a list to you at any time of people that I think are important and we could gather. I know that the U.N. has special rapporteurs that are looking at things like race and human rights. And we also have rapporteurs looking at technology and society. But we don’t necessarily have them meeting. And so that’s something that I think we’re interested in trying to bring about. Certainly we could add labor to that conversation. So I think the U.N. plays a really important role. And I don’t think that policymakers at the moment are kind of scaffolded up yet on the kind of granular levels of the research and the evidence. And so that’s something that we can at least provide from our lane. 
And then, again, convenings and making—I mean, making this work legible to policymakers is really important. And I’m not sure that, you know, that should be led by, you know, the current industry players. POWELL: Thank you so much. So much there. (Laughs.) Hopefully we can bring you back from Oxford at some point for a follow-up discussion. NOBLE: I’d love to come to New York anytime. Listen, I just want to say thank you so much for this opportunity and, of course, for your leadership. And the invitation is really meaningful to me. So thank you for the opportunity to share some ideas today. (Applause.) (END) This is an uncorrected transcript.
  • Transnational Crime
    Taking Stock of the Global Fight Against Illicit Financial Flows
    A growing number of actors have joined the fight against dirty money. The success of global efforts to combat illicit financial flows, however, remains uncertain. 
  • China
    The Global Artificial Intelligence Race
    Play
    Panelists discuss the global race for leadership in artificial intelligence, and provide an analysis of major AI legislation and initiatives in China, the European Union, and the United States. 
  • Technology and Innovation
    Platform Economy—How Companies Can Drive Inclusive Growth
    Voices from the Field features contributions from scholars and practitioners highlighting new research, thinking, and approaches to development challenges. This piece is authored by Henriette Kolb, Manager, Gender Secretariat, International Finance Corporation (IFC).
  • Afghanistan
    Building a Skilled GirlForce: Lessons from Roya Mahboob
    On International Day of the Girl, we celebrate Roya Mahboob, Afghanistan's first female tech CEO. Empowering the next generation of women STEM leaders she mentors the Afghan Girls' Robotics Team and leads the Digital Citizens Fund. 
  • Women and Women's Rights
    Who Run the World: Girls Powering Afghanistan's Digital Future
    Podcast
    As technology transforms the world of work, a generation of girls is in danger of being left behind. Globally, girls are underrepresented in science, technology, engineering, and mathematics (STEM) education. In celebration of the International Day of the Girl Child, Roya Mahboob and Fatemah Qaderyan discussed their experiences breaking barriers in STEM and what needs to be done to reduce the education gender gap.   STONE: So we’re going to get started. Is everyone feeling good? This is the few, the proud, the breakfast crowd, CFR breakfast meeting. So I just wanted to welcome everyone. Good morning, my name is Meighan Stone, and I’m so honored to be a senior fellow here at the Council on Foreign Relations in our Women and Foreign Policy Program. Before I joined the team here at CFR, I worked with Malala Yousafzai, and I was the president of the Malala Fund, so—our youngest Nobel Peace Prize winner—so this is a particularly exciting day for me—International Day of the Girls, one of my favorite days on the calendar. And we’re really grateful that all of you made the time to join us today. Our mission here at the Women and Foreign Policy Program is to analyze how elevating the status of girls and women around the world furthers our U.S. policy objectives. So to that end, our conversation is actually on the record today; I know many conversations at CFR are not. This is a day where we want to encourage you to take your phone out and feel free to tweet. The hashtag is #CFRWomen—so #CFRWomen, and if you want to post a photo, or if you hear something today that’s particularly meaningful, that you want to expand the conversation beyond the Council and these walls, we want to encourage you to do that, and our team will engage with you on social. So we’re going to be having a presentation today to start. We’re going to mix it up. Today we’re going to have a bit of a TED Talk kind of 10-minute presentation from Fatemah. 
When I was your age there was no way I could have given that kind of talk—(laughter)—so I’m even more awed and honored by you, Fatemah. So she’s going to be presenting for about 10 minutes, and then we’re going to bring everyone up on stage to have a little bit of conversation. And then we’re going to open it up to Q&A as we always do at the Council. I want to especially encourage our younger attendees today—which means you are not an adult; you are a young woman or girl—to feel really bold about asking questions. We want to hear from you today. So today is really important because it is this International Day of the Girl. We’ve been celebrating this since 2011, and the goal of the day is really to celebrate girls’ extraordinary achievements and potential, and to take action to advance the rights and opportunities for girls everywhere. So having our guests today is right on point, all the more because they have a theme this year for International Day of the Girl, which is “A Skilled Workforce (sic; GirlForce).” So when we think about STEM, and tech, and digital—all the more relevant, and why not champion and celebrate girls and women doing this work in Afghanistan. We know we’ve had a lot of progress globally on girls and women in tech, but currently women hold only five percent of leadership positions in the tech industry, and only about three percent of ICT graduates globally are women. So we have a lot of ground to take. In Afghanistan specifically, we know that labor force participation rates with young women are particularly low, and we know that progress in education there can also sometimes be threatened by the security situation. And we even saw that secondary education for girls over the last few years in Afghanistan went from thirty-seven percent to thirty-five (percent.) So we want to see improvement. We’ve seen incredible efforts by the government of Afghanistan to continue to support those numbers. 
We’ve also seen cause for hope when we look at the parliament in Afghanistan, what are women doing when they graduate, when they’ve learned, when they’ve gotten their degrees. We see women holding 27.7 percent of seats in parliament in Afghanistan, and that number was only four percent in 1990—so cause for celebration and cause for more hard work together. I think all these statistics are why Roya’s work is so important in Afghanistan. We’re really excited to have you share with us today. My favorite fun fact about Roya is that she started her foundation operations trading bitcoin—(laughter)—to find resources to support her work, so her creativity and commitment are unparalleled. So today we just want to have this conversation on the International Day of the Girl to explore how we can expand learning opportunities and prepare girls to join this workforce of the future. So we’re thrilled and honored to have our guests here today, Roya Mahboob and Fatemah Qaderyan, and then also Kawsar Roshan has joined us as well, so we have an additional member of the Afghan robotics team. We are so thrilled that all of you are here. Roya is the founder and CEO of Digital Citizen Fund, which is a nonprofit that is increasing women’s technological literacy and providing employment and educational opportunities to girls across Afghanistan. She is one of Afghanistan’s first women tech CEOs, and she was named one of Time’s most a hundred influential people in the world because of her innovative initiatives in Afghanistan and globally to expand computer education. So we’re thrilled to have you, Roya. MAHBOOB: Thank you. STONE: Fatemah is the captain and the spokesperson of Afghanistan’s all-female, high school robotics team. She is 16 years old. I think you’re our first young woman to have speak at the Council in quite a while, if not ever, so this is a good day. We’re so glad you are here. She is an eleventh-grade student, so she is off from school to spend time with us today. 
She goes to Mehri High School in Herat, Afghanistan, and she actually wrote her first book, My Afghanistan, at the age of thirteen. And she is currently working on her second book. So we really admire that at the Council because we like to write books here, and to think that you’ve already got one done and you are working on your second is really impressive. So with that I want to hand it over to Fatemah to share with us, and Roya, who is going to be translating during her presentation. So I want to welcome you both to come up, and Fatemah, do you want to share your presentation with us today? Why don’t we give her a round just to welcome her? (Applause.) QADERYAN: (Through interpreter.) Salaam. This is not a greeting only from me, but it’s from all the young females from Afghanistan to you. As a child, my world was filled with curiosity. I had abundant passion for understanding how the world works. I watched documentaries about technology as favorite pastime. I sleep with a book under my pillow every night. I walk around with so many books in my backpack that my mom would often ask me whether I have filled it with stones. (Laughter.) I was six years old when I first saw the cartoon called I Robots (sp), and it was so interesting to me that how the robots could talk, and they could walk like humans. I became so curious about the technology works, and this was the first time that no one could answer my questions. After a time, my imagination grew and became a dream. I wanted to go to the school to learn as much as I could so I could one day build robots of my own. But I wasn’t aware of the danger of dreaming in my country. In my country, girls are not supposed to be curious, and instead they should be—(inaudible)—and shy. My mother told me stories about the dark area of the Taliban who would force women to stay inside of their homes so they would be easier to control. The Taliban insisted that the mullahs’ roles to be accepted as the truth. 
And they destroyed the power of knowledge and imagination and left no room for innovation. They kept everyone in the dark in the name of Islam and sharia law, even though everywhere in the Quran, curiosity and scholarship are encouraged. Today Afghanistan is a place where Rokhsana was stoned, where Farkhunda was (burned ?), and where women are murdered, but there are still signs of hope. Due to economic priorities, teacher shortages, lack of opportunity, and the Taliban’s continued presence in parts of the country, many girls and children are still denied access to education. And even when girls have the opportunity, cultural barriers and prejudices stand in their way. My team and I faced these challenges when we started our robotics team. The Afghan Dreamers started through a program run by Digital Citizen Fund, and our teacher was Alerazami Harimon (ph). Many of my relatives didn’t understand or support my interest in science and technology, especially because mechanics is such a male-dominated field. My father was the only one who supported me, encouraged me, and helped me in every way. When we got the opportunity to go to the competition, we were so excited, but many girls couldn’t participate because their families were very conservative. One of our challenges was that our visas were denied, but finally, with intervention from the president of the United States, President Trump, we were able to get the visas. We broke our silence and spoke out, and our story reached millions of people through TV and social media, and fifty-three congressmen signed a petition for us. And finally we won, and we came back to our country with a silver medal. This medal sent a message to everyone who doubted us. We proved that if you give young girls a chance, we will be able to reach our full potential and will hold the Afghan flag with pride. A week after our return to Afghanistan, ISIS took my father from me. 
I always believed that there is always a hero in the life of a child. And my father was my hero. A year has passed, but still I can’t believe that he is gone. It seems to me that good people die before their time. Everything in a child starts with imagination. After a while, imagination grows and becomes a dream. Once they have a dream, they want to achieve it in reality. Children of conflict know only the blackness and the redness of the blood. They know that the conditions in their country have forced them to have access to only two colors from the whole palette, no more. Many people in the older generation continue to hold the prejudices told to them by the Taliban, but my generation is different. Children and young adults make up over fifty percent of our society’s population. Leadership must be in the hands of the young generation that considers technology a weapon of peace, not the generation that considers Afghan (foes as brothers ?). We are the children of the war, but we have proven that hope still exists. Hope is what builds my today and your tomorrow. My friend and I were the ones who planted the first seed of science, technology, engineering, art, and math, and today we are harvesting the result. We had a chance to meet with the president of Afghanistan, Ashraf Ghani, and shared our plan to build the first school of science, technology, engineering, art, and math in Afghanistan, and we are still surprised that he not only accepted one, but wants to build five, in the five zones of Afghanistan. We today are the people who started a technology revolution in our country, and we know that there are lots of problems in our way, but we are trying to stand up to all of them. And we have proven that sometimes a small opportunity can change the story in (a country ?). 
And we are happy to be proving the importance of STEAM education not only to our leaders, but also to our communities, and we are trying to build the schools. And we are proving that today the pencil is in the hands of the young generation, and we are going to change the world. A thousand years ago, Afghanistan produced scholars like Abul Alecena (ph) and al-Biruni, who were scholars in math, science, and philosophy, influential and highly regarded around the world. Now it’s our turn to write the pages of our story through knowledge and technology. The children in my country no longer want to hear the sounds of guns and bullets. We do not want to be bystanders. We want to become the actors. We do not want to live in fear, dependent on international imports or relationships. We want to produce and be exporters. This is why we have started a technology revolution in our country. And I want to change your image of our country. My teammates and I know the dangers under the water, and, yes, we know that there are sharks that want to make us their prey. But we also know that there are shiny pearls in the depths of the ocean. We will go to them by swimming in the deepest water, and we will harvest those pearls. And I want to say that I wholeheartedly believe that we cannot see the future, but we can build ourselves so that the future we want becomes a reality. Thank you. (Applause.) STONE: Thank you. Thank you so much. Thank you for sharing that. I still can’t believe you’re sixteen and just gave that speech. I can’t speak like that now, and I’m way older. So we’re going to have our team bring up some chairs so we can sit and have a conversation together. So, Fatemah, if you want to—if you want to come back here, they’re going to bring all this up. We’re going to welcome Kawsar and Roya actually to come up and start our conversation. 
And so this is a good time to start thinking about your questions as well because we’re going to open that up in just a moment. So I want to start with Roya actually and just talk about what Fatemah just shared about wanting to change the image that many people have of Afghanistan, of your nation. And I’m wondering if you can share with us about this vision that you have to build the first high school that is STEM-focused for girls in Afghanistan. And what caused you to develop that vision? And why do you think this project is so important? MAHBOOB: Sure, and thank you. I think I have to give you a little background about myself and why we came up with this idea. I’m a tech entrepreneur from Afghanistan, and I also run a foundation called Digital Citizen Fund, through which we are trying to empower women through digital literacy and by providing education and financial literacy. We have helped thirteen thousand local girls come through our program. We have different programs, from coding, building applications, and games, to financial literacy, to help them start their own startups. But two years ago, when we started a robotics team, things changed. I mean, many of our projects had important impacts in our society, but I think that this one was a bit different. It was, like, a spark in the darkness, and it changed things. After the girls came to the United States and went back to their country, they changed the views of so many people in the community, in the leadership, among politicians and economists. So that was for us a hope to continue, to not let this light be turned off. So we went to many competitions, and they won medals and met with lots of leaders. And it has changed the view of the Afghan people, especially men, on women’s ability in science and technology. And forty-one percent of our population lives under the age of eighteen—under the age of fifteen. 
And in order to compete in the twenty-first century, countries must have access to groundbreaking technologies. And unfortunately, inequality of access to this education still exists, especially in third-world countries, and Afghanistan is one of those countries that doesn’t have that access. And what we want today is to build the first school of science, technology, engineering, art, and math in Afghanistan to build the next generation of young leaders in science and technology. And within three years we plan to build an MIT-style university focusing on high tech, and hopefully within the next ten years we will see Afghanistan as a country that is a source of high tech. STONE: That is a great goal. I love that you talk about how you’re changing the perceptions of men in Afghanistan about what girls and women are capable of. And you’re just showing by doing; you’re leading by taking action. And I wanted just to ask about, you know, allies, as we call them here in the States, so, you know, boys and men that support your work. I know, you know, Fatemah, when you shared so powerfully about your dad and about losing your dad and how he loved you so much and how he always supported you and celebrated how curious and inquisitive and intelligent you are and that he’s still in your heart today, you know, helping guide you. You know, I’m wondering if you can share about what that interaction is like with men around this, or men who have been supportive? You know, I think about Malala’s father always saying that all he did was not clip his daughter’s wings and just let her fly. And, you know, I think there’s some real wisdom in there. I don’t know if you want to translate, Roya, the question for them. QADERYAN: (Speaks in Farsi.) MAHBOOB: She says that Afghanistan is very male-dominated and very conservative. 
And if you don’t have the family’s support as a first thing, it’s not possible to continue your education or your growth. So for her, it was her father who supported her, and that’s why she succeeded. And she said that at the beginning, the men in Afghanistan doubted their abilities, but today many families have changed their views, especially the men in their families, on women’s ability in science and technology, and they are supportive of this cause. And I have to also mention that we couldn’t be here today if men didn’t help us and support us in this progress. And I guess it started, first, from their coach, who was a man, and then lots of others. Ambassador Mohib, he was very supportive of us. STONE: The ambassador of Afghanistan to the United States, who’s now the national security adviser to President Ghani, has been a big champion. MAHBOOB: Yes, he was a very big champion and he is still very supportive of us. And there are lots of other men in Afghanistan on the ground helping us to build these schools and building on our ability. And actually, they lobby for us. So I think it’s very important to acknowledge the important role of men in our success, and especially for the future if you want to bring change, because I think that this is a collaboration; it’s not like we can only go as women. We can go, but if we have the support of men, we can do it faster. STONE: I think that’s such an important message, and especially as people think about Afghanistan. I don’t know if you’ve heard, here in this country we’re still working on that issue as well, so we’re joined in that. I want to ask a question to Kawsar before we open it up to our guests. And so I don’t know if Kawsar wants to share about her own experience, about what barriers she overcame in her life to be able to become part of the team. And I have to say I want one of the uniforms because they’re amazing. (Laughter.) They’re phenomenal. 
But what did—what did Kawsar overcome in her own experience to become part of the team? ROSHAN: (Speaks in Farsi.) MAHBOOB: She also mentioned that Afghanistan is a—is a male-dominated and very conservative country. And we have very few women in the leadership of technology, so that’s also very challenging, because the girls don’t have enough role models to look to. And for her, she lost her father when she was three years old. And her mom and her sister were supportive of her so that she could come and join this team. And she was very interested in technology and science when she was a child because she likes solving complex problems. STONE: Well, if you can share with them, we’re just really humbled by your strength and bravery and courage and commitment; especially after losing family, still pressing on and achieving your dreams is really incredible. And we’re proud of you and we’re grateful you’re here. QADERYAN: (In English.) Family, yes. (Laughter.) STONE: All right. So we’re going to open it up to questions. What we do at the Council is, if you have a question, go ahead and put your placard up and we will call on you. So feel free to put up—I see a young woman putting her placard up first, Sophia. So are you ready to ask your question? Q: Yes, I actually have three of them. STONE: You have three? Well, this is a great educational moment. Your dad is training you here. (Laughter.) So we have a rule here against speeches and multiple questions. (Laughter.) If there was anyone to break it, it would be you. How old are you? What grade are you in? Q: I’m eleven and I’m in sixth grade. STONE: You’re eleven and in sixth grade. I really want to bend the rules for you, but why don’t we—why don’t we see how you do with your first question, and if we run out of questions we’ll come back to you. Is it a deal? Q: OK. Uh-huh. STONE: All right. Sophia, we’re ready for you. 
Q: How have your achievements affected women’s position in your society and community? STONE: Awesome, that’s a good question. MAHBOOB: That’s a good question. QADERYAN: (Speaks in Farsi.) MAHBOOB: She said that we received a lot of awards and medals. We received congratulations. We’ve been in Estonia. And all of these awards are actually giving a message to the very conservative men in Afghanistan: if they give women a chance to go to competitions, they come back very proud, and they carry the flag of Afghanistan and show the world the abilities of women in this field. So that has changed the view of the men in the country. Q: Can I go on with my second? STONE: We’ll give you a follow up. We’ll give you one follow up, Sophia. What’s your—what’s your follow-up question? You can have one follow up. Q: So how have your achievements affected— STONE: No one tell Richard Haass, all right? (Laughter.) Tell us? Q: How have your achievements affected schooling in your community? MAHBOOB: School in the community? ROSHAN: (Speaks in Farsi.) MAHBOOB: So they say that they are part of different schools with different numbers of students, like, for example, between five (thousand) to six thousand students, because they are part of public schools, government schools. So when they come back, they provide training and workshops for the students inside their schools. But in general, as Fatemah says, we have become role models for the girls. And right now, competition is getting very tight, and lots of girls want to participate in robotics and this field. STONE: Great follow up. We see you in the White House press corps in the near future. Why don’t we go to Holly; we’ll go around the table? Q: Don’t you want to ask us—(inaudible)—ask my question? Q: I have a question. Q: She has her own question. STONE: OK. Well, we’ll go to Maeve Brogan— Q: But I’ll cede my— STONE: —after Holly, how about that? 
We’ll start with Holly and then go to Maeve. Q: Oh, OK. So Fatemah and Kawsar, ten years from today, where are you going to be and what are you going to be doing? QADERYAN: (Speaks in Farsi.) ROSHAN: (Speaks in Farsi.) MAHBOOB: She wants to answer in general and then also personally. Q: Perfect. MAHBOOB: So she says that Afghanistan within the next ten years will be a country of innovation and technology. And she says that we want to go into this field to learn about STEAM, and she wants to grow to be a specialist in AI and come back to the country. And because they will have all these schools, they want to teach students there and give back to their communities, helping it to be a country working on innovation, bringing new products and new ideas. And she thinks that in the next ten years, Afghanistan will have a lot of awards, not only in the country but also in the region, in terms of innovation in AI and robotics. STONE: We’ll go to Maeve. Do you have a question? Q: All right, yeah. If not everyone wants to be involved in robotics and not all the girls know they can be involved in robotics and technology, what are some ways you would want to inspire them and bring these opportunities to them? STONE: Great question. QADERYAN: (Speaks in Farsi.) MAHBOOB: So she says that STEAM education is very male-dominated, and more than eighty percent of it is right now in the hands of men. And she thinks that it’s very important that girls get involved, because it’s such a—such a field for creativity, and you can be more creative and you can also solve complex problems, and it will help you not only build your personal life, but it also helps you build your future. QADERYAN: (Speaks in Farsi.) MAHBOOB: And the future is about STEAM. 
And if girls are not getting involved, they are going to lose opportunities in the future, because, first of all, in the next ten or twenty years, the jobs in the industry will double, and high salaries would be another option that women could have. QADERYAN: (Speaks in Farsi.) MAHBOOB: And if they get involved, they can also increase their income, help support their families, and help grow the economy of their country; they can be part of the growth of the economy. QADERYAN: (Speaks in Farsi.) MAHBOOB: Like, for example, we should look at cybersecurity, which is a very important thing for the security of the country. And we see that many sites have been hacked, and especially in Afghanistan we don’t have that expertise. But by 2020, we will need a lot of jobs in cybersecurity, and we need women to be represented in this field as well. STONE: I love that we have a sixteen-year-old girl giving cybersecurity job coaching this morning. (Laughter.) So I have to imagine that you, Roya, and all of the students on the team are great role models in this. You know, it’s like we always say, you can’t be what you don’t see. And, you know, being able to be visible and seen makes a big difference. I want to go around the table and come to Emily for your question. Q: Hi. So my question is, like, how do you hope that international, like, organizations get involved and, like, help you further your goals? Because there’s been such a big focus on, like, improving women’s access to education and, like, including women in, like, the technology field? STONE: Roya, if you want to, you can start with that one, too, after you translate maybe as well, yeah. ROSHAN: (Speaks in Farsi.) QADERYAN: (Speaks in Farsi.) MAHBOOB: So she says that investing in STEAM education for young girls is very important because they are living in the society and they know lots of the problems inside the families and the environment and the community. 
And if they get involved with this education, they can bring better productivity to the country. But I think also that, as Fatemah says, it is about investing. What we need is funding and resources. And if organizations want to join forces, we would welcome them. And right now, we’re going to build not one school but five schools in five zones, and we’re going to build a university. So I think that the main part of that is we need funding. And we would love to be partners with organizations who are interested in this, who want to be part of this revolution in Afghanistan. We would welcome them to join us. STONE: That’s good. I think our hope in having these conversations is being able to be an honest broker for people that care about specific issues to come together and find ways to take action outside this room. So be sure to talk to Roya after if you want to be part of not just being inspired this morning, but actually doing something. I’m sure she would love to have that conversation. I’m going to go to Samantha for our next question. Q: Hi. Your work has truly sparked a fire internationally on trying to integrate women into the global workforce, especially within male-dominated fields. So my question for you is, how do you think your work will impact not only the newer, younger generations, but also the older generations who have their own preconceived perceptions of women and where they should be in the workforce? MAHBOOB: I will respond to this question. We are not only meeting with the younger generation; we have also had a chance to meet with lots of the leaders in politics and religion and economics in Afghanistan. And it’s interesting that we saw that the change happened in them as well, and they understand the importance of knowledge, and especially technology, as a tool to improve the economy of the country and the security and the future of Afghanistan. 
And we’re not only going to change that view in Afghanistan; the good thing is that we go to other countries and we meet with lots of the leaders there. And recently, a month ago, we were in Mexico and met with the minister of development there. And interestingly, we discussed technology and what we have planned for Afghanistan, and they got so interested. And they asked the team to lead their efforts in Mexico and help them to build their own first STEAM schools in Mexico. So right now we are working with them as well. So I think that it’s not only for the younger generation, but also for the older generation. And it’s good that they understand that this is very important. STONE: Yeah, that’s so important. I mean, a statistic that really resonated with me when we were preparing for this discussion is that in the next decade six hundred million adolescent girls will enter the workforce—six hundred million over the next decade—so we need to get this right for that generation and for generations that are also figuring out how to come to terms with the evolving digital workforce and economy. So we’ll go back around the table, so I want to go to my colleague here. Can I read one of your questions? (Laughter.) Did you work out a deal—you’re now—you’re now reading her third question? This is a natural diplomat right here. Q: Well, OK. What is your favorite problem, or what has been your favorite project or robot that you’ve worked on? STONE: Maybe give that to Kawsar so we can hear from her as well. ROSHAN: (Speaks in Farsi.) MAHBOOB: So one of my favorite projects was the robot that we prototyped for the farmers. We did interviews with lots of the small farmers in Herat to see what their problems were. And then we designed a robot to help cut, package, and process the wheat. And this is a prototype that they have done. 
And it has reduced the time and increased their productivity, and it’s very—it’s good for the small farmers to use these small machines for their work. So that’s one of her favorite projects, and they are working on that. STONE: Great. We’ll keep going around, I think to—does your father have a question as well? Q: So the answer to that actually answered the question that I had, which was solving problems in Afghanistan, not just the competitions. And so, because they answered my question, I put my card down. (Laughter.) STONE: OK. Well done. All right. I see you—yeah. Q: What other girl teams have you met? MAHBOOB: What? Q: What other girl teams have you met? STONE: What other girl teams have you met? Have you run into a lot of all-girls robotics teams on the high-stakes robotics team circuit? Have you—have you run into many other all-women teams? MAHBOOB: I think so, because they go to different competitions, so they meet with lots of the teams that are working on robotics. Like, for example, they were almost four months in Canada for the first competition, and they met with lots of teams of girls who build robots. So they had the chance to work with those students as well. Q: Were there any all-girls teams from a non-Western country? QADERYAN: (Speaks in Farsi.) ROSHAN: (Speaks in Farsi.) MAHBOOB: So they almost forgot: there was a team from one of the African countries that they met in Mexico that was all girls. And also, the coach was a female, so that was another team that they met. Q: Thank you. STONE: So one question from me about just the high school—the high school—the five high schools, right, that you’re looking to start in Afghanistan. What kind of curriculum, what kind of skills are you looking to teach in those schools as you’re talking to partners and donors and friends who want to support this program? What kind of educational opportunities do you foresee girls being able to access in those high schools? 
MAHBOOB: So this is a program that we are working on with our different partners. The focus will be more on AI and robotics, but it will be based on the standard curriculum that the United States and Canada teach in their STEAM schools. So we will take that, but we will adapt it a little bit to Afghan society’s needs. And then our focus, again, will be more on AI and robotics. And we are right now working with our partners on developing a good curriculum for this. And the idea is that in ninth grade we give the students an exam on logic and mathematics. And once they take this exam, the best students can go and enter this high school for tenth, eleventh, and twelfth grade. And then the training would be—everything would be in English. So we’re going to have professors who we think will come and do lectures in Afghanistan. And we’re going to have also online training, virtual training, for the students in some subjects. But also, we have already found lots of Afghans who got their master’s in AI or in different computer science or engineering fields from the United States and Europe, so we’re going to have resources from inside Afghanistan for the program as well. STONE: I think it’s so powerful because it really flips the script on what people think about girls and women in Afghanistan, girls and women, you know, writ large in terms of tech and AI. I see one last question here from Christopher Brogan. Q: One of the—one thing that’s evolving is not just STEM, but adding the A for arts as we move forward. And clearly, Fatemah has got her second book in the works. So I’m curious as to how—do they have artistic pursuits that help feed that creativity as it relates back to technology? MAHBOOB: Well, I—well, we have science, technology, engineering, art, and math, so it’s STEAM, so we have included art, which is very important, in our curriculum. 
STONE: Do you—do either of the students, do either of you pursue anything in the realm of arts? Of course, Afghanistan has a rich artistic history as well. It’s pretty extraordinary. QADERYAN: (Speaks in Farsi.) MAHBOOB: She does painting. STONE: Painting, that’s wonderful. How about Kawsar, anything? ROSHAN: (Speaks in Farsi.) MAHBOOB: She is more interested in the mechanical field. (Laughter.) STONE: Yes, I love it. Well, the last question I want to close with is just to—because this is International Day of the Girl, and it is meant to celebrate the achievements of girls, and girls who also become women one day. And I wonder if we could just go down the line and each of you can share with us something that you’re most proud of in this work, because you’ve overcome a lot. You know, when I think about last year, with everything around the visa process, your visas getting rejected, it was pretty extraordinary for someone that cares about girls and women in the region to see the whole world really get engaged for a couple of news cycles about whether you would get the visa or not. It was really wonderful and extraordinary to see the outpouring of support. And, you know, you’ve had so many achievements and successes since, and I know many things that you overcame to even get to this point. So perhaps each of you could share something that you’re proud about that we could join you in being celebratory about today. And we’ll bring it to a close after that. So do you want to share the question, Roya, and then you can kick us off? QADERYAN: (Speaks in Farsi.) MAHBOOB: I would say very simply that imagination is powerful; dream big, because that’s what we did. QADERYAN: (Speaks in Farsi.) MAHBOOB: So whatever you believe, stand on your belief, even if there is no one standing with you. And always fight for what you believe. And also, persistence is one of the things you need; don’t give up on the things that you believe in. ROSHAN: (Speaks in Farsi.) 
MAHBOOB: Always imagine what and who you want to be. I always dreamed of one day being a mechanical engineer, and today I have the opportunity to become one. STONE: That’s so powerful. Well, we’re so honored that you came all the way from Herat, Afghanistan to be with us today. You know, what a wonderful moment we even had before we started today about how, at the Council, we’re all connected in ways we don’t even expect. One of our members’—Jeffrey’s—daughter Eleanor is actually a high school robotics team student here in the area and was actually part of the petition process to bring you to the States, and I know she sent a gift this morning. So young women are connected in ways we didn’t even realize. So let’s give our speakers a round of applause. Thank you so much for joining us. (Applause.) If not the first time that we’ve had a young woman under the age of eighteen speak at the Council, it is certainly the first time we’ve had at least fifty percent of a discussion be young women, so this is a historic moment for CFR. (Laughter.) Thank you so much. And I think if we wanted—if anybody wants a photo with you maybe, would you be willing to agree? If anyone wants to come up and have a photo after, I would be happy to take a couple of photos before their next engagement. So thank you so much. (END)
  • Technology and Innovation
    The Artificial Intelligence Race and the New World Order
    Play
    Kai-Fu Lee discusses the advances in artificial intelligence technology, the effects on the future of work, and the technology race between the United States and China.
  • Technology and Innovation
    Governing the Next Technological Revolution
    With the perils of heedless innovation all too apparent, and with a new and potentially more transformative wave of technical advances in the pipeline, global movements to govern the next technological revolution are beginning to take shape.
  • China
    Trump is Rising to the China Challenge in the Worst Way Possible
    China has 800 million Internet users and is overtaking the United States in areas such as drones, mobile payments, bike sharing and artificial intelligence. But Trump is responding to the China challenge in the worst way possible.
  • Economics
    The Restructuring of the World
    Trade protectionism, together with fears over the national-security implications of technological development, are contributing to a balkanization of the world order. This is not good news for the United States as it faces an intensifying rivalry with an increasingly powerful China. MILAN—The global economy is undergoing a far-reaching transformation. Change is being driven by shifts in countries’ populations, productivity, wealth, power, and ambitions, and accelerated by US President Donald Trump’s moves to reshape supply-chain structures, alter cross-border investment incentives, and limit the movement of people and technology across borders. The tensions that these changes are producing are most apparent in escalating disputes over trade. Notwithstanding some dislocations in emerging economies, markets’ reaction to the tit-for-tat tariffs so far has been only muted. Investors probably assume that it is all just part of a renegotiation process that will ultimately produce new rules of engagement for global business—rules that are even more favorable to the powerful. But such assumptions may underestimate the complexity of the issues at play, beginning with the politically salient matter of where investment and employment are created. On their own, tariff and trade barriers, if viewed as transitory negotiating tactics, will not significantly change global investment patterns or the structure of global supply chains and employment. Protectionists like Trump argue that the power of tariffs and other trade barriers lies in their ability to curb cheating and free-riding. The implication is that such measures can help to eliminate the tensions, imbalances, and polarization associated with globalization. “Cheating,” of course, is in the eye of the beholder. State subsidies for specific sectors, including preferential treatment of state-owned enterprises, may be regarded as cheating. 
So may requiring technology transfer in exchange for market access, public procurement favoring domestic entities, acceptance of unsafe work environments and exploitative labor practices, and exchange-rate manipulation. The test of free-riding is whether a country contributes too little, relative to its capacity, to the provision of global public goods, such as defense and security, scientific and technical knowledge, mitigation of climate change, and absorption of refugees. The culprits depend on the topic in question.

But whatever the downsides of cheating or free-riding, tackling these behaviors is unlikely to eliminate the conditions that have contributed to economic, social, and political polarization. After all, labor arbitrage has been the core driver of the organization of global supply chains for at least three decades—accelerating, of course, with China’s rise—with significant distributional and employment effects. It seems unlikely that, had China and other emerging economies adhered to the letter of World Trade Organization rules, the distributional effects of their integration into the global economy would have disappeared.

What, then, is the real purpose of the tariffs? Trump may be interested only in leveling the playing field, in which case he will accept global market outcomes once that is achieved. But it is more plausible that this is all part of his strategy—echoed by leaders in a growing number of countries worldwide—to win support by asserting national priorities and sovereignty. Such efforts are pushing the world toward a more balkanized system.

Moreover, the challenges and fears raised by advances in technology, especially digital technology, with regard to both national security and economic performance are also propelling the world toward greater fragmentation.
Fifteen years ago, few would have predicted that mega-platforms like Google or Facebook would become key players in areas like image recognition, artificial intelligence, and the development of autonomous vehicles (including military vehicles). Yet that is exactly what has happened. In fact, Google is now a defense contractor (though it may not renew its contract).

Given the security implications of these developments, as well as a host of issues like data privacy and security, social fragmentation, and foreign interventions in elections, countries are unwilling to leave the Internet unregulated. But they are also unwilling to delegate regulation to a supranational body. As a result, many are taking matters into their own hands, leading to growing divergence among countries’ approaches to Internet regulation.

Reflecting the national-security tilt of these initiatives, the scope and authority of the Committee on Foreign Investment in the United States—responsible for reviewing the national-security implications of foreign ownership of US companies or operations—have recently been expanded.

Despite these efforts, however, the fact remains that innovation cannot easily be blocked by national borders. On the contrary, the diffusion of ideas may well become the most consequential dimension of globalization in the future. While this may complicate national-security planning, it represents powerful new opportunities for business, even as trade faces headwinds. Already, there has been an explosion of innovative, digitally based business models, many of which could become powerful engines of inclusive growth, especially in emerging economies. Digitally enabled ecosystems, with open architecture and low barriers to entry, are one example of an emerging model with considerable economic potential.

There is one more crucial dynamic that will shape how the global economy develops in the coming decades: the strategic rivalry between China and the US.
At this point, it is impossible to say precisely what form this rivalry will take. What is clear is that every part of the global economy will be affected by the mix of cooperation and competition that emerges.

In the face of a powerful rival, one might expect the US to pursue a strategy focused on building, expanding, and consolidating alliances with natural allies—that is, countries with similar governance structures and shared views about the benefits of international cooperation and open markets. Instead, Trump has alienated longtime allies and attacked multilateral structures and institutions, all while antagonizing China in what is quickly becoming a two-player game.

This is a bizarre strategy. Whatever advantage Trump thinks he will gain by positioning the US in opposition to its natural allies will be dwarfed by the losses. A split between the US and its traditional allies, if it becomes a permanent feature of the new global order, would lead to deeper fragmentation among the world’s market-oriented democracies. That would surely shift the long-term balance of power in China’s favor as it moves steadily toward becoming the world’s largest economy.