Behind the Screen: Gender, the Digital Workforce, and the Hidden World of Content Moderation
Podcast
As user-generated content on the internet continues to increase in popularity, the question of who moderates this content comes to the forefront when discussing the future of social media. Dr. Sarah Roberts, author of Behind the Screen: Content Moderation in the Shadows of Social Media and assistant professor of information studies at the UCLA Graduate School of Education and Information Studies, joined the Women and Foreign Policy program to speak about the world of content moderation and the importance of this invisible work.

POWELL: We’re going to go ahead and get started. I’d like to welcome everyone. And my name’s Catherine Powell. I’m with the Women and Foreign Policy Program here. And I want to acknowledge the Digital and Cybersecurity Program, which we’re co-hosting with tonight. So some of you may have gotten the email through that program as well. It’s with great pleasure that I introduce Sarah Roberts, who is a professor at the UCLA School of Education and Information Studies. I’m not going to read her whole bio because you have it in front of you, other than to say that she came to my class today—I’m a professor at Fordham Law School, where I’m teaching a new seminar this semester on digital civil rights and civil liberties. And she came to speak with my students about this fantastic new book she has out, Behind the Screen: Content Moderation in the Shadows of Social Media. And it was a very engaging discussion. So Sarah’s going to outline some ideas for about ten minutes or so, and then we will open up for discussion, because we have a number of experts in the room and the discussion is always the most fun part. Just as a reminder, this is on the record. Without further ado, let me turn it over to Sarah.

ROBERTS: All right. Thank you so much. Thanks to everyone for choosing to spend time here this evening. It’s certainly a delight to be a part of this series, and to be present with you. So thank you. I am going to do my best.
I’m a professor, so I have a problem with verbosity. We try to keep it short and sweet. I’m going to try to speak quickly so we can get to discussion. So if there’s anything that seems like it needs a bit of unpacking, we can return to it. But I’m going to do my best to give an overview, assuming that you have not all spent as much time as I have with the subject. So basically I’ll talk a little bit about the research that’s contained in the book, and then I want to tee up some issues that I think are pertinent to the present moment particularly, because this work is the culmination of nine years of research. We like a slow burn in academia, so it’s been simmering for some time. When I began this work in 2010, I was myself still a doctoral student at the University of Illinois, but I had a previous career in the IT field, although I had, you know, perhaps made the unfortunate decision of finishing my French degree—French literature degree rather than running out to Silicon Valley during the first kind of net bubble in the mid-’90s, so there you have it. But I have fun when I go to France, I guess. Anyway. So I was working in IT for about fifteen years before I decided to go back to school. It was going to just be a quick in and out sort of master’s degree. And I became enthralled with really feeling like I needed to pursue some of the issues that I had to live through first-hand, mainly the widespread adoption and also commercialization of the internet. I had been a user of the internet at that point for almost twenty years in 2010, and I had considered myself a late adopter. I thought I kind of missed the wave of the social internet. But anyway. So in the—in the summer of 2010, I always want to give credit where it’s due, I read a brief but powerful report in the New York Times tech section. It was sort of what we would consider below the fold. I didn’t say that to your students today because I didn’t know if they’d know what I was talking about. (Laughter.)
But it was a below-the-fold kind of piece, a small piece about a firm in rural Iowa. I was sitting at the time in the middle of a corn field at the University of Illinois, so I could relate to these people, who were described in the article as working in really what, for all intents and purposes, seemed to be a call center environment. And they were working not taking service calls for, like, your Maytag washer or your Sears home product, but in fact what they were doing was looking at material from unnamed social media sites that had been uploaded by users and which had been flagged by other users as having some issue, being problematic. And this typically fell around issues of perhaps being pornographic, or obscene, gratuitously violent, all the way to things like child sexual exploitation material, images of abuse of other sorts, and the list goes on and on. And I won’t belabor it with examples, but you can sort of imagine what one might see. What I wanted to do upon learning that was really get a sense of to what extent this need was in fact a fundamental part of the at-that-time ramping-up but very significant social media industry emanating from Silicon Valley. So I should just contextualize this by saying I’m talking about American companies that are based in Silicon Valley. I’m not an expert, unfortunately, on some other parts of the world. But these companies, of course, cover the globe. And in fact, last I knew the stat, Facebook’s userbase is 87 percent outside the United States. So it’s quite significant that these American firms and their norms are circulating around the globe. The findings are detailed in here. It’s also a bit—I have to admit, I guess this is a first-time book writer’s thing where you sort of go into your own autobiography and you really wax poetic. That’s in there too.
You don’t have to take that too much to heart, but I think what I wanted to do was contextualize the way in which, from that period in the—in the early to mid-’90s to where we are now, the internet has become—and what we consider the internet, which is our social media apps usually on our phones, right—has really become an expectation, a norm, a part of our daily life in terms of interpersonal relationships, in terms of maybe romantic relationships, business relationships, political discourse, and the list goes on and on as to how these platforms are a part of—a part of the fabric of our social life. And how those companies that provide these essentially empty vessels rely upon people like us to fill them with our so-called content. Content is a funny word, because it just stands for, evidently, any form of human self-expression you can think of. And so I often come back to that as an interesting thing to unpack. I’ll tell you a little bit about what we found, and we’ll buzz through this, and then we’ll—I’ll get off the mic for a minute. But essentially what I discovered over the subsequent years was that this activity of content moderation on behalf of companies as a for-pay job—something that I came to call commercial content moderation—was viewed by the firms that solicited it as a mission-critical activity. In other words, these firms viewed this practice as so important as to be really unwilling to function without this kind of stopgap measure to control the content on their sites. This—you know, we can think of this as a gatekeeping mechanism, which means it’s also a mechanism by which content is allowed to stay up as much as it is a mechanism to remove. But what was really important to me to understand about the impetus for this particular activity, and then the creation and shoring up of a global workforce, was that the activity was taking place primarily as a function of brand management for these firms. What do I mean by that?
Well, I mean that just as, I don’t know, CBS Studios is unlikely to flip on the camera, open the door, and ask New Yorkers to just come in and get in front of the camera and do what they will—without any control—neither are these platforms. But one of the biggest differences about the ways those kinds of relationships have come to be understood in our—in our everyday life is that I think the expectation about the former is much clearer than it is about the latter. These platforms have come to take up and occupy such an important space, in large part because they were predicated or sold to us on a—on a claim that essentially it would be us, to the platform, to the world. In fact, YouTube’s on-again, off-again slogan has been: Broadcast Yourself. I mean, they say it better than I can, right? You just get on there and emote, and do your thing, and it’s going to broadcast all over the world. So what I came to find was that in fact there was a workforce in the middle. And to me, that was revelatory, and it was shocking. I had never considered it. And I was supposed to be getting a Ph.D., right, in this stuff. And I had worked for fifteen years in this area. I actually started asking other colleagues around campus—esteemed professors who shall remain nameless, but who are victimless here—they also said, gosh, I’ve never heard of that. I’d never heard that companies would hire people to do that. That’s the first thing they said. Then they said, don’t computers do that? Now, if these are the people who are—who have their fingers on the pulse of what’s going on in social media, why didn’t they know? Well, that led me to speculate that in fact this practice was intended to be, to a certain extent, hidden. That actually is the case. So I’m just going to talk for a minute about what this workforce looks like, and then we’ll go into some of the maybe provocations, I guess we can call it.
As we speak today, I would—it’s difficult to put numbers on what kind of global workforce we’re talking about, but I would estimate that we’re thinking about maybe 100,000 at this given moment. The number I arrive at for that may be conservative. But I take that number from looking just at the public numbers that Google and Facebook now offer up around their workforce, which are in the tens of thousands. Those are two platforms out of how many? The Snaps, the Instagrams—they’re not counting Instagram—the TikToks of the world, right, whatever the latest thing is. I’m starting to show my age and I don’t even know what’s going on anymore. But anyway, so any—essentially any company that opens the opportunity and—as a commercial entity—opens the opportunity for someone to upload is going to introduce a mechanism to control that. And that’s where we can arrive at these numbers. The thing about this globalized workforce is that it’s diverse, it’s dispersed. You can find it in a number of different industrial sectors. But there are some things we can say about them overall that they share in common. And the characteristics that I think are important to mention are that this work, and the workers who undertake it, are typically viewed as low status. They are typically low wage earners. And they are typically limited-term workers for a firm. So the expectation is not that one would make a lifelong career at this work. We can think about why that maybe is. It may in fact be because you wouldn’t be able to stomach this beyond this—right? We’ve got the shaking heads. It’s like, no thank you. I—personally, I couldn’t do it for a day, much less a year. But it’s often limited term. The work is often also to some extent done at a remove from the platform that actually needs the service. So how do they do that? Well, no surprise, it’s going to be contracting, outsourcing, and other sorts of arrangements that look other than permanent and look other than direct employ.
They often, of course, in the case of the United States, for one, given that circumstance, lack significant workplace benefits. Now, when we start thinking about the fact that this work can put workers in harm’s way psychologically because of what they view as a precondition of the work, that lack of benefits, that lack of—and even under the Affordable Care Act people might not be able to afford mental health support, because we know that’s often extra. I mean, I know even in my health care plan as a UCLA professor that’s something I would have to pay for, to a certain extent, out of pocket. How might a worker, who’s contractual, and low wage, and low status, go about obtaining that? Now, when we think about this work being global, we also know that there are places in this country and in other parts of the world where mental health issues are highly stigmatized. And so seeking that help also encounters barriers just based on cultural norms and other sorts of attitudes towards that kind of—that kind of support. And so really, what we’re looking at is a system of work that has been essentially outsourced and devalued. And yet, those who are near to this kind of operational activity within firms know that it’s mission critical. They know that this gatekeeping mechanism and ability to control what’s on the platform has a fundamental function in their operations. And they really wouldn’t go forward without it. As one person quite candidly put it to me once: If you open a hole on the internet, it gets filled with, blank. And so that was her way of telling me, therefore every time we allow someone to upload something into essentially this empty vessel, we have to have a mechanism to control it. OK. So I’ll talk a little bit about the outcomes here. I’m just going to list them. We can come back to them. But the primary findings in the book, I would say, are as follows. We’re talking about a fractured, stratified, and precarious workforce, as I’ve mentioned.
You will find this workforce not sort of in a monolithic site that can be easily identified as this is where commercial content moderation is done, but instead in a variety of industrial sites and sectors, some of which might not be recognizable to workers who are actually doing the same work because of the setting or because of the nature of the work. What do I mean by that? Well, some people go to work every day in Silicon Valley. They sit next to engineers, or maybe down the hall as the case may be. But they have a different color badge. They’re contractors. While others do this work disembodied over a digital piecework site, like Amazon Mechanical Turk. It maybe even has a different name. One person might be doing work called “community management,” and another person is doing dataset training for machine learning algorithms. And guess what? They both might be doing some form of commercial content moderation. So this—when we think about how workers might self-identify, these are the features that make it difficult. There is a patchwork approach to covering this labor need. So, again, global. And, again, using these different industrial sectors, because there are simply often not enough people available to just be taken up into the apparatus and be put on the job, just going straight through one particular firm or one particular place. This is where we’re now seeing big labor provision firms in the mix. The Accentures of the world and others are now in this—in this field. Workers, again, globally dispersed. And one final thing that I’ll say that I think is actually very key, again: it is often secretive. The work is often under a nondisclosure agreement. Now, many of you know that Silicon Valley asks you to sign a nondisclosure agreement every time you turn around. It’s sort of a cultural norm. But this is taken actually very seriously for these workers in particular.
So I had to guarantee a certain level of anonymity and use pseudonyms and other things when talking about particular cases in the book. I talk about a case called Megatech. And as I was telling the class earlier today, one of the funniest things that I never would have expected is that when I meet people from industry and Silicon Valley, and have over the last few years, they say to me: Well, we’re sure that our company is Megatech. We know you’re talking about us, Megatech. I’ve had, like, six different firms tell me they’re certain that they were the field site. (Laughter.) Now, I can neither confirm nor deny that, so that leaves them a little anxious. But I find it fascinating that so many companies see themselves in what I reported here. I never would have expected that. That’s the beauty of doing research, I guess, and having it out in the world. OK. I just want to give a few provocations or thoughts about—I know everyone here is quite interested in policy implications and things of that nature. So I want to give a couple highlights about that. I’ll say I’m not a lawyer. I kind of like to hang around with them a lot. They seem to be a good crowd for the most part. (Laughter.) But I’m not one myself. But the nature of this work, and the operationalizing of this work, means that I have to have a direct relationship to what’s going on at a policy level, and then even further at a regulatory level. And that’s sort of been an interesting trajectory for me that I might not have expected originally nine years ago. So what’s going on? What are the pressure points on industry around this type of work? Well, I would identify them as follows—and I guess this is sort of in a sort of order, but not really. The first one I would say is regulation. Regulation is the big one, right? That could mean European maneuvers at the EU level. So we’ve seen things like that already—GDPR and other regulations passed sort of at a pan-European level, but also at the member-state level.
Germany, for example, or Belgium, or France, where they have pushed back on firms. We have heard about, and are seeing, movement around antitrust in the United States, for example. We have seen discussion and invocation of perhaps a need to revisit Section 230 in the United States, which is what has allowed these firms to grow to where they are now, because it has given them the discretion to both decide to keep up, but also to remove, again at their discretion and to their benefit, content over the years. And then there’s this—I guess the next kind of layer I would talk about would be litigation. We have seen a number of interesting cases crop up just in the last few years—this is a new trend—of current or former content moderation workers, working as commercial content moderators, who are filing lawsuits. So there have been a couple lawsuits that have been individual cases. One that was very interesting was about Microsoft. They had a hard time claiming these workers were not direct employees because, as you may know, Microsoft got the pants sued off of it a couple decades ago around this issue of having long-term employees they called contractors. So that was an interesting thing with that case, where the people were unequivocally direct employees. But also there is a—there is a class-action suit in the state of California right now. There are also rumblings of some cases being filed in places like Ireland which, as you may know, is a huge operations center for Silicon Valley firms, for no small reason because it’s a tax haven. OK. What else? Journalism. Negative journalistic coverage. This has been a big one, a big pressure point on the firms. Exposés around working conditions for content moderation. We’ve seen them over the years. I’ve partnered with many journalists, and many journalists have broken stories themselves around this. It tends to focus on the negative working conditions and the impact on the workers.
Doesn’t often go to some of the deeper policy dimensions, but it’s the kind of headline that shocks people, and it brings people into a position of, first of all, knowing that these people exist and, secondly, taking issue with their—with their treatment. And that leads us to consumer concern. Of course, the biggest fear for platforms is—well, maybe not the biggest—but a huge fear for platforms is losing their userbase. They want to gain a userbase and they want to keep people coming back. Of course, they’re advertising firms and they connect these users to advertisers. But if consumers become dissatisfied enough they may leave the platform. So when that sort of rumbling occurs, they respond. And then finally, last but not least—well, I guess I would say also academic research has had an impact to a certain extent here. But last but not least, labor organizing. This is seen as a huge threat. Again, the same with the regulatory pushback. I think labor organizing they’re trying to avoid at all costs. I think it goes without saying that these firms site their content moderation operations in places that are probably on the low scale for strong organized labor—places like the Philippines, for example. Places where the United States might have had a long-standing colonial relationship, and therefore firms there can say things like, our workers have great colloquial American English, as if that just happened by happenstance. (Laughs.) It didn’t, right? All right. So I think I’ll leave it there and we can just open it up? Is that good? All right. I tried to be short, sorry. (Laughs.) POWELL: So as is our way here, please just turn your card sideways if you would like to ask a question. I certainly have questions of my own, but I’m going to first turn to you. And I’ll just jump in later. So let’s start with Anne (sp). Q: OK. So I just want some facts. ROBERTS: Yes. Q: Where are these people geographically? What is their demographic? 
Are we talking about Evangelical Christians? What are their value sets? What is their filter? Because—you know, how hard is it to control what they do? ROBERTS: That’s right. OK. So the first company that I came in contact with was this Iowa firm. And this firm’s tagline was quite literally, “Outsource to Iowa, not India.” So they were setting up this relationship of don’t go to the racialized other somewhere around the world. You want your content moderation homegrown, our good Iowa, you know, former farm family workers. Of course, their farms are gone, so now they’re working in call centers. So that was something that they actually saw value in and they were willing to treat as a commodity, to a certain extent. What’s going on now with the larger firms is that—so these are—these sites can be found in places like the Philippines, especially for American firms, but also in India. Then for each country that sort of introduces legislation that’s country-specific—for example, Germany. Suddenly, there needs to be a call center in Germany, because they need to respond quickly to German law, and those people have to be linguistically and culturally sensitive to the German context. So these places are springing up, frankly, like mushrooms all over the world to respond to the linguistic and cultural needs. How do they homogenize the responses? This is the difficulty. Well, you would not believe the granularity of the policies that are internal. If there are—if the percentage of the image is flesh tone to this extent, delete. If not, leave up. If the nipple is exposed, delete, except if it’s breastfeeding. You can now leave that up. Except if it’s sexualized, delete that. So these are the kinds of decisions that have been codified— Q: From headquarters? ROBERTS: From headquarters, correct. And the expectation is that the workers actually have very little agency.
But what they do have is the cognitive ability to hold all these things in their mind at once, which, guess what can’t do that very well? Computers. Algorithms. Not successful in the same way on all of this content. Some things computers can do well, but the cost of building the tools to do this and the worry of having false positives, or losing control over the process, means that humans are filling the gap. I think there’s a sensibility in Silicon Valley that this is just for now. That soon we’re going to have computers that can do this. Q: But— ROBERTS: Right? Thank you. That’s what I say too. And if you talk to folks close to the operations, you know, in a candid moment they’ll say something like, look, there’s never going to be a moment where we let the machine loose without some sort of engagement in human oversight. In fact, when the algorithmic tools are unleashed on some of the content, what has been happening is that it goes out and aggregates so much content that they actually need more human workers to sift through the stuff. So it’s actually not eliminating the humans from the pipeline at all. Hopefully that answers— Q: But in the U.S. case, Facebook, Twitter, they are using Filipinos and Indians? It’s an outsourcing industry right now? ROBERTS: And, again, that’s— Q: In some instances. ROBERTS: Yeah. Yeah. I mean, there’s—again, it’s like a patchwork, right? So there might be folks who are local. There might be people who have specific competencies who are employed to look at certain types of content, or certain cases. An example I would give is Myanmar, which Facebook took a lot of heat for not having appropriate staffing for. You know, they’ve staffed up. So there are people who are, you know, kind of country-specific, like the way we think about people who work in policy work, actually, right? But there is often a fairly significant gap between those people who are—who are putting into operation the rules, and those people who are making the rules.
And that’s another big kind of tension point, if you will. POWELL: Let’s go to Joan next. Q: Hi, Sarah. ROBERTS: Hi, Joan. Q: I’m Joan Johnson-Freese, a professor at the Naval War College. ROBERTS: Hi. Q: Thank you for a great presentation. I’m wondering if you could talk a little more specifically about the gender aspect. ROBERTS: Yes. So actually in my research I found that it was fairly gender-equal in terms of who was doing the work. One of the interesting things however is that in talking to some of the workers who were female or female-identified, in particular one woman who was working on a news platform, she talked about the way in which her exposure to hate speech that was particularly misogynist in nature, or that would typically include rape threats or other kinds of gender-derogatory terms, was affecting her personally to the point that she described herself as—I’m sorry you heard this already—as a sin-eater. And she was supposed to be employed part time, but she found herself when she would be maybe out to dinner, out to a restaurant, sneaking away to check her computer to see what had filtered in. And she talked—she was a person. She’s female-identified. She self-identifies as queer, working class, and oriented towards racial and social justice although she’s white herself. And she talked about the way that misogynist language in particular and threats, homophobic speech and threats and behavior, and racially insensitive and hostile material was starting to affect her so much that she felt like when she was not even on the clock she would go in and check the site, because if she wasn’t there doing it she felt like others who weren’t prepared to see the material were being exposed. Right? So she described herself as a sin-eater to me. And she said, I call myself a sin-eater—as if I knew what that was. I didn’t know what it was, I admit. So I asked her to describe it, and I looked into this later. 
And for those who don’t know, it’s a figure—something of a folkloric figure. But it’s a person who, in the context of England and Wales, was typically a poor villager, man or woman, someone destitute in a particular community, who upon the death of someone more prominent would volunteer to take a loaf of bread or other kind of food that had been passed over that individual and was imagined to be imbued with their sins, and would eat it. The person who had died would therefore be absolved of the sins and go to heaven, and the person who was eating the sins would, I guess, suffer the consequences later. So that’s how she described it. And she—in the book we go into detail about her experience and how it became very difficult for her to separate her multiplicity of identities. But especially as a woman, and as a queer-identified woman, dealing with the kind of vitriol that she was responsible, essentially, for cleaning up. So that was a pretty stark one. (Laughs.) That was—that was tough. Yeah, thanks. POWELL: Let’s go to Catherine (sp). Q: Yeah. This is super interesting. And I actually have experience as an early comment moderator myself, because I was the sixth employee of the Huffington Post, who would get phone calls from heads of—like Dick Cheney’s office, calling and saying: Could you please take this negative comment down about the vice president? And we would—you know, it was from the Secret Service. So, anyway, lots of stories there. But my bigger question is, what—like, it sounds like you’re talking about the labor force and this unrecognized labor force. But then from what you just said, it’s the fact that we have this unbridled comment stream of hate, and how are companies ever going to really reconcile? Like, when is the moment where they finally say: We have to do something bigger than just moderate all day? ROBERTS: Well—(laughter)—what—if we can solve that this evening we can go find VC investment and we will—we’ll resolve it.
But I think—you know, if I can sort of read into what you’re saying, I mean, I think your discomfort is on a couple of levels. One is, this is the function of—good, bad, or ugly, however you feel about it—Section 230’s internet intermediary definition of these platforms as being able to decide to what extent and for what purposes they will moderate. So that’s the first thing. But I think the second thing is a little less apparent. And it has to do with business model. It’s not as if it was a foregone conclusion that the whole world would just flood channels with cat pictures, and this was my sister-in-law’s wedding, and whatever they’re posting or, you know, Nazi imagery or other—you know, terrorist material, child sexual exploitation material. But there’s actually a direct relationship on these platforms between the circulation of material that we call content—which already, again, I would say is a ridiculous, too-general category—and monetization, and treating that material as commodity. So what I’m getting at here is that the platforms are a little bit in a bit of a pickle, to say the least, about how they have developed a business model that’s predicated on a constant influx of new material. Why? Well, because they want us to come back. If it’s just the same stuff all day every day, they don’t think we’re going to come back. What is—what is going to get the most hits from viewers? Is it going to be something really boring and uninteresting, or is it going to be the thing that’s just maybe this side of bearable and everyone’s talking about it because it’s viral, right? So these are the kinds of economics and logics that have been built up around the constant influx of content.
And so it’s gotten to the point where this computer scientist who was at Dartmouth, and is now at Stanford, who developed one of the primary AI tools to combat child sexual exploitation material (and it actually does work very well in that use case), pointed out in a paper that he wrote, and I cited him heavily in a paper I recently wrote, where he said: Look, what’s never on the table when I’m in boardrooms is, what if we slow down the hose of the influx of the material? That’s never under question. And he’s—for heaven’s sake, he’s the developer of this tool. And he’s the one thinking, hello, the always-on, constant-on kind of unvetted uploading from anyone in the world is maybe not an awesome idea, right? Like after the Christchurch shooting in New Zealand, which was a horrible massacre, that was maybe the first time you heard Facebook seriously question, maybe we shouldn’t just let everyone in the world turn on a livestream and go for it. Maybe it should only be trusted users, or people whose info we have, or something, right? So we get back to this problem of the business model. And it’s kind of like the elephant in the room. It’s, like, the thing that they don’t want to touch because that’s how they make their money. They monetize the content that we provide. I’d also say that we are unfortunately fairly implicated. And I mean, like, look, I’m sitting here with my phone, tweeting, doing all of the things, right? We are implicated ourselves as users and being a part of the economy. But I can’t in good conscience tell everybody to throw out their phone and get off the platform, because I can’t do it. So they’re—I don’t know. There’s got—you know, there’s a slow food movement that came up a number of years ago because people were sick of the scary supply chain industrialization of their food, right? And I often think about, who’s going to come up with slow social media? Q: Yeah.
No, that’s sort of my—I have a friend who’s pretty high up at Facebook. And they’re complaining about how the guy who wrote What the Zucc, or something—or, Zucked—advertises on Facebook all the time. Like, the very— ROBERTS: Yeah, right? Q: But then they’re making money off of that. Which is like a terrible cycle. ROBERTS: Which is, like, also—yeah. And these people are probably completely disembodied from that ecosystem anyway, right? So I think one of the other things I just throw in the mix to think about is that we’ve hardly tapped any control mechanisms that might be at our disposal in other realms. So things like—again, like some of these regulatory things. Or even the fact that these firms have, for fifteen years, been able to self-define almost exclusively, without intervention, as tech firms. It’s not just because they have an allegiance to the tech world that they call themselves that, but what if they called themselves media broadcast companies? Well, what happens when you’re in broadcast media? Can you just air anything you want? I mean, George Carlin made a career out of lampooning the fact that you can’t, right? So, you know, one day at some point years ago I thought, let me just go look at the FCC’s rules on broadcast media and what you can and can’t do. Let me go find the comparable thing for social media—oh, right? And yet, they’re all engaged not only in soliciting our material, but now they’re engaged in production of their own material too. I think about YouTube as, like, the prime example of that business model relationship, where we have literally people getting checks cut to them if they—if they get a lot of views. So there’s a whole economy now, and the logic of the platform, that almost goes unquestioned and seems innate. And yet, it hasn’t been that long that it’s been this way—which is one of the things I’d like to think about. I don’t have the solution, however. Remember, I— Q: More like, is there going to be a tipping point? 
I mean, that’s what I—yeah, if you’re seeing it. ROBERTS: Yeah. I mean, I don’t—I’ll tell you this. Like, I don’t like to do prognostication because, again, I decided to do my French degree and not go to Silicon Valley in the ’90s. (Laughter.) But I don’t think—if I had to bet, I don’t think the pressure will come from the U.S. I think the pressure is coming from Europe. Yep, and they’re very, very worried about that. Q: Did you see that the Danes have an ambassador to Silicon Valley? ROBERTS: Yes, they do. I saw that. Indeed. Q: I was just in Denmark. And you know, these people think differently. And they’re going to think harder about the regulation issues. ROBERTS: But you’ll also see—you’ll also see social media CEOs be received as though they were heads of state. I mean, we’re talking about policy that rivals legal code. Q: And economies that rival maybe the GDP of some small countries as well. ROBERTS: Correct. Correct. POWELL: So we’ve got Rufus (sp), and then Kenneth (sp), and Abby (sp). Let’s go to Rufus (sp). Q: So, a two-part question. And they kind of play off each other. So this is mission-critical from a brand point of view, and it supports their advertising, and, you know, you want to have control over your platform. But I’m curious in terms of is the—is it somewhat a resource problem? Like, are they just not investing enough in it, and therefore you have very bad labor practices, and that’s the problem? And then the second part of that, of my question, actually has to do with maybe how it’s different in China, because it seems like they moderate their content real well. (Laughter.) And they have social platforms— ROBERTS: Yeah. Let’s copy that model, right? Yeah. (Laughs.) Q: Yeah, no, but, you know, I’m just curious. Like, clearly they have control over their social platforms in a way that we don’t. And I wonder if there’s anything to learn from that or be afraid of in terms of we should control more. 
Does that— ROBERTS: Well, to answer the first question, I think it—I can’t just say yes or no, right? I’m going to have to— Q: Sure. ROBERTS: Sorry. (Laughter.) I’m sorry. I think it is a resource problem, but it’s also a problem of prioritization. So how can I put this? This function, although it’s been present in some form, I would argue, since the platforms started, was never thought of as central. So it was always a bit of an afterthought, playing catch up. And I think that that position of the activity within the firm has always lagged, essentially. There’s an interesting moment in this film called The Cleaners that I was involved in, where Nicole Wong, who was at the time the general counsel at Google, was up one night making content decisions. So there were people in the firms who knew—I mean, at those high echelons—who knew this was an issue and a problem. But, you know, it was sort of, like, someone else’s problem? And it wasn’t a problem that was seen as—it wasn’t—it wasn’t a bucket that was going to generate revenue, right? It was a cost center. I mean, there’s a lot of ways to slice that. I think you could argue, for example, that, well, a PR disaster in the absence of this activity would be immensely costly, or you could say that a company that has good, solid practices and has an identity that maybe they even build around their content moderation that gives a character or flavor to the platform could even market on those grounds. But the decision was made early on to sort of treat this activity as secondary at best in terms of how it was presented to the public. I think that was also because they didn’t want to be accountable. They wanted to make the decisions and have the discretion to make the decisions. So because it’s always lagged, it’s like there’s been this huge resource shift within the firms to figure it out—go figure, you know, if all you have is a hammer, everything looks like a nail. So the answer is, let’s get computation on it to solve it. 
Well, one of the reasons that they want to use computation is, of course, the problem of scale. So despite there being maybe a hundred thousand people working in this—in this sector, that pales against the amount of content that’s produced. It means that just some portion, some minuscule portion of content is ever reviewed by humans. That’s one of the reasons why they want to use computation. But another reason—there are a few reasons. Another reason is because that’s what they’re in the business of doing. And that, again, also takes out this worry about rogue actors, people resisting, people making their own choices, making errors, disclosing to the media or others—academics, such as myself—what they’re up to, disclosing to regulators or others who might want to intervene. So there are other—so we should be suspicious about some of the motives around the computation. But I think functionally, at the end of the day, there are very few companies that could actually build the tools. I mean, we’re talking about bleeding-edge AI implementation. When I started this research I went over to the National Center for Supercomputing Applications at Illinois. We were in the—in the cheaper side of campus, so I went over to the monied side where the big computational stuff was going on. And I went to this computer vision lab. Now, again, this is 2010, to be fair. But I went into this computer vision lab and I spoke to this research scientist. And I said, look, here’s the problem. Let me sketch out the problem for you. Can computers do this? Is that reasonable? And he said, see that over there? And he pointed at an oak table in the middle of this darkened cube—visualization cube kind of space. I said, yeah. He said, right now we’re working on making the computer know that the table is a table. Like, controlling for every—(laughs)—you know, aspect of the—we’re way beyond that today. But it kind of shows the fundamental problem. 
First of all, what does it mean for the computer to know? Usually it’s pattern matching, or it’s some kind of matching. So the best kinds of computational tools for content moderation are matching against something known. This is why the problem of child sexual exploitation can be effectively treated with a computational tool, because for better or for worse people who traffic in that material tend to recirculate a whole lot of material. So it can be put in a database and can be known. But for stuff that doesn’t exist anywhere else in the world, or is hard to understand, or has symbols and meanings that you have to be a cultural insider to understand, or you have to just be a human being to understand, there are only a few firms in the world that have the staff, money, know-how, and the need to put workers on it. For many firms, it’s just cheaper to get humans. Now, your second question about China, I confess to being an ignoramus when it comes to China. But I would say that, you know, just off the cuff, a huge difference is that Chinese companies don’t just spring up and do what they want from the start. I mean, they are—(laughs)—I mean, they are fostered by the state and they’re typically quite intertwined with the state at first. There is no Section 230 in China, in other words, right? And there’s probably a lot more labor to put on this in China, and more of a sensibility that it’s going on, I think, than in the United States. But people have creative ways around it, always. POWELL: I guess it would be harder to carry out your research in China too, to document what’s going on there. ROBERTS: I mean, yes. Although, you know, I should tell you, I have a new Ph.D. student coming in a matter of weeks. And he’s coming to work with me because he told me he wants to do comparative studies of the Chinese case of content moderation versus the United States case. And we’re on Skype and I’m, like, dude—shut up, dude. (Laughter.) You know? 
Like, we’ll talk about it when you get here, man. I’m, like, all nervous, because I don’t know who’s listening. Yeah. So I think that work will come. And I think we need comparative studies, because I am limited by my cultural context, which is the American one. But that is an important one to understand right now, because of the global impact. POWELL: Kenneth (sp). Q: To what extent can you offer specific normative suggestions on how to improve content moderation towards the ideals that you have? ROBERTS: Well, I think—yeah, it depends on what we consider an improvement. I think for the purposes of the book, it has to do with working conditions. So let’s take that as the goal. And to get ideas around that, I’ve often relied on the workers themselves, since they’ve thought so much about what would help them. I think there are a few things—well, I think there are a number of things that we can think about. The first thing that comes out of everyone’s mouth, you won’t be surprised to learn, is: Pay us more. I mean, it’s sort of a flip response, but I think it says a lot, because when I hear workers say that I hear them say: Value our work more. I also think the secretive nature of the work is something that impacts the psychological difficulty of dealing with the work. So— Q: Excuse me. What are they paid? What’s the range? I mean, are we talking— ROBERTS: So I’ll give you a very recent example. In May, Facebook made a big announcement, sort of leading the way in this arena of how to better support content moderation workers. They’ve taken a lot of heat, so that’s part of the reason. And they announced that for all of their American-based content moderators who are in third-party call centers, or wherever they are in the chain of production, the base rate of pay would be $15 an hour. And in other metro areas, New York, kind of—San Francisco, high-expense areas, it would be a higher rate of pay. So fifteen’s the floor, and then going up from there. 
Q: My maid makes twenty (dollars). ROBERTS: So, right. So this raises some important issues. Q: That’s like basic minimum wage now. ROBERTS: For—right. We know that also, again, they’re a step ahead of the basic minimum wage that will be enacted in California, first of all. So again, thinking about how this—there’s a strategy of being ahead of regulation a lot of times. Q: But without benefits? ROBERTS: Well, right. And then the other thing that this brings up—there was sort of, like, the deafening silence from other industry players. I thought maybe some of them would follow suit. Q: That was way too high, yeah. ROBERTS: Yeah. But they haven’t. Google went on record and they said that, I think, by 2022 they were going to get everyone there. Also, this was American only, but we know that there is so much of this work that’s outside of the United States. Unless it’s a place where the mandatory minimum wage is higher, which might be in some European cases— Q: Not the Philippines. (Laughs.) ROBERTS: Correct. So it’s usually very low wage. The other thing that companies have started doing—Facebook is one, and others—is bringing on psychological support on site. Workers told me a bit about this in their case. And they said that while on the one hand that was a welcome improvement, because they didn’t really necessarily have access to those services, it was in some cases voluntary. And what ended up happening was that the therapist, the psychological services person would come at the appointed time, take a room in the department, and anyone could come and speak to him or her. So that worker who’s struggling and having a hard time has to get off the queue, tap out of his or her work, stand up, walk through the department, walk past the boss, walk past the coworkers, and go in and sit with the therapist—thereby letting everyone know: I’m struggling with looking at content, which is the precondition of my job. 
So some of them said: It would be nice if that were mandatory, and if everybody had to visit with a therapist at some—at some prescribed time. That’s another thing. I think benefits are another big thing. And I would also add that very little has been done by way of developing tools that could be supportive or assistive. When I talked to some of the workers, they were using outmoded kind of homebrew solutions. Or, in the book, we talk about a firm that was using, like, Google tools—like, Google Docs, Google Chat, like, sort of kitbashed or kind of—kind of quasi-internally developed but really, like, just commercially available stuff. I think there’s a market for tools that would allow workers to do things like specify a queue that I’m not comfortable being in today. Like, today I just—if something comes in and it’s flagged child abuse, I just can’t see that today. I’m going to tap out. I’ll take the one that’s, yeah, take your pick, right? Rape threats. I’ll take that one. But, you know, when we—when we as users report content, we usually kind of triage that material. So that could be used proactively on the worker side to allow them to opt out. And it’s not—you know, some days you can handle it, some days you can’t. These were kinds of things that the workers reported to me. You know, usually I’m OK with animal abuse. That day I just couldn’t do it. One guy said, I just can’t take it when there’s someone screaming in the video. So maybe he could look at videos with audio off. So there’s, like, little things that we could do. Making the screen black and white rather than color or making the screen fuzzy might be a tool. Again, based on and maybe tailored to a worker preference. 
Workers told me that they would do things like they would look at the screen by squinting so that they would only get—you know, they would know it was gory and they could tell just by squinting if it was too much blood, according to the rules, or too much kind of violence, and then they wouldn’t have to, like, experience the whole thing. We could develop tools that could do that for them, right? And maybe if they felt like, I need—unfortunately I need a closer look, I’ll press the thing to unveil the entire image. So there are—I think there are a lot of things we can do that have just frankly not been prioritized, right? It’s not the thing that’s going to—it’s not the new function that they’re going to blast around. POWELL: So we have two more questions. I think we can—oh, OK. Q: Sorry. POWELL: No, it’s fine. (Laughter.) So let’s see. We might have to get the last two together. But let’s go to Abby (sp). Q: Sure. So just—it’s a—it’s a bit of an expansion on the question that Kenneth (sp) just asked. But what do you think the changes to the labor workforce would be on the actual product, which is the moderation? So let’s hypothetically say we have a workforce that is appropriately compensated, that is centered, maybe directly employed. How would the product of content moderation change, in your view? What would look different to the user? What would look different to the company? ROBERTS: Well, again, I think there’s sort of a fundamental missed opportunity in the fact that the work was rendered secret, whereas again there were all sorts of experiences we have in our daily life where we look for expertise and curation. So what if we thought of people who did content moderation not just as cleaners or janitors, or people who sweep up a mess—which, of course, are important activities but are typically undervalued in our daily life. But what if we thought about them as people who were curators, or tastemakers, you know? I don’t know, sommelier of the internet. 
I’m just making stuff up, so please don’t—(laughter)—don’t say, that woman said sommelier of the internet. But, you know, people who can help be a guide rather than an invisible agent. I think that that has really hamstrung the circumstances for the workers in a lot of ways. I think thinking about—I didn’t—wasn’t able to get into this in the talk, but there’s a whole host of metrics—productivity metrics that are laid on these workers in terms of how much stuff they’re supposed to process in a given shift, for example. When I was in the Philippines, the workers described to me that they used to have something like thirty seconds per review, per item that they were looking at. And it had been cut to more, like, ten to twelve seconds. Now, another way of thinking about that is that the productivity expectation had been more than doubled. Or that their wage had been cut in half vis-à-vis productivity. So I don’t think anyone benefits from a ten-second look. I don’t think the workers benefit. I don’t think users benefit. I don’t think the ecosystem benefits. Ultimately, I mean, from just, like, just a cost-benefit analysis on a balance sheet, I guess that comes out looking good for the firms. But I don’t think in a perfect world that we get any kind of quality any more than we think of a McDonald’s hamburger and, you know, a—I don’t know, a farm-to-table meal as the same thing. They’re fundamentally different. Q: What you’re saying is things get through that shouldn’t and things that should go through don’t? The famous image of the girl in Vietnam. You know, you all know that. ROBERTS: That’s right—The Terror of War. Q: Right. ROBERTS: Now— Q: You know, they just don’t do it very well. ROBERTS: Right. And you have ten seconds, and you’re a twenty-two-year-old college graduate in Manila, you’re educated—you asked a bit about demographics. All of the workers I talked to were college grads. 
That has shifted somewhat now, but the workers in Silicon Valley in particular were grads of places like Berkeley and USC. But they had, you know, such as yours truly, made the unfortunate decision to major in econ, or history, these other—I’m kidding, right? (Laughter.) Like, I mean, I think these are very important disciplines. But they—you know, to be employed in STEM or to be employed in the valley, they were, like, kind of not prized disciplines. And yet, they actually had the acumen and the knowledge to make better decisions than some of their peers would. POWELL: So let’s collect Lawrence (sp) and Donna (sp) together, and then let you make concluding remarks. ROBERTS: OK. Q: So you’ve been discussing the irregularities, inconsistencies in the workforce in terms of particular categories of content, which need some measure of moderation—sexualized images, violence, and hate speech. But in all these, I think there’s some margin of error. In the case of sexualized imagery, A, it’s—they’ve been able to quantify it, to some extent. I’ve had pictures I took at major museums that were censored because they thought the content was oversexualized. I thought it was silly, but so what that they censored it. Which way they err doesn’t bother me very much, unless it’s child pornography, and you say that they have pretty good methods for that. In the case of violence, again, I hope they err on the side of eliminating violence. It’s not a First Amendment concern or something like that. In the case of ethnic—things that stir up ethnic discord, such as what happened in Myanmar, again, I hope they err on the side of eliminating that kind of hate speech. But what really concerns me is inaccurate—is false content, often spread by governments, the Russian manipulation of the U.S. elections, the Chinese in the Taiwan elections, others in Europe, where it’s a question of facts. And here you have huge competing values. 
You have—here, you’re talking about real political issues, and governance issues, and it really should be a First Amendment right to speak on these issues. And yet, this false information is doing—particularly deliberately spread false information—is doing enormous damage to democracies around the world. So how do you begin to train people to moderate that, which is far more critical? If anything there’s, to me, less room for error. Less room for error in censoring what should be allowed. And it’s quite a tragedy that so much of this is being propagated and that we’re unable to control it. So how do you begin to deal with that? How do you train a workforce to deal with that? POWELL: So we’re going to move to this question. We’re going to collect Donna’s (sp) question as well, and then you can answer them together. ROBERTS: All right. Q: Sarah, I don’t know how you remember that. I’ll make mine, I guess, kind of simple. You are familiar with the Verge—the Verge articles that have come out? ROBERTS: Yes. Q: I guess one question I have for you is I’m trying to get my head around listening to this and saying: What is it that’s really concerning you? Because part of this conversation has been about the worker, about the human piece of it. You have asserted—and I’ll say it’s an assertion—that technology can’t clear—can’t significantly reduce the gap, it seems. And then we’re talking about the social media companies, but we know that this is an internet issue. It is not just a social—it is not just Google, YouTube, and a Facebook issue. So it’s like, when you sit there and look at that—so I was trying to figure out too, OK, is your angle, you know, this is—we’ve got to go after—is this about Facebook and Google? Because if you think about it, right, they’re cleaning—they’re required, in essence, because they are commercially operating a channel, to keep that as clean as they can. And we do regulate that a little bit, right? 
But the fact of the matter is, our challenge in the content era is this content can show up anywhere on the internet, on any—you know, any website. And that’s the challenge. I’m sure if you followed, you know, child pornography, right, they’re not just looking on social media channels. They’re going to find it anywhere, including the Dark Web. You know, anywhere, parse video. So I guess it’s, like, who are we as a society looking to to address this issue? And I guess, is it the worker piece that you’re—are you—and I understand there’s a big issue with humans, you know, involved in the processes. POWELL: You have approximately a minute and a half to answer both questions. (Laughter.) ROBERTS: So the answer to your question is, yes, it’s the worker welfare piece that first compelled me, yeah. And I think I wanted to address my remarks for an audience that I thought would have maybe more direct relationship to policy issues and regulation. But that’s—the book is concerned with the worker welfare, and that’s what my concern has always been, and that was my point of entry. I think what I found is that you can’t really carve that out somehow from the other issues. So for me, that was a foot in the door to now I have to understand the ecosystem. So what I tried to do was also map that out to a certain extent. I’m not certain that—(laughs)—I mean, I’m not sure I would necessarily agree with you, per se, in the way that you framed up the issue of it’s not an XYZ issue, it’s an internet issue, in the sense that I would say this: I find it difficult to, in the American context, locate many internet platforms or services that are not commercial. And that’s part of my—you know, that’s part of the claim that I make of why there is an ecosystem of this work going on. 
It’s because there was great profit to be made in setting up channels that encouraged people to upload, and to do it all the time, and to actually, in some cases indirectly but in other cases directly, monetize that activity. And that is fundamentally different from what the internet used to look like, which was not—I’m not Pollyanna about it. It wasn’t the halcyon days. In fact, it was a real mess for a lot of the—a lot of the interaction. But it was a different kind of mess and a different set of problems. So that’s sort of the conceit here. But it’s not some—you know, it’s not—it’s not a simple case of exploitation writ large without any other complexities. And it’s not a simple case of Facebook is trash, and sucks, and should close down either. Which has put me in the weird position of, like, working with these people, right, to problem solve. The other question was about basically veracity of information and gaming of the platforms. The one soundbite I’ll give you with that is I think that the issue that you raise is fundamental to the protection of democracy around the world. And I would also say that it’s much harder to make determinations about those issues than it is to know if too much of a boob is showing. And so what the companies tend to do—and I call them on this all the time. I say, your level of granularity on things that maybe don’t matter is in the absence of your ability—or your willingness, let’s say, to articulate your own politics. Because guess what? Other countries where these platforms are engaged don’t have the same commitments to democracy, or to freedom of expression, or whatever it is. And they want to be in the Turkish marketplace, and they want to be in China. And that’s put them on the ropes, and put others in the position of making demands on the firms of, like, well, what are your commitments? Well, they’re very mushy middle. 
And so then it’s easier to look for and take care of, in a way, some of this content that is obviously bad, versus sitting and spending time, and money, and energy figuring out is this truthful or false? Is this from a vetted source, or is this propaganda? And I think, just to close out, your point that state actors are the ones who should be scaring everybody the most is a great point, for sure, because those are the folks, like you said, who are calling up Facebook and saying: Take down blah. POWELL: Yeah. We should end it there, but please join me in thanking Sarah Roberts. ROBERTS: Thanks. (Applause.) (END) This is an uncorrected transcript.
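[Editor's note] Roberts's earlier point that computational moderation works best when "matching against something known" can be illustrated with a toy sketch. Real systems use robust perceptual hashes such as Microsoft's PhotoDNA; the simple average-hash below, and every name and value in it, is an illustrative assumption, not any platform's actual code.

```python
# Toy sketch of database matching: hash each image, then flag uploads
# whose hash is close to a hash of known prohibited material.
# (Real perceptual hashes are far more robust; this is illustrative only.)

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string
    with 1 where a pixel is brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

def is_known(image, known_hashes, max_distance=2):
    """Flag an image if its hash is within max_distance bits of any
    entry in the database of known material."""
    h = average_hash(image)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# A known image, a slightly altered copy (e.g. re-encoded), and a novel one.
known = [[10, 200], [30, 220]]
database = {average_hash(known)}
altered = [[12, 198], [29, 223]]   # small pixel-level changes
novel = [[240, 5], [250, 8]]       # unrelated image

print(is_known(altered, database))  # True: matches despite the alteration
print(is_known(novel, database))    # False: nothing known to compare to
```

This is exactly the asymmetry Roberts describes: recirculated material can be matched against a database, while novel or culturally coded material has no prior entry to match and still needs human judgment.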
  • Women and Women's Rights
    Women Revolutionizing Blockchain: Cryptocurrencies for Change
    Podcast
Will cryptocurrencies radically alter the world's financial institutions? If so, will women help redesign them and will their needs as economic actors be taken into account? Amber Baldet, cofounder and CEO of Clovyr, joined the Women and Foreign Policy program for a discussion about women as both cryptocurrency entrepreneurs and users.    POWELL: OK, I think we should go ahead and get started. I want to welcome everyone. My name is Catherine Powell. I’m an adjunct senior fellow with Women and Foreign Policy here. Thank you so much for coming out. I wanted to especially recognize a couple of our advisory committee members who are here this evening: Masuda Sultan and Agnes Metzger. Thank you very much for your ongoing support and advice. We have many experts in the room so we’re going to structure this as a fairly interactive discussion, and what—I’m going to introduce Amber in just a moment, and then I’m going to basically pose a series of questions, sort of interview format, and we’ll do that just maybe for about fifteen, twenty minutes, and then open it up to you. So you can—I’m not going to read her whole bio, but you can see how impressive her background is. She is the cofounder and CEO of Clovyr. She led JPMorgan's blockchain efforts. She has been listed on Fortune’s “40 Under 40” list of the most influential young people in business. And, you know, I think we can think of her as the Daenerys Targaryen of blockchain—(laughter)—and I don’t know—you know, for those of you who follow Game of Thrones and saw the last episode, maybe we’ll just, you know, bracket that—(laughter)—that last episode part—but she is pretty much of a, you know, real powerhouse in this field. So let me just start by asking—because I think many of us come to this with different levels of understanding of what blockchain is, and I definitely put myself in the I’m-still-learning category. 
We did provide a couple of handouts outside which you are welcome to pick up afterwards, including a blog that I did maybe a year ago and a couple pieces that Amber did at a much higher level of expertise. But I did want to start with just sort of really the basic question of what exactly is blockchain and cryptocurrency, and how does it fit into the broader landscape of financial technology or fintech? BALDET: OK, well, if you ask thirty different people in this industry, you’ll probably get thirty different answers, and there is certainly a varying degree of expertise in the room at this point. So I apologize to anyone who already works at a blockchain startup and is going to feel extremely explained down to. But it is—I think it is important to kind of level set the verbiage you use when you talk about this technology because you can end up in very different places depending on what you are talking about. And we’re not going to get super professorial here, but just so that we have something to look at, I brought a couple slides. So I think the first thing to realize is just that there is no such thing as the blockchain. It’s a type of technology. Just as you think of a spreadsheet as there are many, many spreadsheets—you probably have thousands of them on your computer right now—you would never say, let’s put it on the spreadsheet. There are many of them. And even when we talk about different spreadsheet applications, some of you might use Excel, some of you might use Google Sheets, some of you might use, I guess, Numbers or whatever the Mac equivalent of that is. And so they’re all a little bit different, and they don’t actually interoperate despite the fact that they do the same thing. But they are a spreadsheet. So there’s lots of blockchains. 
The ones that are listed here—these are their icons or their logos—these are all public blockchain networks, and what that means is that if you wanted to, you could go home right now, download the software, run it on your computer with varying degrees of ease. Let me tell you it is not easy, but it’s getting easier. And you can connect to the network. No one can stop you. You don’t need to ask anyone to participate. You do not need to buy any software. If you would like to and you have the expertise, you can download and audit the software yourself, look at it. It’s all open source which means there is nothing hidden, there’s nothing proprietary, and there is no company in charge of this. You can run it in your country, you can run it in any other country around the world. You can run it in the cloud, on infrastructure that you don’t control. You can run it on your own computer in your basement and not tell anybody about it. You can connect to the public Internet, you could connect through privacy-preserving networks like Tor, and because of that, it’s considered this open, public access sort of system, just like the Internet. What’s different about it, though, is that—and before we move ahead, I guess, I’ll say there’s also distributed ledgers, so you hear about enterprise blockchain and—don’t call it a blockchain; it’s a distributed ledger. These are more of the permissioned systems where you do need to ask somebody, but often the software is still open source. They function a little differently. We can talk about that later—before we go to that. But the difference is that, rather than just the Internet where you get to transmit data around the network, you can transmit scarce representations of something around a blockchain network. So often what you hear about now is something like bitcoin, which is just one example of a type of a token. It’s scarce. We know what the actual existing supply is and the monetary curve of it is predefined in the code from day one. 
There is no sort of federal reserve that can change that policy; it simply is what it is, and because of that, the non-modifiability of it provides a certain degree of risk reduction for people in understanding the future curve of the currency. And so, because of that, people will invest in it knowing that new coins will come out at a certain rate. But that’s simply bitcoin, and what it is attempting to do is to be money, right, or a digital store of value—we can all argue about these things—but it’s something that when you send it to somebody else, you don’t have it anymore. And that’s the fundamental difference of the Internet. If you take a picture on your phone and you send it to your kids or your mom, you have a copy of that picture and now they have a copy of that picture, and you are both happy. But if you wanted to send something like a dollar or, theoretically, a vote—although we can talk about voting on a blockchain later; it’s technically very difficult—or intellectual property rights, or a land title, or a share of stock, you can’t actually do that right now in a peer-to-peer way without asking somebody else. You ask a bank, you ask a transfer agent, you ask a custodian. There’s any number of intermediaries that you have to involve. And the difference here is that it literally is just like cash. You can hand it over the Internet, you no longer have it anymore, and everyone can agree that that transaction happened and can never unhappen. That’s what a blockchain is. (Pause.) Do you want—should we just go through the rest of these right now or do you want to go ahead? POWELL: If you—whatever, it’s up to you. BALDET: So I would just say that the winners right now are not clear, and it’s kind of like standing in the ’90s and trying to imagine the Internet. Anybody who tells you they know exactly what’s going to happen is probably trying to sell you something. (Laughter.) 
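Baldet's point that bitcoin's monetary curve is "predefined in the code from day one" is literal: the issuance rule fits in a few lines. A minimal sketch in Python (not the actual Bitcoin Core source, which is C++, but the same arithmetic): the per-block subsidy starts at 50 BTC and halves every 210,000 blocks, which is why the total supply can never exceed 21 million.

```python
def block_subsidy(height: int) -> int:
    """Subsidy in satoshis (1 BTC = 100,000,000 satoshis) at a given block height.

    The subsidy halves every 210,000 blocks, implemented as an integer right shift.
    """
    halvings = height // 210_000
    if halvings >= 64:  # shifting this far would always yield zero anyway
        return 0
    return (50 * 100_000_000) >> halvings

def total_supply() -> int:
    """Sum the subsidy over every block ever to be mined: the supply cap in satoshis."""
    total, height = 0, 0
    while True:
        subsidy = block_subsidy(height)
        if subsidy == 0:
            break
        total += 210_000 * subsidy  # every block in this halving era pays the same
        height += 210_000
    return total
```

Summing the schedule gives roughly 20,999,999.98 BTC: just under the 21 million cap, with the shortfall due to integer truncation at each halving. That fixed curve is exactly the "no federal reserve can change that policy" property described above.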
But, you know, when people—when Apple first launched the iPhone, people thought they were crazy, right? And now they really have such a large market share, right? So I think that even looking at who is so hyper-dominant now does not necessarily mean that there is not going to be a different innovation or a different kind of market pressure that is going to change winners very quickly. And so the differences in designs that we saw—this gets a little technical so we don’t need to spend too much time here—but you should really understand that, as I said, there’s no such thing as the blockchain. The technology itself really spans a spectrum. And it’s not like someone just sat down one day and, quote, unquote, “invented the blockchain.” It’s actually a confluence of several different technologies that were invented over 30 years ago, really, but just had never been put together in this sort of novel way. And so, depending on how you mix and match those components, you can come up with something where you have an open, public, trustless or trust-minimized sort of system, or you can have something that is a closed kind of single operator system but provides strong guarantees about consistency of information. POWELL: Let me, if I can jump in— BALDET: Yes. POWELL: —because I want to—I want to bring in the gender dimension of this and then come back to some of the broader issues around blockchain, the sector, and how we should think about potential regulation that might get to some additional information you have on the slides. But let me—let me bring in the gender dimension and ask just how you negotiate, being a woman who has been very active both in blockchain—and of course the fact that women are underrepresented in the digital economy more broadly, you know, beyond blockchain and cryptocurrency. 
If you could talk a bit about the barriers to access for women, other underrepresented groups in the digital economy and blockchain—what are those barriers, how might we think about overcoming them? BALDET: Sure. So there’s not anything that is specifically magical about blockchain or cryptocurrency as to how we address gender or other underrepresented populations in a different way. It’s just like the rest of tech—(laughter). It’s not—it’s not just a pipeline problem. You know, I—coming from more of a traditional finance background, there’s also not a whole lot of women there, you know, and cryptocurrency sits at this confluence of—well, there is open-source software, there is financial services, there is cryptography, there is kind of this cypherpunk hacker thing going on. All of them are a bit bereft of women. Cryptography is actually an interesting separate use case in that there are a large number of very well-known female cryptographers, but still, as a percentage of the field, it’s small. So in the middle of this you have this new kind of cryptocurrency thing, which is exactly what I was saying: this is technology that has existed for the last 30 years, and what I found to be different is that there is this perception—people will say, it’s open; go, do your own research, you can learn about it. The code is open source, audit it yourself—as though all of us are just going to sit down and, you know, be able to read through that ourselves. But because you can learn about it, there is this idea that it’s a greenfield meritocracy and does not actually inherit all of the prior biases of the contributing technologies. So overcoming that is a challenge and something I think we’re working on. I do think that people have been very dedicated from the beginning to putting up women-in-blockchain groups and focusing on it—almost overcompensating in that way. That’s how I ended up on all these lists, I guess. I don’t know. 
I didn’t—(laughs)—you know, you get on one and then, I think, you know, they just kind of replicate themselves. And because of that there is a lot of opportunity for people that would like to enter the field. POWELL: And I wrote—I co-wrote a report a couple of years ago on women in technology and emerging economies that looked at—you know, some countries, women are actually pretty well—or girls are pretty well represented in—you know, majoring in computer science in college, but it’s more of the transition from college into jobs that gets tricky. Here, this is such a new field that some of the pipeline issues may still be not entirely clear. But let me ask a different type of question, which is not just women’s participation as workers in the digital economy—or here, blockchain economy—but the benefits for women of blockchain. There’s cryptocurrency but there are other applications as well that might be helpful for empowering women and girls. I wonder if you could talk about that as well. BALDET: Sure. I mean, I think in order to kind of break down the barriers to bring more women into the field in general—as I said, it’s not a blockchain problem; it’s a dismantle-system-of-patriarchy problem. (Laughs.) But that’s perhaps a conversation for another day. When it comes to adoption and usage, it’s a little similar in that public cryptocurrencies are very helpful for people who need to do things outside of a traditional system. So there’s a flight to using them when the traditional system does not meet your needs or you have been disenfranchised in some way. So some of the first usages were for cross-border remittances. If you were to send a Western Union, MoneyGram, you know, it might cost you twenty dollars. If you are trying to send forty dollars, that’s a very high transaction cost. And so being able to do that for pennies or less on the dollar was very compelling. 
Now of course, as I mentioned, we had the—if you think of public cryptocurrency as an actual cash equivalent, the difference is—and this is where people start talking about money laundering—but you generally don’t roll up somewhere with a truck full of cash. But you can take a QR code that represents many millions of dollars, put it in a pocket, and cross a border. And so it’s kind of like high-powered in that way, and that’s what has tended to scare people. And now we start talking about regulation, and gatekeeping, and know your customer, and AML in ways that we don’t necessarily just talk about, you know, buying a coffee with five dollars. So you see populations that cannot get bank accounts in other countries looking to cryptocurrency as an alternative. You see—there’s a specific group called Code to Inspire in Afghanistan where women are not allowed to hold bank accounts or have part-time jobs. The girls learn to code and then they are paid in bitcoin, and then there is kind of a convenience store where they can buy actual goods there. And so they don’t have to ask anyone. You can take your wallet with you, and you can hold this wallet—it’s not really a physical wallet; it’s just on your phone, or you can even write it out on a piece of paper if you want. Nobody can really take that away from you, right? As long as you have it, it can be private in that way. There’s also applications to other groups that have difficulty getting traditional banking services whether that’s consensual sex workers or various other kinds of excluded populations in the LGBT community, people that have been kind of false-positived away from banking rails. POWELL: Interesting. OK, so then it sounds like women then can use this as a tool to get around some of the restrictions, whether it’s you need your husband’s permission to open a bank account or other access to economic tools. You’ve also written, spoken about surveillance capitalism and some of the privacy concerns there. 
How can we think about blockchain as a way of addressing some of those broader privacy concerns? BALDET: Yeah, it’s an interesting problem. I think if you look at the way that disenfranchised populations are treated, it’s an interesting way to see kind of how you might be treated later under less great conditions. And so as we’ve seen that kind of financial exclusionism, when you look at—or you’ll hear about a cashless society being this great thing that’s coming, right, like, you don’t—we no longer need cash at all; in fact, in the Scandinavian countries they have almost completely eliminated paper cash, and people are very happy about it. Other countries who have done this include China, and you end up with very different systems. When you implement a cashless society what that means is that there is an intermediary for every single transaction that you make. Right now you might pull a five-dollar bill out of your pocket at a farmer’s market, buy some asparagus, and call it a day. Now, you might use your credit card, and it’s great because you don’t—you forgot your cash. Instead you are using Square. Somebody swipes it on your phone, and it feels convenient. But the thing is you are actually asking a bank to approve that transaction. If you used your credit card, yes, it’s because there’s an extension of credit, but even with a debit, it’s really their contractual obligation to honor the commitment that the bank has made to you to hold your money, and they can at any time say, actually, we don’t think that’s asparagus. So in a cashless society, public cryptocurrency becomes the digital equivalent to cash. It is really the only way that you can do something privately. It’s easy to fall into the trap of thinking that, if you don’t have anything to hide, that you don’t necessarily need this privacy. 
But there are some interesting First Amendment arguments especially around freedom of association, for example, that the government should not be able to see if you have, say, donated to the ACLU, or the NRA, or the Southern Poverty Law Center; that if they can observe that, there is a chilling effect, and so it’s already constitutionally protected. And so as we move towards a fully cashless society, we’re going to need to confront how we can handle privacy in that system. Cryptocurrency might be the only actually bullet-proof solution therein. So we don’t know yet. POWELL: So let me just ask you one more question then I want to open it up, so folks can be thinking about what you might want to ask, and that is what you see as the future of blockchain and cryptocurrency five, ten, fifteen years out on the horizon, and what if anything do you think is the role for potential government regulation? BALDET: The regulatory question is challenging. I was able to testify to the House Agriculture Committee earlier this year during some hearings about whether or not public cryptocurrency should be considered cash, security, or commodity, which is an interesting question. But I do think that, especially in the U.S., where we’re very business focused, we might be missing a larger question around what it means to be rebuilding or building the next-generation Internet infrastructure. We truly take for granted that much of the Internet as we know it was developed in the West and with our sorts of ideals in mind, and that it has been a democratizing force around the world. But if you look at other countries that are embracing cryptocurrency quite early, they have different ideas potentially about how these networks should work. And without proper privacy controls, you can end up in a situation where our entire national economy is queryable like a database by anyone and any actor in the world. And we probably don’t want that. 
It’s difficult—it’s a difficult conversation to have because there are very good arguments on the side that you do not want to have completely private digital currency either because of money laundering, and terrorist financing and other things. But we have the same issues with real cash, you know, or fiat cash, sovereign cash—whatever you want to call it. We have the—you know, we don’t necessarily put potholes in our roads to prevent bank robberies, and we’re in a bit of the same situation. It’s similar to the—what were called the “Crypto Wars” of the ’90s, which had nothing to do with crypto as in bitcoin but about strong cryptography, and the export controls around software coming out of the U.S., and how it was able to be used around the world. There are competing cryptographic schemes globally—and a lot of times when I say crypto I actually mean cryptography—(laughs)—you know, and there’s a fantastically educated, and insightful, and motivated group of people that cross the boundaries between some of this advocacy for cryptocurrency as we know it now, and it looks kind of cool and sexy, and actual cryptography problems, information security problems, and national security implications of information security policy. So there are people working at that intersection, but it’s often overshadowed by the hype of what’s the price of bitcoin today, and that’s very disappointing. POWELL: Wow, OK. I have so many more questions, but I want to— BALDET: Could I answer the last part of the question that you had? POWELL: About— BALDET: I feel like I totally skipped something. POWELL: About government regulation? BALDET: No, I think it— POWELL: Or sort of the future of blockchain? BALDET: Oh, right, data, data. POWELL: Oh, yes. BALDET: OK, there we go—data. I knew I was forgetting something: data. And what I work on at Clovyr, we’re working on infrastructure deployment, and orchestration, and developer tooling. 
It’s very unsexy, but it’s important to the actual kind of pipes and plumbing of the Internet and when we think about how things run and connect together. And so you hear a lot these days about data privacy and the changing public sentiment around our privacy and the data that we’re creating—GDPR coming out in Europe, similar California privacy laws—but all of those sorts of regulations govern consensual data relationships that you have with the vendors that you work with. You sign up for a service, you check the box, you are getting something valuable from them, and therefore they can see your data. Some of that might be minimized in the future. They just won’t take as much of it because there is too much risk. But on the other hand, there are ways that we can start to keep our data locally and privately, and encrypt it ourselves maybe, whether it’s on your phone or on a personal cloud, and then allow people to do things with your data without them actually seeing it or having access to it necessarily. And so you can think of it as the application coming to your data rather than you sending the data to the application. And that has really interesting possibilities when we talk about machine learning over private data, about businesses being able to derive the same kind of business insights they get from these massive data lakes but actually being able to unbundle those lakes, which would be great for everybody. And also the competitiveness within industry verticals, so with the way that Google and Amazon are hoovering up this data right now, they can compete with almost any specific industry and the leaders in that industry. So within the banking industry, for example, there might be five companies that have enough data to really run their own credible machine learning and what they will now call artificial intelligence and cognitive computing, but whatever. It’s just math. But you need a lot of data to run it, so you have to be very big. 
But what you can do with some of this data coordination technology, which does not mean you put your data on the blockchain. I think anyone who tells you to put your sensitive data on the blockchain does not understand privacy. But with some of this new privacy technology, you could get a lot of minor players in an industry to actually collaborate without disclosing their data to each other and become more competitive. So fostering that kind of technology and that kind of research could be really important to both antitrust considerations, but also just general market competitiveness where we’re seeing these—kind of a narrowing of the field. POWELL: OK, well, as I said, I could—I have several more questions, but I’m going to hold my fire and open this up. So, as is our tradition, just put your card sideways if you have a question, and— BALDET: That’s good. I’ve never heard that before. POWELL: Yes. (Laughs.) And then—also, I should have said up front that this is on the record, so this is being recorded so that those who couldn’t come this evening can access it online. Let me start with Masuda, and then I’ll go around to other people. Q: Is this on? POWELL: Yes. Q: Thank you so much for this session. I am so excited to come here today. Amber, I wanted to ask you, we were just at Blockchain for Social Impact Conference, and I have limited engagement with this space, but I’ve noticed that a lot of the social impact solutions seem to be not very scalable. They all seem to be very new to me, and so I wanted to ask if there were projects particularly related to women and girls that you thought were very scalable or were being scaled at a high level. BALDET: So this is a very specific question. I did get to participate on Newsweek’s blockchain for impact awards this year, so I think we screened over three or four hundred different projects that are trying to achieve impact—whatever that specifically means. It’s very hard to measure. 
I don’t quite understand why blockchain has suddenly gotten this patina of being so applicable to social problems other than it opens things up, but there is also an inherent conflict, I believe—I’m not sure if this is true, but it’s my belief—that publicly accessible blockchain networks function best as a commons, and that’s where everyone participates, and everyone can get something back for what they have put in. It seems to be fundamentally incompatible with a venture capital model, and so we saw these kinds of—the explosion of tokens, and ICOs, and things last year. A lot of that has imploded. A lot of the investment gains that people have made are about them selling things downstream, not so much about actual adoption and growth of the projects themselves. So where social impact projects, to me, seem most promising is where they are community based and where the participants—they are working together, and it is in a small way. But rather than scaling it to one global project where you get hockey-stick growth of a million adopted users, you can spawn the same kind of projects many times for local populations. So the—one of the groups that I got to do a writeup on—which was great—was Grassroots Economics—I hope that was correct—but they started before blockchain, going into communities in Kenya and other nearby countries, and creating a web of trust because the problem was that, at the top of the—at the top of the food chain in the country there was too much volatility in the national currency, and it was causing currency shortages in local markets. And people couldn’t keep up with price fluctuations; they had no access to transparent pricing information, right? 
So by creating, leveraging the local web of trust, if I interact with these kinds of retailers, they pay my teacher, I pay the grocer, we all—like, we know each other and we’re neighbors there—they are able to extend each other local credit and create—literally create their own currency that was not pegged to the national currency. Now if you do this at enough scale in the right country it would probably just be straight up illegal from the beginning, and they did run into some regulatory issues, but they have now moved that on to the blockchain. They are working with the government, and it has really eased liquidity, and it is now running the same project but in fourteen different places, I think. And it’s not all connected in one thing, and there is no profit model. POWELL: I’m going to come over here to Patricia. BALDET: That wasn’t a girls’ thing, but I’m not sure I actually know a girls’ one. Q: No, but this follows directly on this conversation, and I really appreciate the clarity of your presentation, Amber. It’s really, really excellent. There have been other meetings here at the Council where it has been more obfuscating, and this is very clear. So thank you very much. (Laughter.) But I want to follow up on the— BALDET: It gets me in trouble, but I try. (Laughs.) Q: But it’s important because I want to follow up on the gender question. So I have done work on the Grameen Bank. At the end of the ’70s, this was Muhammad Yunus presenting credit for women at the household level, this is going to be a breakthrough, women are going to no longer be stuck in their households, they’re going to—so it succeeded in some ways, and it failed in major ways because there was no other amplifying factor so that women could use the credit, women could benefit, could become entrepreneurs, could pay off their debts. They went into debt. 
And so I’m just wondering is this moment—and I think the—I hear how—and what you just said about keeping it local is very important, but I worry about the adverse impact if women think this is going to be a way—especially for marginal populations, if they think this is a way that they are going to be able to have access to resources and be able to fund things that they didn’t think they could do before. So I’m just wondering what are the controls that protect—not regulations—but the protections for women who are least able to—in the position right now to protect themselves? BALDET: There are no protections right now. Using a lot of these currencies, if you try to send money to—it’s kind of like you put in a phone number, but it’s a little more complicated than a phone number, to identify another wallet. If you put in the wrong number you might lose your money. If you forget your password, you might lose your money. There’s a whole number of scenarios where you might lose your money, and in a way, it’s considered a feature of the system because you have full control over it. I worry about the same thing. I think about how difficult it is for people to simply manage their passwords; now we’re going to ask them to manage their private keys, and it’s going to get a lot of people in trouble. And I have this conversation—I know many brilliant, wonderful academic, you know, hacker, cypherpunk, lovely people who have been fighting this fight for thirty years of saying, everyone should care about their privacy, everyone should know this, everyone should learn to code. But that is such a myopic view. And a lot of them tend to look the same as well, right? So they are talking—it’s a very insular echo chamber that you should think what I think, and then, you know, we’ll solve this problem together. So we do work on usability as a whole around all of this software so that it will get better. Early adopters and early projects are the worst off. 
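The "put in the wrong number and lose your money" failure mode Baldet describes does have one existing mitigation: classic Bitcoin addresses are Base58Check-encoded, meaning they carry a four-byte checksum, so a single mistyped character is almost always rejected by wallet software before anything is sent. A minimal verifier using only Python's standard library (illustrative only, not wallet-grade code):

```python
import hashlib

# Base58 alphabet: no 0, O, I, or l, to avoid visually ambiguous characters.
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check_verify(addr: str) -> bool:
    """Return True if the address's trailing 4-byte checksum matches its payload."""
    num = 0
    for ch in addr:
        if ch not in ALPHABET:
            return False
        num = num * 58 + ALPHABET.index(ch)
    # Leading '1' characters encode leading zero bytes in the raw payload.
    n_leading = len(addr) - len(addr.lstrip("1"))
    body = num.to_bytes((num.bit_length() + 7) // 8, "big")
    raw = b"\x00" * n_leading + body
    if len(raw) < 5:
        return False
    payload, checksum = raw[:-4], raw[-4:]
    # Checksum is the first 4 bytes of a double SHA-256 over the payload.
    return hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4] == checksum
```

Change any one character of a valid address and the checksum check fails, so the funds never leave; the checksum cannot, of course, help with the lost-private-key scenarios also mentioned above.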
I think that, much like logging into Facebook by—you know, you can use your Facebook login elsewhere on the Internet, that stuff will come, but it’s a couple of years out. What is interesting to follow, though, is what is now called DeFi, which is like decentralized finance. The hashtag is like D-E-F-I—DeFi. And this is a number of solutions where people are trying to replicate things like what you are talking about—credit markets, microloans, property transfer, fractional tokenization of larger assets and things. And it’s really promising and really interesting, but there is no regulation. And the difference is if you look at something like bitcoin, bitcoin is a real-time gross settlement system in banking terms. That means that there is value, it goes somewhere else, it settles. It was explicitly created as a reaction to the economic meltdown in 2008. It is not meant as a system of credit, it does not work—it does not enjoy short selling, you cannot really do securities lending with it. People have tried to retrofit a variety of those kinds of things onto it, but it wasn’t made to do that. When you look at some of these other networks, like Ethereum where the decentralized finance stuff is happening and some others, they function—they allow more, they like to call it programmable money, so they are made to handle extensions of credit. The problem is that right now there is no way to accurately—at all—manage risk. And just because you have offered someone a loan and they can repay it does not mean that you—like, general people, regular people do not know how to price risk. And until you have a robust market and way to get competing quotes so that you can figure out and automatically tell people what they’re getting themselves into, this stuff is off-the-charts risky. POWELL: I have Naureen Kabir. Q: So I would love to hear you talk more about terrorist financing and money laundering. I work with the—for the NYPD’s Intel Bureau. 
And you’re right, our money-laundering and terrorist-finance cases are difficult enough with conventional ways of transferring money. But I’d love to hear more from you about the other security challenges that you foresee that exist with cryptocurrencies and blockchains and things that we perhaps aren’t even thinking through. We’ve obviously seen an increase in individuals expressing interest in transferring money via cryptocurrencies. You know, it’s a world that I think we’re just starting to get insight into, so I’d love to hear more about some of the challenges you think exist in that realm. BALDET: Yeah. It’s a bit of a misnomer that bitcoin is private. It’s pseudonymous at best. But it is pretty easy, especially if you are targeting someone or a group of someones, if you can access their computers, their network traffic, a variety of other things, you can find out what all is going on. Q: But if you can’t and you’re relying on legal process. BALDET: Right. So the entire ledger is public and there are a number of companies, like Elliptic and Chainalysis, that perform—they do analysis over the publicly available blockchains. And when it comes to something like bitcoin, it’s a lot more fruitful because it is not actually as private as people think. Now, criminals are often not quite as smart as they think they are, and so as long as they keep using bitcoin that would be great for everybody. The thing is there are cryptocurrencies that are more private than bitcoin. Something like Monero is one example, another is called Zcash. I am on the board of the Zcash Foundation for what it’s worth, which is a nonprofit that seeks to work on internet privacy. But even when you’re looking at something that is meant to be, as an academic project, a creation of a fully private, digital currency, there are features like selective disclosure built in from the beginning. So if you would like to disclose your transactions you can. 
For example, you could walk into a bank and, just as you open a bank account, you could open a Zcash wallet and have agreed to share this information with your bank. What that means, though, is that you can end up creating different kinds of classes of this digital money. And it’s one of the problems with bitcoin. It’s not actually fungible. Like, did you hear about the NHS ransomware hack? I think it was almost a year-and-a-half ago at this point. But they were basically ransomed—all their data was encrypted, and the attackers wanted some bitcoin in order to decrypt it. But you could look at the malware, you could look at the address that was in there, and you could watch it on the public blockchain—you could watch the bitcoins kind of coming in to unlock these files. And first of all, they only made about thirty thousand dollars, which I think people were expecting to be a lot more. But forever, you can trace those coins and see that they were involved in this hack, this crime. And because of that, it creates nonfungible currency. It’s like—you know, I guess there’s a bit of an anecdote that every hundred-dollar bill has some trace of cocaine on it in the U.S. Certainly true in the ’70s; I don’t know about now. But you are not held responsible for that, right? Like, if you put a hundred-dollar bill in an ATM, it is not legally your responsibility to have known that or cleared that. So the issue that banks deal with is understanding how many hops back they’re responsible for, because, in a way, it’s significantly more traceable than actual cash is, but we’ve created this kind of social contract and business contract that as long as you’re following traditional AML rules—you know, ten thousand dollars, this kind of thing—you can put cash in an ATM. And we can do the same thing. We can say you’re transacting twenty-five dollars’ worth of bitcoin, that’s fine.
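The "how many hops back" question can be made concrete with a toy traversal of a transaction graph. The addresses and graph below are invented for illustration; real chain-analysis firms work over the full public ledger with far more sophisticated heuristics:

```python
from collections import deque

# Toy transaction graph: address -> addresses it sent coins to.
# Taint from a flagged address (say, a ransomware wallet) propagates
# only a fixed number of hops, mirroring the compliance question of
# how far back a bank is responsible for checking.
transfers = {
    "ransom_addr": ["mixer1"],
    "mixer1": ["exchange_a", "wallet_x"],
    "exchange_a": ["wallet_y"],
    "wallet_x": [],
    "wallet_y": ["wallet_z"],
    "wallet_z": [],
}

def tainted_within(source: str, max_hops: int) -> set[str]:
    """Addresses reachable from `source` in at most `max_hops` transfers."""
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        addr, hops = queue.popleft()
        if hops == max_hops:
            continue  # responsibility window exhausted
        for nxt in transfers.get(addr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return seen

print(tainted_within("ransom_addr", 2))
# {'ransom_addr', 'mixer1', 'exchange_a', 'wallet_x'} — wallet_y is 3 hops out
```

Widening `max_hops` sweeps in more of the graph, which is exactly the policy trade-off banks face when deciding how far back "taint" should count.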
You flip flags—you know, we can look at layered transactions, things that add up—and use all of today’s systems. It’s the exact same thing. The problem is simply that they’re not interconnected right now. And, in a way, it’s ironic that we could be doing a better job of knowing what was happening with these transactions if they were integrated with the traditional financial system, because then we would have ties that we don’t have right now. Right now it is a bit of the Wild West. POWELL: OK. I’m going to—did you have—you good? BALDET: Yeah. POWELL: I’m going to come over here to Cindy Chin and then I’ll come back around this way. Yes. Q: I don’t know how to turn this on. Sorry. OK. Hi, Amber. It’s great to see you again. My question is actually more on supply chain. And I wanted to know what the landscape is like today. I know what it was like last year, but I wanted to hear from you and your perspectives on how it’s come along a year later. BALDET: Right. So we’ve been talking about money, and if you’re not in this every day you might wonder why we’re all of a sudden talking about supply chains. That’s because anything you can track on a spreadsheet, you can track with a blockchain as a data structure. And so there’s been a lot of interest in track-and-trace—having a provable, immutable record of a good through a supply chain. I think it’s very promising, as it has been from the beginning, because you just deal with regular data as opposed to financial assets, which removes a big barrier to entry. But the problem there is not technical. The challenge is industry coordination and competitiveness, and none of these people trust each other. And you get big players like Maersk and IBM, right—when they did that project, TradeLens I think it’s called—and, you know, somebody has to bootstrap this thing, and there are a lot of R&D costs going into the early networks.
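The track-and-trace idea—anything you can track on a spreadsheet, you can track with a blockchain as a data structure—can be sketched as a minimal hash chain. The item and field names here are hypothetical:

```python
import hashlib
import json

# Minimal sketch of supply-chain track-and-trace: each custody event is
# hashed together with the previous record, so editing any earlier event
# changes every later hash and the tampering is detectable.

def record(prev_hash: str, event: dict) -> dict:
    """Append-style record linking this event to the previous one."""
    payload = json.dumps(event, sort_keys=True)
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    return {"event": event, "prev": prev_hash, "hash": digest}

genesis = record("0" * 64, {"item": "tomato-lot-17", "step": "harvested"})
shipped = record(genesis["hash"], {"item": "tomato-lot-17", "step": "shipped"})

def verify(chain: list[dict]) -> bool:
    """Replay the chain; any edited event breaks the hash links."""
    prev = "0" * 64
    for rec in chain:
        if rec["prev"] != prev or record(prev, rec["event"])["hash"] != rec["hash"]:
            return False
        prev = rec["hash"]
    return True

print(verify([genesis, shipped]))  # True
```

Nothing here requires a distributed network—which is the speaker’s point: the data structure is easy; getting competitors to share one copy of it is the hard part.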
And the large companies who see first-mover advantage will plant the flag: they will put up the money, they’ll put in the teams, they’ll invest. But because of that, it doesn’t look like a mature, decentralized network where we’re all a group of equals and peers, and so it’s difficult for participants that are coming in—these other companies—to trust and know that somehow Maersk isn’t getting something extra out of it. And that was covered in the press a little bit, so I’m sure you know about that already. So in the last year—I mean, I saw a really interesting project a couple of weeks ago that was doing oil and gas tracking. And I think they’re coming along. You know, like I said, it’s not a technical challenge, so it comes down to industry lobbying, standards setting, and, like, the will to do it. And like I said, getting all these smaller players to be able to collaborate is a real strength of the technology. So maybe—maybe—Maersk wasn’t the right player to do that initially. But no doubt, the supply chain will be on a variety of blockchains in a decade. So you want traceability from your farmer to a tomato in your fridge? At what point do you expect the tracking to stop? Like, at what point do you want to be protected? Sure, it’s great if it turns out some head of lettuce has salmonella on it and there’s a recall, but do you want your health insurer to know what you’re eating and whether you bought enough vegetables this week? So supply chain also, at the last mile, has a whole host of privacy concerns. POWELL: Let me come—I can’t read your card. Q: Alex. POWELL: Alex—OK, let me come to Alex and then I’m going to come back over to Maryum. Yes. Q: Hey, I appreciate you speaking. I run a different startup in the blockchain and finance space. And I would echo the sentiments—that was very concise and clear, the way you presented it.
And I think to the point that, you know, essentially, it’s a database or, you know, the concept of a database in a different way. That, to me, is morally neutral, just like all technologies. And specifically, I wonder how you think about—right now, I think there’s opportunity in that the culture of this powerful technology is not fully defined. There are a few different groups, from corporate people, antigovernment agitators, some scam artists, and other groups. And I wonder—and I think there’s opportunity there, more than I think it’s easy to get lost in the technical jargon. But there’s really a culture that’s going to define a lot of the outcomes, and I wonder how you think about strategies or tactics, in an industry that’s dominated by young white men such as myself, to influence culture in a way that’s more focused on gender equality and other types of disenfranchised groups. BALDET: Yeah. I think it’s interesting that we have these conversations about fostering usage by underrepresented populations at the same time that the types of folks that you’re mentioning are having conversations about their fears of being de-platformed and becoming oppressed because you can’t shout on Twitter about things. So this is technology that allows you to do something in an un-censorable and unstoppable way without having to ask someone. I don’t know if I would ever say technology is morally neutral. I think it does carry a lot of the intent of its creators, whether they intend it to do so or not. But it is important to recognize that that means that this technology will be used for a variety of things. You know, I sometimes say—which is a stolen quote—that every tool is a weapon if you hold it right. And we cannot put the genie back in the bottle.
If we prevent usage by people who just want to do normal, everyday things, then the only people who are going to use this are people that are looking to share hate speech and do terrorist financing, and everybody’s going to say, see, that was bad from the beginning, and that becomes a self-fulfilling prophecy. So focus on use cases where you are fostering usage by underrepresented populations, where we’re finding people that have been excluded and not just, like, building something for them, but actually involving them in that process—especially as we see one-half of the global population coming online over the next decade, right? This is, like, three billion people getting their first internet access ever. And, you know, we certainly worry in the U.S., where some 23 percent of people are considered either unbanked or underbanked, meaning that they use things like cash advances and other outside-of-the-standard-banking-system practices. How does a cashless society affect them here? That’s one question. But how do you get people to participate in their own economic life who have simply never had access to traditional financial tools before? I think that hopefully the answers to those problems will absolutely dwarf the kind of usage that you’re never going to be able to excise from the system. And having that kind of conversation and being super vocal about how you want to see that world, especially coming from someone who looks that way, is very important, so please advocate. And I appreciate that. Thank you. POWELL: Great. Let’s come to Maryum Saifee. Q: Hi. Maryum Saifee. I’m a CFR international affairs fellow and I’m working with an organization called the Human Rights Foundation, which works with dissidents and journalists from authoritarian regimes. My question is, in countries that are undergoing crisis—like, I’m thinking Venezuela with hyperinflation—what applications can blockchain or cryptocurrency have in those situations?
BALDET: That’s a great question. I’m surprised we didn’t touch on that already. But definitely Venezuela is an interesting case in that there has been some organic adoption both of bitcoin, but also of Zcash. The backstory there, a little bit, was that people started mining bitcoin because the energy there is subsidized, and because of that you could kind of be doing it for free. The entire security model of bitcoin is based on real-world economics of how expensive electricity is. It’s not based on fancy, crazy math. And so then there were crackdowns—the government would go in and arrest people and actually take their hardware—and then we heard they were using it to mine themselves. Then they put out the petro, which was, like, a government-backed attempt at some cryptocurrency, which was a hilarious disaster. But because of that, there was a lot of awareness, and so for a while we saw people using this bitcoin to literally import food and to buy goods that they needed, which was great. But we also saw usage of something like Zcash as a temporary hedge or a temporary intermediary currency to move into dollars—because you can’t just go directly, right, through the FX markets there—so using that as an intermediary hop into a more stable currency. So we have to think about things like soft-power projection and how we want to be able to foster global access to stabilizing currencies. And there certainly are opportunities there. On the other hand—I’m sure we’ve all been watching some of the protests in Hong Kong this week. I saw a picture this morning of a number of protesters lining up to pay for subway tickets with cash, because since they’ve gone cashless, again, there’s now a link by which you would be able to trace people’s location and who had probably been there. Right? And so these are opportunities where we can— Q: For cash, the revolution. BALDET: Yeah.
Money is, you know—when it’s tied to your phone and it’s tied to your location, there’s no such thing as anonymizing this data. Not only is it trivially reversible, but it’s cross-referenceable. And so especially if there’s a dedicated target or target group, it’s extremely easy to track stuff down these days. Q: So was that part of the original— BALDET: Yeah. So certainly there are applications in those sorts of economies. Also, the Human Rights Foundation does a whole bunch of work around this. They airdropped USBs with bitcoin into North Korea. (Chuckles.) You know, it’s interesting to give people “money is freedom” kind of as an ideal. POWELL: Camilla McFarland. Q: Thank you. A slightly different question, if I can just ask it. POWELL: Oh, it’s already on. Yeah. Q: Oh, great. I also work in the blockchain space, at ConsenSys, a company building blockchain solutions in enterprise, consumer, all different parts of the industry. So much of the problem we have when speaking to clients, similar to how you started this lecture, is explaining how blockchain works. BALDET: Feel free to use those. (Laughter.) Q: Yeah. No, and you think of sort of how much we use the internet today, but how many times do you either build something on the internet or explain it to someone and find yourself explaining TCP/IP protocols? And so do you think we will get to a place where blockchain will just be an inherent part of day-to-day life and people might not even know if it’s blockchain- or Web2-based? Or will it be more of a present, sort of separate conversation always, do you think? BALDET: Yeah.
Well, it’s one of the reasons that I’m working on developer tools, because I think when there’s real applications and people just say, oh, I can put a widget in my sidebar that lets me accept donations or run my own crowdfunding campaign—I don’t have to pay a percentage to Kickstarter, I don’t have to get kicked off Patreon or whatever because they don’t like my project—when you can just solve people’s problems, then it will be that way. I used to feel the same way when trying to do the banking kinds of applications. You know, nobody sits down to build a new trading platform and says, let me talk about packet exchange for a little while before we do this. But it’s just the novelty, and so you can’t get away from it. We really—again, we take for granted—like, people think the internet just, like, popped out in the ’90s when there was that bubble, but, I mean, DARPA projects started in the late ’60s. They were government-funded, there was no expectation of delivering some return on investment, there were no VCs standing over anybody’s shoulders asking them to build something like TCP/IP. There is an interesting analogue, though, in that at the time there were a lot of competing standards, just like there are now, and IBM was a major proponent of a specific alternative to TCP/IP, the Open Systems Interconnection standard that they were advocating. And they had all of these enterprises lined up and they were, like, the big guys want it, this is how the internet’s going to look, we’re already connecting to each other over our closed networks using it, so, like, just fall in line. And it turned out that this open, lightweight, kind of hacky protocol ended up being much more applicable and easy to understand for the developers that needed to build stuff. And here we are. So I think there’s a lot of applicability to the story of enterprise blockchain now, because it’s, like, Groundhog Day of people trying to impose standards.
But yeah, like the winners slide I show—you know, 98 percent of this stuff is just going to be a graveyard in five years. POWELL: Let me ask—it’s really a follow-on, and we probably have time to take one or two more questions if people have them—just how—to what extent we can think of this as a leapfrog technology. So you mentioned, what, a third of—is that what you said—a half or a third of the population is coming online in the next decade. BALDET: A half, yeah. POWELL: That was just stunning to me, people who don’t have access to the internet, to technology. To what extent does this provide an opportunity the way cell phones did for people who just leapfrogged over, you know, landlines? And then related to this, you mentioned the group in Afghanistan—Inspire, is that what it’s called? BALDET: Code to Inspire. POWELL: Code to Inspire, where girls learn to code, get paid in bitcoin. How do groups like that get up and running? I mean, are there people in Afghanistan and countries like that who are starting blockchain, cryptocurrency types of applications? Or is this all coming from the West? I mean, you mentioned the internet started in the West with DARPA. How democratized is this? So it really relates back to the leapfrog question. I guess with cell phones—and I don’t really know the history of that—but to what extent is this sort of motivated by the West helping other countries, you know, kind of come up to speed, and how might we think about that in this context? BALDET: It’s super fascinating and it’s, like, all over the place. The Code to Inspire one is a fascinating story. You should definitely Google it. It’s a bit of an outlier in that the founder is a woman who is a refugee. She was not able to reenter the United States and is kind of stuck there, but she was a CS professor or graduate—like, had a higher education—and was, like, well, what am I going to do here? I need to teach these girls something.
And so it’s a very unique story. But one thing that we see that might be worth looking at is, you’ll hear a lot about decentralized identity these days and that you’re going to be able to, say, hold your medical records on the blockchain—please, don’t do that—or, you know, have an attestation about your diploma so that if your university ever disappeared you could still prove that it happened, and we could prevent all this diploma fraud on LinkedIn. There are certainly applications where we can verify that something is real. You don’t necessarily need a blockchain to do that, but we’ll talk about that another time. But as we talk about creating those kinds of systems globally, you see large consultancies and large corporates going out and doing them—often as impact projects, by the way—and especially targeting refugee populations for identity projects. And so these are people that are already displaced, do not have agency in the technology that they’re using, and are likely to lose their devices. And you have to ask—you know, right now, the closest thing that we have to a local wallet for your digital data is your phone. Ideally, your phone backs up stuff to some private cloud that you have, so that if you drop your phone off a bridge you don’t lose everything in your life. But in the case that your phone was taken away from you and you did not have something like that set up, how do they recover that? Well, I guarantee you these consultancies have an answer, and it is that they actually maintain a backup copy of everything—they maintain access, the passwords, the copies of the passwords, the private keys. And so we have created an interesting arbiter of identity that is non-sovereign. Ideally, you would want to have this sort of identity before someone is a refugee. And really, governments where there’s not a problem are not super interested in participating in these things, because they want to be the arbiters of identity.
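The diploma-attestation idea mentioned above can be sketched as a simple published-hash scheme. This is a bare-bones illustration: a real system would add digital signatures and salting, and the "log" here is just an in-memory stand-in for an append-only ledger.

```python
import hashlib

# Sketch of a credential attestation: the issuer publishes only a hash
# of the diploma to some append-only log; later, anyone holding the
# document can prove it matches, even if the issuer has disappeared.
# The diploma text below is invented.

published_log: set[str] = set()   # stand-in for an append-only ledger

def attest(document: bytes) -> str:
    """Issuer publishes the document's digest (not the document itself)."""
    digest = hashlib.sha256(document).hexdigest()
    published_log.add(digest)
    return digest

def verify(document: bytes) -> bool:
    """Holder proves the document matches a previously published digest."""
    return hashlib.sha256(document).hexdigest() in published_log

diploma = b"B.Sc. Computer Science, 2015, Jane Doe"
attest(diploma)
print(verify(diploma))                     # True
print(verify(b"forged: Ph.D., Jane Doe"))  # False
```

Note that nothing about this requires a blockchain per se—any tamper-evident public log works—which matches the speaker's caveat that you don't necessarily need a blockchain to verify that something is real.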
And when we talk about identity, it then immediately gets super complicated, because who I am in a government context when I pay my taxes is very different from my social media identity, which is very different from my LinkedIn or professional identity, perhaps. Being able to publish a blog anonymously as a whistleblower might be important. So it’s very complex, and it’s one of the reasons that diversity in this space is incredibly important. I have seen some of the most naïve implementations of identity from people that simply do not understand internationalization of names, how things change when you get married—like, all of that stuff. We need to learn from the absolute wreck that has been the digitization of forms for the last thirty years, right, and not do that. But it’s—oh, it’s a critical question. POWELL: So, yes, go right ahead, Nadia. Q: So I’m really just trying to—I’m very new to this language and space. I’m a former banker. So do you see a world where—and I might be misusing it—will everyone have a blockchain? BALDET: There’s lots of blockchains. Q: Just as you said something—so, but, you said, specific to, like, a refugee. BALDET: Yeah, everybody would have a wallet in that world. Q: A wallet. BALDET: Yeah. Q: Which could—which you say you shouldn’t right now—have your medical, your this, your that, all of these things, so that if you fall off a bridge with your phone and everything gets lost, you still have access to these documents and resources. BALDET: Yeah. You just want to think of it—I know this is—people in the blockchain space have done a real disservice in talking about privacy and local data and take back your data and all this stuff, because there is a model that people understand. I think we all had computers fifteen years ago. Remember before the cloud, when you had files on your computer and no one else had access to them? That was—your data was taken back before we all gave it up to the cloud and other people.
So really what we’re talking about is simply that we want to have the convenience of access from anywhere, but we want to have the privacy and security of when you had a computer at your house. You want the assurance of a disaster recovery backup system—like why you use Time Machine or AWS or whatnot—but you don’t necessarily want Google scraping all your files for advertising information. So you don’t need a blockchain in order to store your data. Your computer will work just fine. But what we can do is we can use blockchain and blockchain-adjacent technology to coordinate that access—to log who has access, to delegate access to others and share access to that information, and to facilitate the backups and all this other stuff—in a way that might route around some centralized cloud. So you could do this in more of a peer-to-peer way. You can involve the cloud or not involve the cloud. But it’s all about, like, instead of having one central cloud, we have many, many little clouds, and you would have your own kind of little cloud and it can all connect. Again, that’s more around the data, which is a slightly separate challenge from where are my tokens in my wallet. They’re really two adjacent but very differentiated problems. And when we conflate them, you end up with overengineered solutions where people put your medical records on a blockchain and then someone asks about quantum cryptography decryption and, you know, it’s, like, a rabbit hole from there. We don’t—you don’t need that. POWELL: Diane. Q: I’m fairly new to this as well, but I’m curious about how all of these different blockchain approaches actually make money. BALDET: They don’t, because no one owns them, so there would be nobody to get the money. They’re just technology, like the internet. The internet doesn’t make money, right? It just is—it’s a network, right? But what there are is a lot of businesses who are trying to use the technology, just like pets.com in the ’90s.
They want to launch their first website on this new type of internet, and so they’re looking to create some sort of business value—like, say, we’re going to create an application where you can offer a microloan and someone else can get that loan and we can set up a payment structure, and for that we’ll take a percentage as a fee. The blockchain technology itself and the blockchain network does not make any money off of that. It just exists. Q: But somebody is maintaining this software. BALDET: Yes, that’s a great and very astute observation—you do not necessarily get paid for that. It’s a sunk cost. In something like the bitcoin network, if you are doing what’s called mining—which we will absolutely not talk about here—(chuckles)—part of securing the network means that you can get paid for your time. And every so often, if you are so lucky, a couple of coins go into your own wallet for your time participating in the network. But not all networks function like that. And as I mentioned earlier, there’s some energy intensiveness involved in always keeping that infrastructure running, and so there’s a lot of work going into alternatives that make it almost free to run the networks. But none of them are live and none of them work yet. But yes, there’s huge overhead and people want to be paid for that. POWELL: Maybe we can just—I want to give you a minute to— BALDET: I can explain that again later. I feel like I jumped around on that. POWELL: We only have two minutes left, so I just want to ask you maybe a concluding question, and then if you have any concluding remarks—which is just sort of, you know, I think versions of many of the questions are, you know, what do we bring to this party? And so, you know, I’m wondering, kind of thinking ahead, what role, if any, can governments play—not so much regulation, but to support, whether it’s USAID, other development agencies?
And also, you mentioned the woman in Afghanistan who was a student in the United States, in computer science. I’m an academic. What role can U.S. universities play to support either people in computer science, business school? What, if anything, can we do to support the growth of this, particularly for people who don’t have as much access—women, other disenfranchised people, people in developing countries? BALDET: There are tons of new academic programs. I hear about them every day, you know. Some universities—every MBA program, a lot of the legal programs, a lot of government affairs programs or government policy programs—are adding, whether it’s a weekend class or an extra certificate or something— POWELL: I have a colleague who’s teaching a blockchain course, basically, in law school. Yeah. BALDET: Yeah. And so you do get a lot of kind of active participation from subject matter experts that will come in and talk and things. Of course, that’s not going to scale forever. So, sure, it would be great to include that in the curriculum. I think the reason people are interested in it, though, especially in the MBA space, is because they saw a lot of people making a lot of money. And that is not necessarily the right question to be asking. And I wonder why, you know, people that are going for a program in policy are so interested in this because it’s, you know, on the cover of Wired and The Economist and whatnot, and they’re not getting a weekend certificate in information security when cybersecurity is by far the largest threat or challenge that we have right now. And so they are adjacent. And I like to use one as kind of a backdoor for the other. And, you know, similarly, if you’re at a corporate and you could never get funding to go buy that, you know, cybersecurity stuff you needed, saying it’s for a blockchain project is a great way to get funding. (Laughter.)
And so, you know, I hope that the programs do not end up as hype—let’s talk about ICOs, and let’s talk about, again, cash, security, commodity, taxation. I guess there’s room for all of that, but there are fundamental issues that could be, to your point, you know, a challenge to democracy and access globally. And I hope that people want to, and are inspired to, spend time working on those challenges first, or as well. POWELL: Thank you so much. Please join me in thanking Amber for speaking with us. (Applause.) (END) This is an uncorrected transcript.