Economics

Labor and Employment

  • Human Trafficking
    COVID-19, Migrant Labor, and the Case for Labor Recruitment Reform
    This post is part of the Council on Foreign Relations’ blog series on human trafficking, in which CFR fellows and other leading experts assess new approaches to improve U.S. and global efforts to curb trafficking and modern slavery. This post was authored by Jeff Bond, associate director, Global Fund to End Modern Slavery (GFEMS).
  • COVID-19
    Color of Covid: The Racial Justice Paradox of Our New Stay-at-Home Economy
    In what Catherine Powell calls the "color of Covid," the pandemic has highlighted a range of underlying inequalities on race—including on the job front—now exacerbated by the health crisis and the emerging stay-at-home economy.
  • COVID-19
    After the Pandemic: Can the United States Finally Retool for the Twenty-First Century?
    Over the more than half a century since the United States embraced its integration into the global economy, it has produced both the strongest and the weakest of the advanced economies. The strengths are obvious in the United States' brilliant scientific establishment, its top-ranked universities, its lead in innovation, and its world-beating companies from Apple to Amazon. The weaknesses have never been more obvious than during the current outbreak of the coronavirus–among these a woefully inadequate health insurance system, lack of paid sick leave and other basic job protections, and an unemployment insurance system that encourages companies to fire workers quickly.

    The virus has ruthlessly exposed the shortcomings of a country that has failed to remake itself for the world it now occupies. When the pandemic recedes, the United States will face some of the toughest questions in its history about how to retool itself for the modern world.

    In my 2016 book, Failure to Adjust: How Americans Got Left Behind in the Global Economy, I told the story of how economic globalization caught the United States off-guard. For most of our history, we were a reasonably self-sufficient economy, with an expanding domestic market that was more than large enough to exploit economies of scale. So as trade, global travel, and financial integration began to grow explosively in the 1960s, the United States was slow to recognize that it needed to adapt its institutions to the new realities.

    One of the most telling examples is the program known as Trade Adjustment Assistance (TAA). It was launched by President John F. Kennedy in 1962 with the explicit goal of helping to support and retrain those who would lose jobs as a result of the coming acceleration of global competition that Kennedy and future presidents embraced.
“When considerations of national policy make it desirable to avoid higher tariffs,” Kennedy said, “those injured by that competition should not be required to bear the full brunt of the impact.” Despite the soaring rhetoric, the program was stillborn–underfunded by Congress and overly restrictive from the start. When the surge in Chinese imports in the early 2000s contributed to the loss of millions of manufacturing jobs, only a small fraction of displaced workers received TAA. Far more exited the labor market entirely through programs such as Social Security disability.

TAA is only one example of where U.S. institutions are poorly designed to deal with disruptive change, which has been accelerating over the past several decades. Whether the causes are trade competition, financial crises, job-displacing automation, or an unexpected and lethal pandemic that spreads across the world, the United States sorely lacks the capacity to help its citizens manage these shocks. Two of the most glaring deficiencies are the absence of sick leave for a significant portion of the workforce, and an unemployment benefits system that requires companies to fire their employees before those workers have access to any government aid. The first has helped undermine efforts to contain the virus, and the second means that economic recovery in the United States is likely to be especially prolonged.

The importance of paid sick leave has never been more obvious than during this pandemic. Workers who fear losing pay, or even their jobs, if they fail to show up for work are likely to shrug off the slight cough or fever that is the first sign of infection. Yet a large portion of U.S. workers lack access to this basic right. Only 43 percent of part-time workers get paid sick leave, for example, compared with 83 percent of full-time employees. Among the top 10 percent of income earners, 93 percent have paid sick leave; for the bottom 10 percent, fewer than one in three enjoy the same right.
The unemployment insurance system is similarly riddled with holes. Benefits for workers only kick in after they have been fired from their jobs, which has encouraged companies to lay off workers in droves. In the last two weeks of March, more than ten million Americans were thrown onto the unemployment rolls, more than 6 percent of the entire U.S. labor force. In contrast, Germany, the UK, Denmark and many other European countries are supporting the wages of workers who remain employed, allowing companies to keep them on staff and resume operations quickly as the economy recovers. Many U.S. workers, by comparison, will not be rehired until the consumer economy picks up and companies regain confidence in the future. That could take several years, depending on how long it takes to develop a vaccine or other treatments for the coronavirus.

Unemployed workers also face impossible choices on health care coverage, because the United States remains the only advanced economy without some form of universal health insurance. They can maintain their former job-based plan–if they had one–only by paying the full costs through COBRA. Or they can take their chances on the ObamaCare market, where many plans come with huge deductibles.

These issues are just the tip of a very large iceberg. Lower-income Americans are woefully unprepared for retirement, and the crash in stock markets will make it worse. A new St. Louis Federal Reserve Bank survey finds that among those without a high school diploma, or only a GED, just 22 percent had any sort of retirement savings, and among those with savings the median balance was just $35,000. Even among middle-income families, the picture is fairly bleak. The average retirement savings for couples over sixty-one, for example, is just $132,000–enough to generate just $5,200/year in retirement income on top of Social Security.

One piece of good news is that the stimulus bill passed with strong bipartisan support in Congress was properly ambitious.
The measures to increase unemployment insurance, including expanding coverage to gig economy workers, will be especially critical in helping the growing ranks of the jobless. But the bill was premised on a short-term economic shutdown of no more than a few months. If the shutdown lasts longer than that, Congress will either need to find more funding, or many Americans will run out of resources. Poorer Americans will be at the mercy of whether the two parties in Congress can continue to find ways to cooperate. If the money does not keep flowing from Washington, many Americans will find themselves unable to pay for mortgages, rent, health care bills, and other critical needs.

What the country needs is not a series of short-term bailouts, but long-term plans to ensure that most Americans are protected against such crises in the future. Will this be the event that finally drags the United States into the twenty-first century?

We certainly have the capacity to learn. The 2008 financial crisis, for example, exposed the dangerous fragility of the U.S. banking system; reforms put in place by Congress and the Obama administration in the aftermath, for all their shortcomings, left the financial system in a much stronger place to withstand the current economic shutdown. But the lessons of the pandemic will be harder to absorb, because it has fully revealed the massive inadequacies of a social safety net designed for another era. We can learn that lesson and remake the country for the world we now inhabit. Or we can keep lurching from one crisis to the next.
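The retirement figures cited above are internally consistent: the stated $5,200 a year from $132,000 in savings implies an annual withdrawal rate of roughly 4 percent, the common retirement-planning rule of thumb. The post does not say which rule it used, so the rate here is an inference; a minimal back-of-the-envelope check:

```python
# Back-of-the-envelope check of the retirement figures in the post above.
# Assumption (not stated in the post): the $5,200/year figure reflects a
# withdrawal rate near the common "4 percent rule."
savings = 132_000   # average retirement savings, couples over sixty-one
income = 5_200      # annual retirement income the post says this generates

implied_rate = income / savings
print(f"Implied withdrawal rate: {implied_rate:.1%}")  # 3.9%

# A flat 4 percent withdrawal from the same savings would yield:
print(f"4% rule income: ${savings * 0.04:,.0f}/year")  # $5,280/year
```

Either way, the annual income is a small fraction of what most couples would need beyond Social Security, which is the post's point.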
  • U.S. Congress
    Andrew Yang’s Moment: The Economic Costs of the Pandemic Mean the Time for UBI Is Now
    As fears of the growing coronavirus pandemic are leading to something close to a temporary shutdown of the U.S. economy, the moment has come to listen to the most important young political voice in the country: Andrew Yang.

    Yang’s dark horse run for the Democratic presidential nomination was based on the simplest of ideas: if Americans are poor and struggling, give them money. He took the idea of “universal basic income” (UBI) from the stuff of think tank analyses and policy books to the front pages of newspapers. Its moment has come more quickly than he could have imagined. Mitt Romney, the Utah Republican senator, has joined a growing chorus of Democrats in calling for direct cash grants of $1,000 to all American adults to help them weather the economic hit from the virus. As Congress considers additional measures to help an economy that is careening into recession, getting money quickly into the hands of struggling individuals and families must be a top priority.

    To be clear, I have not always been a fan of UBI. In our 2018 CFR Independent Task Force on the Future of Work, we called for more targeted measures of the sort that are also under consideration now—extending sick leave to all working Americans, wage subsidies, increasing tax credits for lower-income workers, and strengthening unemployment insurance. In ordinary economic circumstances, such targeted measures may offer more bang for the buck. But the overwhelming virtue of UBI is its simplicity. It gets money to individuals in need, and out into the wider economy, more quickly than any other alternative. Unemployment insurance only kicks in after people lose their jobs, and does not fully cover many part-time and gig economy workers, or others who may see a temporary sharp reduction of their income during the crisis. Aid to small businesses will be critical, but the loans are complicated and often take months to disburse.
A cash transfer has immediate impact that these other measures cannot match. That money is going to be needed quickly. In just the past several days, governors in major states from New York to Washington have ordered the closure of bars, restaurants, gyms and other recreational facilities. All concerts, conventions, sporting events and other mass gatherings have been canceled. Most Americans have appropriately stopped traveling, which is pummeling the airlines and hotels. Many retail establishments from Apple to Starbucks are shutting down or reducing hours. In an economy where consumer spending drives 70 percent of economic growth, millions of American workers are going to feel the impact immediately.

It is heartening to see shows of personal generosity, such as NBA rookie phenom Zion Williamson, who has pledged to cover the salaries of New Orleans’s arena workers for one month. But the reality is that most Americans will have little or nothing to fall back on. Even with the solid economic growth of the past decade, some 40 percent of Americans still say they do not have the resources to cover a $400 emergency.

The cost of UBI, of course, looks daunting. There are roughly 210 million Americans aged 18 or older, so the first $1,000 check would cost the government about $210 billion. And there is no reason to think one month will be sufficient. The current closures are likely to last at least two months, and possibly much longer. Despite the $1 trillion budget deficit currently being run by Washington—much of it brought about by the irresponsible 2017 tax cuts for companies and wealthier Americans—there is no question that such quick relief is affordable. The Fed has now cut overnight interest rates to near zero, and in the current market chaos, investors will still want to hold even very low interest-bearing Treasury debt. The money is there if Congress asks for it. Beyond that, who knows?
Americans may find that the stability provided by a steady monthly check is exactly what they need in the current era, where the economic uncertainties of daily life are multiplying. It could mark the beginning of a long-overdue rethinking of how to help more Americans flourish in the economy of the twenty-first century.

The 2008 financial crisis and the Great Recession left a poisonous political legacy in part because Americans believed that we were not all in it together. Big banks and others were bailed out, while many Americans suffered through grinding months and years of unemployment or part-time work or unmanageable mortgage payments. This crisis is a chance at a do-over. All Americans must have the means to take time from work to protect their health, or the income to stay home and support their families as needed. If they don’t, the virus will likely spread more quickly and the economic pain will linger far longer.

Andrew Yang is right. Give money to people. Do it now.
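The cost arithmetic in the post is straightforward to verify, and extending it over a longer shutdown shows why a single check would not be enough. A minimal sketch of that math (the multi-month scenarios are illustrative, not from the post):

```python
# Rough cost of the $1,000-per-adult proposal discussed above.
adults = 210_000_000  # roughly 210 million Americans aged 18 or older
check = 1_000         # dollars per adult, per monthly payment

monthly_cost = adults * check
print(f"One month: ${monthly_cost / 1e9:.0f} billion")  # $210 billion

# Illustrative scenarios if closures run longer, as the post anticipates:
for months in (2, 4, 6):
    total = months * monthly_cost
    print(f"{months} months: ${total / 1e12:.2f} trillion")
```

Even six months of payments would total roughly $1.26 trillion, comparable to the deficit figure the post cites, which is the basis for the claim that the relief is affordable.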
  • South Africa
    What’s Behind South Africa’s Recent Violence?
    Recent attacks that appeared to target immigrants have underscored South Africa’s struggle to combat violence and limit tensions with the rest of the region.
  • Women and Women's Rights
    Behind the Screen: Gender, the Digital Workforce, and the Hidden World of Content Moderation
    Podcast
    As user-generated content on the internet continues to increase in popularity, the question of who moderates this content comes to the forefront when discussing the future of social media. Dr. Sarah Roberts, author of Behind the Screen: Content Moderation in the Shadows of Social Media and assistant professor of information studies at the UCLA Graduate School of Education and Information Studies, joined the Women and Foreign Policy program to speak about the world of content moderation and the importance of this invisible work.
    POWELL: We’re going to go ahead and get started. I’d like to welcome everyone. And my name’s Catherine Powell. I’m with the Women and Foreign Policy Program here. And I want to acknowledge the Digital and Cybersecurity Program which we’re co-hosting with tonight. So some of you may have gotten the email through that program as well. It’s with great pleasure that I introduce Sarah Roberts, who is a professor at UCLA School of Education and Information Studies. I’m not going to read her whole bio because you have it in front of you, other than to say that she came to my class today—I’m a professor at Fordham Law School where I’m teaching a new seminar this semester on digital civil rights and civil liberties. And she came to speak with my students about this fantastic new book she has out, Behind the Screen: Content Moderation in the Shadows of Social Media. And it was a very engaging discussion. So Sarah’s going to outline some ideas for about ten minutes or so, and then we will open up for discussion because we have a number of experts in the room and the discussion is always the most fun part. Just as a reminder, this is on the record. Without further ado, let me turn it over to Sarah. ROBERTS: All right. Thank you so much. Thanks to everyone for choosing to spend time here this evening. It’s certainly a delight to be a part of this series, and to be present with you. So thank you. I am going to do my best.
I’m a professor, so I have a problem with verbosity. We try to keep it short and sweet. I’m going to try to speak quickly so we can get to discussion. So if there’s anything that seems like a bit of unpacking, we can return to it. But I’m going to do my best to give an overview, assuming that you have not all spent as much time as I have with the subject. So basically I’ll talk a little bit about the research that’s contained in the book, and then I want to tee-up some issues that I think are pertinent to the present moment particularly, because this work is the culmination of nine years of research. We like a slow burn in academia, so it’s been simmering for some time. When I began this work in 2010, I was myself still a doctoral student at the University of Illinois, but I had a previous career in the IT field, although I had, you know, perhaps made the unfortunate decision of finishing my French degree—French literature degree rather than running out to Silicon Valley during the first kind of net bubble in the mid-’90s, so there you have it. But I have fun when I go to France, I guess. Anyway. So I was working in IT for about fifteen years before I decided to go back to school. It was going to just be a quick in and out sort of master’s degree. And I became enthralled with really feeling like I needed to pursue some of the issues that I had to live through first-hand, mainly the widespread adoption and also commercialization of the internet. I had been a user of the internet at that point for almost twenty years in 2010, and I had considered myself a late adopter. I thought I kind of missed the wave of the social internet. But anyway. So in the—in the summer of 2010, I always want to give credit where it’s due, I read a brief but powerful report in the New York Times tech section. It was sort of what we would consider below the fold. I didn’t say that to your students today because I didn’t know if they’d know what I was talking about. (Laughter.)
But it was a below the fold kind of piece, a small piece about a firm in rural Iowa. I was sitting at the time in the middle of a corn field at the University of Illinois, so I could relate to these people who were described in the article as working in really what, for all intents and purposes, seemed to be a call center environment. And they were working not taking service calls for, like, your Maytag washer or your Sears home product, but in fact what they were doing was looking at material from unnamed social media sites that had been uploaded by users and which had been flagged by other users as having some issue, being problematic. And this typically fell around issues of perhaps being pornographic, or obscene, gratuitously violent, all the way to things like child sexual exploitation material, images of abuse of other sorts, and the list goes on and on. And I won’t belabor it with examples, but you can sort of imagine what one might see. What I wanted to do upon learning that was really get a sense of to what extent this need was in fact a fundamental part of the at that time ramping up but very significant social media industry emanating from Silicon Valley. So I should just contextualize this by saying I’m talking about American companies that are based in Silicon Valley. I’m not an expert, unfortunately, on some other parts of the world. But these companies, of course, cover the globe. And in fact, last I knew the stat, Facebook’s userbase is 87 percent outside the United States. So it’s quite significant that these American firms and their norms are circulating around the globe. The findings are detailed in here. It’s also a bit—I have to admit, I guess this is a first-time book writer’s thing where you sort of go into your own autobiography and you really wax poetic. That’s in there too.
You don’t have to take that too much to heart, but I think what I wanted to do was contextualize the way in which from that period in the—in the early to mid-’90s to where we are now, the internet has become—and what we consider the internet, which is our social media apps usually on our phones, right—has really become an expectation, a norm, a part of our daily life in terms of interpersonal relationships, in terms of maybe romantic relationships, business relationships, political discourse, and the list goes on and on at how these platforms are a part of—a part of the fabric of our social life. And how those companies that provide these essentially empty vessels rely upon people like us to fill them with our so-called content. Content is a funny word, because it just stands for, evidently, any form of human self-expression you can think of. And so I often come back to that as an interesting thing to unpack. I’ll tell you a little bit about what we found, and we’ll buzz through this, and then we’ll—I’ll get off the mic for a minute. But essentially what I discovered over the subsequent years was that this activity of content moderation on behalf of companies as a for-pay job—something that I came to call commercial content moderation—was viewed by the firms that solicited it as a mission critical activity. In other words, these firms viewed this practice so important as to be really unwilling to function without this kind of stopgap measure to control the content on their sites. This—you know, we can think of this as a gatekeeping mechanism, which means it’s also a mechanism by which content is allowed to stay up as much as it is a mechanism to remove. But what was really important to me to understand about the impetus for this particular activity, and then the creation and shoring up of a global workforce, was that the activity was taking place primarily as a function of brand management for these firms. What do I mean by that? 
Well, I mean that just as, I don’t know, CBS Studios is unlikely to flip on the camera, open the door, and ask New Yorkers to just come in and get in front of the camera and do what they will—without any control—neither are these platforms. But one of the biggest differences about the ways those kinds of relationships have come to be understood in our—in our everyday life is that I think the expectation about the former is much clearer than it is about the latter. These platforms have come to take up and occupy such an important space, in large part because they were predicated or sold to us on a—on a claim that essentially it would be us, to the platform, to the world. In fact, YouTube’s on-again, off-again slogan has been: Broadcast Yourself. I mean, they say it better than I can, right? You just get on there and emote, and do your thing, and it’s going to broadcast all over the world. So what I came to find was that in fact there was a workforce in the middle. And to me, that was revelatory, and it was shocking. I had never considered it. And I was supposed to be getting a Ph.D., right, in this stuff. And I had worked for fifteen years in this area. I actually started asking other colleagues around campus—esteemed professors who shall remain nameless, but who are victimless here—they also said, gosh, I’ve never heard of that. I’d never heard that companies would hire people to do that. That’s the first thing they said. Then they said, don’t computers do that? Now, if these are the people who are—have their fingers on the pulse of what’s going on in social media, why didn’t they know? Well, that led me to speculate that in fact this practice was intended to be, to a certain extent, hidden. That actually is the case. So I’m just going to talk for a minute about what this workforce looks like, and then we’ll go into some of the maybe provocations, I guess we can call it.
As we speak today, I would—it’s difficult to put numbers on what kind of global workforce we’re talking about, but I would estimate that we’re thinking about maybe 100,000 at this given moment. The number I arrive at for that may be conservative. But I take that number from looking just at the public numbers that Google and Facebook now offer up around their workforce, which are in the tens of thousands. Those are two platforms out of how many? The Snaps, the Instagrams—they’re not counting Instagram—the TikToks of the world, right, whatever the latest thing is. I’m starting to show my age and I don’t even know what’s going on anymore. But anyway, so any—essentially any company that opens the opportunity and—as a commercial entity—opens the opportunity for someone to upload is going to introduce a mechanism to control that. And that’s where we can arrive at these numbers. The thing about this globalized workforce is that it’s diverse, it’s dispersed. You can find it in a number of different industrial sectors. But there are some things we can say about them overall that they share in common. And those characteristics that I think are important to mention is that this work, and the workers who undertake it, are typically viewed as low status. They are typically low wage earners. And they are typically limited term workers for a firm. So the expectation is not that one would make a lifelong career at this work. We can think about why that maybe is. It may in fact be because you wouldn’t be able to stomach this beyond this—right? We’ve got the shaking heads. It’s like, no thank you. I—personally, I couldn’t do it for a day, much less a year. But it’s often limited term. The work is often also to some extent done at a remove from the platform that actually needs the service. So how do they do that? Well, no surprise, it’s going to be contracting, outsourcing, and other sorts of arrangements that look other than permanent and look other than direct employ.
They often, of course, in the case of the United States, for one, given that circumstance, lack significant workplace benefits. Now, when we start thinking about the fact that this work can put workers in harm’s way psychologically because of what they view as a precondition of the work, that lack of benefits, that lack of—and even under the Affordable Care Act people might not be able to afford mental health support, because we know that’s often extra. I mean, I know even in my health care plan as a UCLA professor that’s something I would have to pay for, to a certain extent, out of pocket. How might a worker, who’s contractual, and low wage, and low status, go about obtaining that? Now, when we think about this work being global, we also know that there are places in this country and in other parts of the world where mental health issues are highly stigmatized. And so seeking that help is also encountering barriers just based on cultural norms and other sorts of attitudes towards that kind of—that kind of support. And so really, what we’re looking at is a system of work that has been essentially outsourced and devalued. And yet, those who are near to this kind of operational activity within firms know that it’s mission critical. They know that this gatekeeping mechanism and ability to control what’s on the platform has a fundamental function in their operations. And they really wouldn’t go forward without it. As one person quite candidly put it to me once: If you open a hole on the internet, it gets filled with, blank. And so that was her way of telling me, therefore every time we allow someone to upload something into essentially this empty vessel, we have to have a mechanism to control it. OK. So I’ll talk a little bit about the outcomes here. I’m just going to list them. We can come back to them. But the primary findings in the book, I would say, are as follows. We’re talking about a fractured, stratified and precarious workforce, as I’ve mentioned.
You will find this workforce not sort of in a monolithic site that can be easily identified as this is where commercial content moderation is done, but instead in a variety of industrial sites and sectors, some of which might not be recognizable to workers who are actually doing the same work because of the setting or because of the nature of the work. What do I mean by that? Well, some people go to work every day at Silicon Valley. They sit next to engineers, or maybe down the hall as the case may be. But they have a different color badge. They’re contractors. While others do this work disembodied over a digital piecework site, like Amazon Mechanical Turk. It maybe even has a different name. One person might be doing work called “community management,” and another person is doing dataset training for machine learning algorithms. And guess what? They both might be doing some form of commercial content moderation. So this—when we think about how workers might self-identify, these are the features that make it difficult. There is a patchwork approach to covering this labor need. So, again, global. And, again, using these different industrial sectors, because there’s simply often not enough people available to just be taken up into the apparatus and be put on the job, just going straight through one particular firm or one particular place. This is where we’re now seeing big labor provision firms in the mix. The Accentures of the world and others are now in this—in this field. Workers, again, globally dispersed. And one final thing that I’ll say that I think is actually very key, again, it is often secretive. The work is often under a nondisclosure agreement. Now, many of you know that Silicon Valley asks you to sign a nondisclosure agreement every time you turn around. It’s sort of a cultural norm. But this is taken actually very seriously for these workers in particular. 
So I had to guarantee a certain level of anonymity and use pseudonyms and other things when talking about particular cases in the book. I talk about a case called Megatech. And I was speaking to the class earlier today, one of the funniest things that I never would have expected is that when I meet people from industry and Silicon Valley, and have over the last few years, they say to me: Well, we’re sure that our company is Megatech. We know you’re talking about us, Megatech. I’ve had, like, six different firms tell me they’re certain that they were the field site. (Laughter.) Now, I can neither confirm nor deny that, so that leaves them a little anxious. But I find it fascinating that so many companies see themselves in what I reported here. I never would have expected that. That’s the beauty of doing research, I guess, and having it out in the world. OK. I just want to give a few provocations or thoughts about—I know everyone here is quite interested in policy implications and things of that nature. So I want to give a couple highlights about that. I’ll say I’m not a lawyer. I kind of like to hang around with them a lot. They seem to be a good crowd for the most part. (Laughter.) But I’m not one myself. But the nature of this work, and the operationalizing of this work, means that I have to have a direct relationship to what’s going on at a policy level, and then ever further at a regulatory level. And that’s sort of been an interesting trajectory for me that I might have not expected originally nine years ago. So what’s going on? What are the pressure points on industry around this type of work? Well, I would identify them as follows—and I guess this is sort of in a sort of order, but not really. The first one I would say is regulation. Regulation is the big one, right? That could mean European maneuvers at the EU level. So we’ve seen things like that already—GDPR and other regulations passed sort of at a pan-European level, but also at the state level. 
Germany, for example, or Belgium, or France, where they have pushed back on firms. We have heard about, and are seeing, movement around antitrust in the United States, for example. We have seen discussion and invocation of perhaps a need to revisit Section 230 in the United States, which is what has allowed these firms to grow to where they are now, because it has given them the discretion to both decide to keep up, but also to remove, again at their discretion and to their benefit, content over the years. And then there’s this—I guess the next kind of layer I would talk about would be litigation. We have seen a number of interesting cases crop up just in the last few years—this is a new trend—of current or former content moderation workers, working as commercial content moderators, who are filing lawsuits. So there’s been a couple lawsuits that have been individual cases. One that was very interesting was about Microsoft. They had a hard time claiming these workers were not direct employees because, as you may know, Microsoft got the pants sued off of it a couple decades ago around this issue of having long-term employees they called contractors. So that was an interesting thing with that case, where the people were unequivocally direct employees. But also there is a—there is a class-action suit that’s in the state of California right now. There’s also rumblings of some cases being filed in places like Ireland which, as you may know, is a huge operations center for Silicon Valley firms, for no small reason because it’s a tax haven. OK. What else? Journalism. Negative journalistic coverage. This has been a big one, a big pressure point on the firms. Exposés around working conditions for content moderation. We’ve seen them over the years. I’ve partnered with many journalists, and many journalists have broken stories themselves around this. It tends to focus on the negative working conditions and the impact on the workers.
Doesn’t often go to some of the deeper policy dimensions, but it’s the kind of headline that shocks people, and it brings people into a position of, first of all, knowing that these people exist and, secondly, taking issue with their treatment. And that leads us to consumer concern. Of course, the biggest fear for platforms is—well, maybe not the biggest—but a huge fear for platforms is losing their userbase. They want to gain a userbase and they want to keep people coming back. Of course, they’re advertising firms and they connect these users to advertisers. But if consumers become dissatisfied enough they may leave the platform. So when that sort of rumbling occurs, they respond. And then finally, last but not least—well, I guess I would say also academic research has had an impact to a certain extent here. But last but not least, labor organizing. This is seen as a huge threat. It’s the same as with the regulatory pushback: I think labor organizing is something they’re trying to avoid at all costs. I think it goes without saying that these firms site their content moderation operations in places that are probably on the low end for strong organized labor—places like the Philippines, for example. Places where the United States might have had a long-standing colonial relationship, and therefore firms there can say things like: Our workers have great colloquial American English. As if that just happened by happenstance. (Laughs.) It didn’t, right? All right. So I think I’ll leave it there and we can just open it up. Is that good? All right. I tried to be short, sorry. (Laughs.) POWELL: So as is our way here, please just turn your card sideways if you would like to ask a question. I certainly have questions of my own, but I’m going to first turn to you. And I’ll just jump in later. So let’s start with Anne (sp). Q: OK. So I just want some facts. ROBERTS: Yes. Q: Where are these people geographically? What is their demographic?
Are we talking about Evangelical Christians? What are their value sets? What is their filter? Because—you know, how hard is it to control what they do? ROBERTS: That’s right. OK. So the first company that I came in contact with was this Iowa firm. And this firm’s tagline was quite literally, “Outsource to Iowa, not India.” So they were setting up this relationship of: don’t go to the racialized other somewhere around the world. You want your content moderation homegrown, from good Iowa, you know, former farm-family workers. Of course, their farms are gone, so now they’re working in call centers. So that was something that they actually saw value in and they were willing to treat as a commodity, to a certain extent. What’s going on now with the larger firms is that—so these sites can be found in places like the Philippines, especially for American firms, but also in India. Then for each country that introduces legislation that’s country-specific—for example, Germany—suddenly there needs to be a call center in Germany, because they need to respond quickly to German law, and those people have to be linguistically and culturally sensitive to the German context. So these places are springing up, frankly, like mushrooms all over the world to respond to the linguistic and cultural needs. How do they homogenize the responses? This is the difficulty. Well, you would not believe the granularity of the policies that are internal. If the percentage of the image that is flesh tone reaches a certain extent, delete. If not, leave up. If the nipple is exposed, delete, except if it’s breastfeeding. You can now leave that up. Except if it’s sexualized, delete that. So these are the kinds of decisions that have been codified— Q: From headquarters? ROBERTS: From headquarters, correct. And the expectation is that the workers actually have very little agency.
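The granular, headquarters-authored rules Roberts describes can be sketched as an ordered rule set with layered exceptions. This is a minimal illustration, not any platform's actual policy: the field names, the 60 percent threshold, and the rule ordering are all invented for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class Item:
    flesh_tone_pct: float   # share of the image classified as skin tone
    nipple_exposed: bool
    is_breastfeeding: bool
    is_sexualized: bool

def moderate(item: Item) -> str:
    """Apply the illustrative, headquarters-style rules in order."""
    if item.nipple_exposed:
        # Exception: breastfeeding content stays up, unless it is sexualized.
        if item.is_breastfeeding and not item.is_sexualized:
            return "leave up"
        return "delete"
    if item.flesh_tone_pct > 0.60:  # threshold invented for illustration
        return "delete"
    return "leave up"

print(moderate(Item(0.2, True, True, False)))    # leave up
print(moderate(Item(0.8, False, False, False)))  # delete
```

Even this toy version shows why the real policies demand human judgment: every exception ("sexualized," "breastfeeding") is itself a classification a worker has to make in seconds.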
But what they do have is the cognitive ability to hold all these things in their mind at once, which—guess what can’t do that very well? Computers. Algorithms. They’re not successful in the same way on all of this content. Some things computers can do well, but the cost of building the tools to do this and the worry of having false positives, or losing control over the process, means that humans are filling the gap. I think there’s a sensibility in Silicon Valley that this is just for now. That soon we’re going to have computers that can do this. Q: But— ROBERTS: Right? Thank you. That’s what I say too. And if you talk to folks close to the operations, you know, in a candid moment they’ll say something like: Look, there’s never going to be a moment where we let the machine loose without some sort of human oversight. In fact, when the algorithmic tools are unleashed on some of the content, what has been happening is that they go out and aggregate so much content that they actually need more human workers to sift through the stuff. So it’s actually not eliminating the humans from the pipeline at all. Hopefully that answers— Q: But in the U.S. case, Facebook, Twitter, they are using Filipinos and Indians? It’s an outsourcing industry right now? ROBERTS: And, again, that’s— Q: In some instances. ROBERTS: Yeah. Yeah. I mean, there’s—again, it’s like a patchwork, right? So there might be folks who are local. There might be people who have specific competencies who are employed to look at certain types of content, or certain cases. An example I would give is Myanmar, where Facebook took a lot of heat for not having appropriate staffing. You know, they’ve staffed up. So there are people who are, you know, kind of country-specific, like the way we think about people who work in policy work, actually, right? But there is often a fairly significant gap between those people who are putting the rules into operation, and those people who are making the rules.
And that’s another big kind of tension point, if you will. POWELL: Let’s go to Joan next. Q: Hi, Sarah. ROBERTS: Hi, Joan. Q: I’m Joan Johnson-Freese, a professor at the Naval War College. ROBERTS: Hi. Q: Thank you for a great presentation. I’m wondering if you could talk a little more specifically about the gender aspect. ROBERTS: Yes. So actually in my research I found that it was fairly gender-equal in terms of who was doing the work. One of the interesting things, however, is that in talking to some of the workers who were female or female-identified, in particular one woman who was working on a news platform, she talked about the way in which her exposure to hate speech that was particularly misogynist in nature, or that would typically include rape threats or other kinds of gender-derogatory terms, was affecting her personally, to the point that she described herself—I’m sorry you heard this already—as a sin-eater. And she was supposed to be employed part time, but she found herself, when she would be maybe out to dinner, out to a restaurant, sneaking away to check her computer to see what had filtered in. And she—she’s female-identified. She self-identifies as queer, working class, and oriented toward racial and social justice, although she’s white herself. And she talked about the way that misogynist language in particular and threats, homophobic speech and threats and behavior, and racially insensitive and hostile material was starting to affect her so much that she felt like even when she was not on the clock she would go in and check the site, because if she wasn’t there doing it she felt like others who weren’t prepared to see the material were being exposed. Right? So she described herself as a sin-eater to me. And she said, I call myself a sin-eater—as if I knew what that was. I didn’t know what it was, I admit. So I asked her to describe it, and I looked into this later.
And for those who don’t know, it’s a figure—something of a folkloric figure. But it’s a person who in the context of England and Wales was typically a poor villager, man or woman, someone destitute in a particular community, who would, upon the death of someone more prominent, volunteer to take a loaf of bread or other kind of food that had been passed over that individual and was imagined to be imbued with their sins, and eat it. The dead person would therefore be absolved of the sins and go to heaven, and the person who was eating the sins would, I guess, suffer the consequences later. So that’s how she described it. And in the book we go into detail about her experience and how it became very difficult for her to separate her multiplicity of identities, especially as a woman, and as a queer-identified woman, dealing with the kind of vitriol that she was responsible, essentially, for cleaning up. So that was a pretty stark one. (Laughs.) That was—that was tough. Yeah, thanks. POWELL: Let’s go to Catherine (sp). Q: Yeah. This is super interesting. And I actually have experience as an early comment moderator myself, because I was the sixth employee of the Huffington Post, who would get phone calls from heads of—like Dick Cheney’s office, calling and saying: Could you please take this negative comment down about the vice president? And we would—you know, it was from the Secret Service. So, anyway, lots of stories there. But my bigger question is, what—like, it sounds like you’re talking about the labor force and this unrecognized labor force. But then from what you just said, it’s the fact that we have this unbridled comment stream of hate, and how are companies ever going to really reconcile? Like, when is the moment where they finally say: We have to do something bigger than just moderate all day? ROBERTS: Well—(laughter)—what—if we can solve that this evening we can go find VC investment and we’ll resolve it.
But I think—you know, if I can sort of read into what you’re saying, I mean, I think your discomfort is on a couple of levels. One is, this is the function of—good, bad, or ugly, however you feel about it—Section 230’s internet-intermediary definition of these platforms, as being able to decide to what extent and for what purposes they will moderate. So that’s the first thing. But I think the second thing is a little less apparent. And it has to do with the business model. It’s not as if it was a foregone conclusion that the whole world would just flood channels with cat pictures, and this-was-my-sister-in-law’s-wedding, and whatever they’re posting or, you know, Nazi imagery or other—you know, terrorist material, child sexual exploitation material. But there’s actually a direct relationship on these platforms between the circulation of material that we call content—which already, again, I would say is a ridiculous, too-general category—and monetization, and treating that material as commodity. So what I’m getting at here is that the platforms are in a bit of a pickle, to say the least, about how they have developed a business model that’s predicated on a constant influx of new material. Why? Well, because they want us to come back. If it’s just the same stuff all day every day, they don’t think we’re going to come back. What is going to get the most hits from viewers? Is it going to be something really boring and uninteresting, or is it going to be the thing that’s just maybe this side of bearable and everyone’s talking about it because it’s viral, right? So these are the kinds of economics and logics that have been built up around the constant influx of content.
And so it’s gotten to the point where this computer scientist who was at Dartmouth, and is now at Stanford, who developed one of the primary AI tools to combat child sexual exploitation material, a tool that actually does work very well in that use case, pointed out in a paper that he wrote, and that I cited heavily in a paper I wrote recently: Look, what’s never on the table when I’m in boardrooms is, what if we slow down the hose of the influx of the material? That’s never under question. And he’s—for heaven’s sake, he’s the developer of this tool. And he’s the one thinking, hello, the always-on, constant, kind of unvetted uploading from anyone in the world is maybe not an awesome idea, right? Like, after the Christchurch shooting in New Zealand, which was a horrible massacre, that was maybe the first time you heard Facebook seriously question: Maybe we shouldn’t just let everyone in the world turn on a livestream and go for it. Maybe it should only be trusted users, or people whose info we have, or something, right? So we get back to this problem of the business model. And it’s kind of the elephant in the room. It’s the thing that they don’t want to touch, because that’s how they make their money. They monetize the content that we provide. I’d also say that we are unfortunately fairly implicated. I mean, look, I’m sitting here with my phone, tweeting, doing all of the things, right? We are implicated ourselves as users in being a part of the economy. But I can’t in good conscience tell everybody to throw out their phone and get off the platform, because I can’t do it. So there’s—I don’t know. You know, there’s a slow-food movement that came up a number of years ago because people were sick of the scary supply-chain industrialization of their food, right? And I often think about: Who’s going to come up with slow social media? Q: Yeah.
No, that’s sort of my—I have a friend who’s pretty high up at Facebook. And they’re complaining about how the guy who wrote that book, Zucked, or something, advertises on Facebook all the time. Like, the very— ROBERTS: Yeah, right? Q: But then they’re making money off of that. Which is like a terrible cycle. ROBERTS: Which is, like, also—yeah. And these people are probably completely disembodied from that ecosystem anyway, right? So I think one of the other things I’d just throw in the mix to think about is that we’ve hardly tapped any control mechanisms that might be available to us in other realms. So things like—again, like some of these regulatory things. Or even the fact that these firms have, for fifteen years, been able to self-define almost exclusively, without intervention, as tech firms. It’s not just because they have an allegiance to the tech world that they call themselves that. But what if they called themselves media broadcast companies? Well, what happens when you’re in broadcast media? Can you just air anything you want? I mean, George Carlin made a career out of lampooning the fact that you can’t, right? So, you know, one day at some point years ago I thought, let me just go look at the FCC’s rules on broadcast media and what you can and can’t do. Let me go find the comparable thing for social media—oh, right? And yet, they’re all engaged not only in soliciting our material, but now they’re engaged in production of their own material too. I think about YouTube as, like, the prime example of that business-model relationship, where we have literally people getting checks cut to them if they get a lot of views. So there’s a whole economy now, and the logic of the platform, that almost goes unquestioned and seems innate. And yet, it hasn’t been that long that it’s been this way—which is one of the things I’d like us to think about. I don’t have the solution, however. Remember, I— Q: More like, is there going to be a tipping point?
I mean, that’s what I—yeah, if you’re seeing it. ROBERTS: Yeah. I mean, I don’t—I’ll tell you this. Like, I don’t like to do prognostication because, again, I decided to do my French degree and not go to Silicon Valley in the ’90s. (Laughter.) But I don’t think—if I had to bet, I don’t think the pressure will come from the U.S. I think the pressure is coming from Europe. Yep, and they’re very, very worried about that. Q: Did you see that the Danes have an ambassador to Silicon Valley? ROBERTS: Yes, they do. I saw that. Indeed. Q: I was just in Denmark. And you know, these people think differently. And they’re going to think harder about the regulation issues. ROBERTS: But you’ll also see—you’ll also see social media CEOs be received as though heads of state. I mean, we’re talking about policy that rivals legal code. Q: And economies that rival maybe the GDP of some small countries as well. ROBERTS: Correct. Correct. POWELL: So we’ve got Rufus (sp), and then Kenneth (sp), and Abby (sp). Let’s go to Rufus (sp). Q: So, a two-part question. And they kind of play with each other. So this is mission critical from a brand point of view, and it supports their advertising, and, you know, you want to have control over your platform. But I’m curious in terms of is the—is it somewhat a resource problem? Like, are they just not investing enough in it, and therefore you have very bad labor practices, and that’s the problem? And then the second part of that, of my question, actually has to do with maybe how it’s different in China, because it seems like they moderate their content real well. (Laughter.) And they have social platforms— ROBERTS: Yeah. Let’s copy that model, right? Yeah. (Laughs.) Q: Yeah, no, but, you know, I’m just curious. Like, clearly they have control over their social platforms in a way that we don’t. And I wonder if there’s anything to learn from that or be afraid of in terms of we should control more. 
Does that— ROBERTS: Well, to answer the first question, I think—I can’t just say yes or no, right? I’m going to have to— Q: Sure. ROBERTS: Sorry. (Laughter.) I’m sorry. I think it is a resource problem, but it’s also a problem of prioritization. So how can I put this? This function, although it’s been present in some form, I would argue, since the platforms started, was never thought of as central. So it was always a bit of an afterthought, playing catch-up. And I think that the position of the activity within the firm has always lagged, essentially. There’s an interesting moment in this film called The Cleaners that I was involved in, where Nicole Wong, who was at the time the general counsel at Google, was up one night making content decisions. So there were people in the firms who knew—I mean, at those high echelons—who knew this was an issue and a problem. But, you know, it was sort of, like, someone else’s problem? And it wasn’t a problem that was seen as—it wasn’t a bucket that was going to generate revenue, right? It was a cost center. I mean, there are a lot of ways to slice that. I think you could argue, for example, that a PR disaster in the absence of this activity would be immensely costly. Or you could say that a company that has good, solid practices, and has an identity that maybe they even build around their content moderation, one that gives a character or flavor to the platform, could even market on those grounds. But the decision was made early on to treat this activity as secondary at best in terms of how it was presented to the public. I think that was also because they didn’t want to be accountable. They wanted to make the decisions and have the discretion to make the decisions. So because it’s always been laggard, there’s been this huge resource shift within the firms to figure it out. And, go figure, you know, if all you have is a hammer, everything looks like a nail. So the answer is: let’s get computation on it to solve it.
Well, one of the reasons that they want to use computation is, of course, the problem of scale. So despite there being maybe a hundred thousand people working in this—in this sector, that pales against the amount of content that’s produced. It means that just some portion, some miniscule portion of content is ever reviewed by humans. That’s one of the reasons why they want to use computation. But another reason—there are a few reasons. Another reason is because that’s what they’re in the business of doing. And that, again, also takes out this worry about rogue actors, people resisting, people making their own choices, making errors, disclosing to the media or others—academics, such as myself—what they’re up to, disclosing to regulators or others who might want to intervene. So there are other—so we should be suspicious about some of the motives around the computation. But I think functionally, at the end of the day, there are very few companies that could actually build the tools. I mean, we’re talking about bleeding edge AI implementation. When I started this research I went over to the National Center for Supercomputing Applications at Illinois. We were in the—in the cheaper side of campus, so I went over to the monied side where the big computational stuff was going on. And I went to this computer vision lab. Now, again, this is 2010, to be fair. But I went into this computer vision lab and I spoke to this research scientist. And I said, look, here’s the problem. Let me sketch out the problem for you. Can computers do this? Is that reasonable? And he said, see that over there? And he pointed at an oak table in the middle of this darkened cube—visualization cube kind of space. I said, yeah. He said, right now we’re working on making the computer know that the table is a table. Like, controlling for every—(laughs)—you know, aspect of the—we’re way beyond that today. But it kind of shows the fundamental problem. 
First of all, what does it mean for the computer to know? Usually it’s pattern matching, or it’s some kind of matching. So the best kinds of computational tools for content moderation are matching against something known. This is why the problem of child sexual exploitation can be effectively treated with a computational tool, because, for better or for worse, people who traffic in that material tend to recirculate a whole lot of material. So it can be put in a database and can be known. But for stuff that doesn’t exist anywhere else in the world, or is hard to understand, or has symbols and meanings that you have to be a cultural insider to understand, or you have to just be a human being to understand, there are only a few firms in the world that have the staff, money, and know-how to build computational tools for that. For many firms, it’s just cheaper to get humans. Now, your second question, about China. I confess to being an ignoramus when it comes to China. But I would say that, you know, just off the cuff, a huge difference is that Chinese companies don’t just spring up and do what they want from the start. I mean, they are—(laughs)—I mean, they are fostered by the state and they’re typically quite intertwined with the state at first. There is no Section 230 in China, in other words, right? And there’s probably a lot more labor to put on this in China, and more of a sensibility that it’s going on, I think, than in the United States. But people have creative ways around it, always. POWELL: I guess it would be harder to carry out your research in China too, to document what’s going on there. ROBERTS: I mean, yes. Although, you know, I should tell you, I have a new Ph.D. student coming in a matter of weeks. And he’s coming to work with me because he told me he wants to do comparative studies of the Chinese case of content moderation versus the United States case. And we’re on Skype and I’m, like, dude—shut up, dude. (Laughter.) You know?
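The "matching against something known" approach Roberts describes can be sketched in simplified form. Real deployments use perceptual hashes (PhotoDNA-style) that survive re-encoding and cropping; the plain SHA-256 lookup below only catches byte-identical recirculation, and the sample database is invented for illustration.

```python
import hashlib

# Hypothetical database of hashes of previously identified material.
# Production systems use perceptual hashing so that re-encoded or slightly
# altered copies still match; exact hashing here is a deliberate simplification.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-item").hexdigest()}

def is_known_material(content: bytes) -> bool:
    """Match uploaded bytes against the database of known material."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

print(is_known_material(b"known-bad-item"))  # True
print(is_known_material(b"novel content"))   # False
```

The sketch also shows the limit Roberts draws: content that has never circulated before has no entry in any database, so matching cannot flag it, and a human has to.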
Like, we’ll talk about it when you get here, man. I’m, like, all nervous, because I don’t know who’s listening. Yeah. So I think that work will come. And I think we need comparative studies, because I am limited by my cultural context, which is the American one. But that is an important one to understand right now, because of the global impact. POWELL: Kenneth (sp). Q: To what extent can you offer specific normative suggestions on how to improve content moderation towards the ideals that you have? ROBERTS: Well, I think—yeah, it depends on what we consider an improvement. I think for the purposes of the book, it has to do with working conditions. So let’s take that as the goal. And to get ideas around that, I’ve often relied on the workers themselves, since they’ve thought so much about what would help them. I think there are a few things—well, I think there are a number of things that we can think about. The first thing that comes out of everyone’s mouth, you won’t be surprised to learn, is: Pay us more. I mean, it’s sort of a flip response, but I think it says a lot, because when I hear workers say that I hear them say: Value our work more. I also think the secretive nature of the work is something that impacts the psychological difficulty of dealing with the work. So— Q: Excuse me. What are they paid? What’s the range? I mean, are we talking— ROBERTS: So I’ll give you a very recent example. In May, Facebook made a big announcement, sort of leading the way in this arena of how to better support content moderation workers. They’ve taken a lot of heat, so that’s part of the reason. And they announced that for all of their American-based content moderators who are in third-party call centers, or wherever they are in the chain of production, the base rate of pay would be $15. And in other metro areas, New York, kind of—San Francisco, high-expense areas, it would be a higher rate of pay. So fifteen’s the floor, and then going up from there. 
Q: My maid makes twenty (dollars). ROBERTS: So, right. So this raises some important issues. Q: That’s like basic minimum wage now. ROBERTS: For—right. We know also that, again, they’re a step ahead of the basic minimum wage that will be enacted in California, first of all. So again, thinking about how this—there’s a strategy of being ahead of regulation a lot of times. Q: But without benefits? ROBERTS: Well, right. And then the other thing that this brings up—there was sort of, like, a deafening silence from other industry players. I thought maybe some of them would follow suit. Q: That was way too high, yeah. ROBERTS: Yeah. But they haven’t. Google went on record and said that, I think, by 2022 they were going to get everyone there. Also, this was American-only, but we know that there is so much of this work that’s outside of the United States. Unless it’s a place where the mandatory minimum wage is higher, which might be in some European cases— Q: Not the Philippines. (Laughs.) ROBERTS: Correct. So it’s usually very low wage. The other thing that companies have started doing—Facebook is one, and others—is bringing on psychological support on site. Workers told me a bit about this in their case. And they said that while on the one hand that was a welcome improvement, because they didn’t really necessarily have access to those services, it was in some cases voluntary. And what ended up happening was that the therapist, the psychological-services person, would come at the appointed time, take a room in the department, and anyone could come and speak to him or her. So that worker who’s struggling and having a hard time has to get off the queue, tap out of his or her work, stand up, walk through the department, walk past the boss, walk past the coworkers, and go in and sit with the therapist—thereby letting everyone know: I’m struggling with looking at content, which is the precondition of my job.
So some of them said: It would be nice if that were mandatory, and if everybody had to visit with a therapist at some prescribed time. That’s another thing. I think benefits are another big thing. And I would also add that very little has been done by way of developing tools that could be supportive or assistive. When I talked to some of the workers, they were using outmoded, kind of homebrew solutions. Or, in the book, we talk about a firm that was using, like, Google tools—like, Google Docs, Google Chat—sort of kitbashed or kind of quasi-internally developed but really, like, just commercially available stuff. I think there’s a market for tools that would allow workers to do things like specify a queue that I’m not comfortable being in today. Like, today I just—if something comes in and it’s flagged child abuse, I just can’t see that today. I’m going to tap out. I’ll take the one that’s, yeah, take your pick, right? Rape threats. I’ll take that one. But, you know, when we as users report content, we usually kind of triage that material. So that could be used proactively on the worker side to allow them to opt out. And it’s not—you know, some days you can handle it, some days you can’t. These were the kinds of things that the workers reported to me. You know, usually I’m OK with animal abuse. That day I just couldn’t do it. One guy said, I just can’t take it when there’s someone screaming in the video. So maybe he could look at videos with audio off. So there are, like, little things that we could do. Making the screen black and white rather than color, or making the screen fuzzy, might be a tool. Again, based on and maybe tailored to a worker preference.
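The worker-side triage tool Roberts proposes can be sketched as a simple queue filter: review items carry the category flag users assigned when reporting them, and a worker declares categories they cannot face that day. Every field name and category below is invented for illustration; no platform tool is being described.

```python
# Split the review queue by a worker's opted-out categories (all names invented).
def filter_queue(queue, opted_out):
    """Return (items this worker takes, items to reassign to someone else)."""
    mine, reassign = [], []
    for item in queue:
        (reassign if item["flag"] in opted_out else mine).append(item)
    return mine, reassign

queue = [
    {"id": 1, "flag": "child_abuse"},
    {"id": 2, "flag": "rape_threat"},
    {"id": 3, "flag": "animal_abuse"},
]
mine, reassign = filter_queue(queue, opted_out={"child_abuse"})
print([i["id"] for i in mine])      # [2, 3]
print([i["id"] for i in reassign])  # [1]
```

The point of the sketch is how little machinery the idea needs: the category flags already exist from user reporting, so the missing piece is only the will to let workers use them.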
Workers told me that they would do things like look at the screen by squinting, so that they would only get—you know, they would know it was gory, and they could tell just by squinting if it was too much blood, according to the rules, or too much kind of violence, and then they wouldn’t have to, like, experience the whole thing. We could develop tools that could do that for them, right? And maybe if they felt like, I need—unfortunately I need a closer look, they’d press the thing to unveil the entire image. So there are—I think there are a lot of things we can do; it’s just frankly not been prioritized, right? It’s not the thing that’s going to—it’s not the new function that they’re going to blast around. POWELL: So we have two more questions. I think we can—oh, OK. Q: Sorry. POWELL: No, it’s fine. (Laughter.) So let’s see. We might have to get the last two together. But let’s go to Abby (sp). Q: Sure. So it’s a bit of an expansion on the question that Kenneth (sp) just asked. But what do you think the changes to the labor workforce would be on the actual product, which is the moderation? So let’s hypothetically say we have a workforce that is appropriately compensated, that is centered, maybe directly employed. How would the product of content moderation change, in your view? What would look different to the user? What would look different to the company? ROBERTS: Well, again, I think there’s sort of a fundamental missed opportunity in the fact that the work was rendered secret, whereas again there are all sorts of experiences we have in our daily life where we look for expertise and curation. So what if we thought of people who did content moderation not just as cleaners or janitors, or people who sweep up a mess—which, of course, are important activities but are typically undervalued in our daily life. But what if we thought about them as people who were curators, or tastemakers, you know? I don’t know, sommelier of the internet.
I’m just making stuff up, so please don’t—(laughter)—don’t say, that woman said sommelier of the internet. But, you know, people who can help be a guide rather than an invisible agent. I think that secrecy has really hamstrung the circumstances for the workers in a lot of ways. I think thinking about—I wasn’t able to get into this in the talk, but there’s a whole host of productivity metrics that are laid on these workers in terms of how much stuff they’re supposed to process in a given shift, for example. When I was in the Philippines, the workers described to me that they used to have something like thirty seconds per review, per item that they were looking at. And it had been cut to more like ten to twelve seconds. Now, another way of thinking about that is that their expected productivity had been more than doubled. Or that their wage had been cut in half vis-à-vis productivity. So I don’t think anyone benefits from a ten-second look. I don’t think the workers benefit. I don’t think users benefit. I don’t think the ecosystem benefits. Ultimately, I mean, from just, like, a cost-benefit analysis on a balance sheet, I guess that comes out looking good for the firms. But I don’t think in a perfect world that we get any kind of quality, any more than we think of a McDonald’s hamburger and, you know, I don’t know, a farm-to-table meal as the same thing. They’re fundamentally different. Q: What you’re saying is things get through that shouldn’t and things that should go through don’t? The famous image of the girl in Vietnam. You know, you all know that. ROBERTS: That’s right, “The Terror of War.” Q: Right. ROBERTS: Now— Q: You know, they just don’t do it very well. ROBERTS: Right. And you have ten seconds, and you’re a twenty-two-year-old college graduate in Manila, you’re educated—you asked a bit about demographics. All of the workers I talked to were college grads.
That has shifted somewhat now, but the workers in Silicon Valley in particular were grads of places like Berkeley and USC. But they had, you know, such as yours truly, made the unfortunate decision to major in econ, or history, these other—I’m kidding, right? (Laughter.) Like, I mean, I think these are very important disciplines. But they—you know, to be employed in STEM or to be employed in the valley, they were, like, kind of not prized disciplines. And yet, they actually had the acumen and the knowledge to make better decisions than some of their peers would. POWELL: So let’s collect Lawrence (sp) and Donna (sp) together, and then let you make concluding remarks. ROBERTS: OK. Q: So you’ve been discussing the irregularities, inconsistencies in the workforce in terms of particular categories of content, which need some measure of moderation—sexualized images, violence, and hate speech. But all these, I think there’s some margin of error. In the case of sexualized imagery, A, it’s—they’ve been able to quantify it, to some extent. I’ve had pictures I took at major museums that were censored because they thought the content was oversexualized. I thought it was silly, but so what that they censored it. Which way they err doesn’t bother me very much, unless it’s child pornography, and you say that they have pretty good methods for that. In the case of violence, again, I hope they err on the side of eliminating violence. It’s not a First Amendment concern or something like that. In the case of ethnic—things that stir up ethnic discord, such as what happened in Myanmar, again, I hope they err on the side of eliminating that kind of hate speech. But what really concerns me is inaccurate—is false content, often spread by governments, the Russian and other manipulation of the U.S. elections, the Chinese in the Taiwan elections, others in Europe, where it’s a question of facts. And here you have huge competing values. 
You have—here, you’re talking about real political issues, and governance issues, and it really should be a First Amendment right to speak on these issues. And yet, this false information is doing—particularly deliberately spread false information—is doing enormous damage to democracies around the world. So how do you begin to train people to moderate that, which is far more critical? If anything there’s, to me, less room for error. Less room for error in censoring what should be allowed. And it’s quite a tragedy that so much of this is being propagated and that we’re unable to control it. So how do you begin to deal with that? How do you train a workforce to deal with that? POWELL: So we’re going to move to this question. We’re going to collect Donna’s (sp) question as well, and then you can answer them together. ROBERTS: All right. Q: Sarah, I don’t know how you remember that. I’ll make mine, I guess, kind of simple. You are familiar with the Verge—the Verge articles that have come out? ROBERTS: Yes. Q: I guess one question I have for you is I’m trying to get my head around listening to this and saying: What is it that’s really concerning you? Because part of this conversation has been about the worker, about the human piece of it. You have asserted—and I’ll say it’s an assertion—that technology can’t clear—can’t significantly reduce the gap, it seems. And then we’re talking about the social media companies, but we know that this is an internet issue. It is not just a social—it is not just Google, YouTube, and a Facebook issue. So it’s like, when you sit there and look at that—so I was trying to figure out too, OK, is your angle, you know, this is—we’ve got to go after—is this about Facebook and Google? Because if you think about it, right, they’re cleaning—they’re required, in essence, because they are commercially operating a channel, to keep that as clean as they can. And we do regulate that a little bit, right? 
But the fact of the matter is, our challenge in the content era is this content can show up anywhere on the internet, on any—you know, any website. And that’s the challenge. I’m sure if you followed, you know, child pornography, right, they’re not just looking on social media channels. They’re going to find it anywhere, including the Dark Web. You know, anywhere, parse video. So I guess it’s, like, who are we as a society looking to to address this issue? And I guess, is it the worker piece that you’re—are you—and I understand there’s a big issue with humans, you know, involved in the processes. POWELL: You have approximately a minute and a half to answer both questions. (Laughter.) ROBERTS: So the answer to your question is, yes, it’s the worker welfare piece that first compelled me, yeah. And I think I wanted to address my remarks for an audience that I thought would have maybe more direct relationship to policy issues and regulation. But that’s—the book is concerned with the worker welfare, and that’s what my concern has always been, and that was my point of entry. I think what I found is that you can’t really carve that out somehow from the other issues. So for me, that was a foot in the door to now I have to understand the ecosystem. So what I tried to do was also map that out to a certain extent. I’m not certain that—(laughs)—I mean, I’m not sure I would necessarily agree with you, per se, in the way that you framed up the issue of it’s not an XYZ issue, it’s an internet issue, in the sense that I would say this: I find it difficult to, in the American context, locate many internet platforms or services that are not commercial. And that’s part of my—you know, that’s part of the claim that I make of why there is an ecosystem of this work going on. 
It’s because there was great profit to be made in setting up channels that encouraged people to upload, and to do it all the time, and to actually, in some cases indirectly but in other cases directly, monetize that activity. And that is fundamentally different from what the internet used to look like, which was not—I’m not Pollyanna about it. It wasn’t the halcyon days. In fact, it was a real mess for a lot of the—a lot of the interaction. But it was a different kind of mess and a different set of problems. So that’s sort of the conceit here. But it’s not some—you know, it’s not—it’s not a simple case of exploitation writ large without any other complexities. And it’s not a simple case of Facebook is trash, and sucks, and should close down either. Which has put me in the weird position of, like, working with these people, right, to problem solve. The other question was about basically veracity of information and gaming of the platforms. The one soundbite I’ll give you with that is I think that the issue that you raise is fundamental to the protection of democracy around the world. And I would also say that it’s much harder to make determinations about those issues than it is to know if too much of a boob is showing. And so what the companies tend to do—and I call them on this all the time. I say, you are—your level of granularity on things that maybe don’t matter is in the absence of your ability—or your willingness, let’s say, to articulate your own politics. Because guess what? Other countries where these platforms are engaged don’t have the same commitments to democracy, or to freedom of expression, or whatever it is. And they want to be in the Turkish marketplace, and they want to be in China. And that’s put them on the ropes, and put others in the position of making demands on the firms of, like, well, what are your commitments? Well, they’re very mushy middle. 
And so then it’s easier to look for and take care of, in a way, some of this content that is obviously bad, versus sitting and spending time, and money, and energy figuring out is this truthful or false? Is this from a vetted source, or is this propaganda? And I think, just to close out, your point that state actors are the ones who should be scaring everybody the most is a great point, for sure, because those are the folks, like you said, who are calling up Facebook and saying: Take down blah. POWELL: Yeah. We should end it there, but please join me in thanking Sarah Roberts. ROBERTS: Thanks. (Applause.) (END) This is an uncorrected transcript.
  • Sustainable Development Goals (UN)
    Taking Stock of the UN Sustainable Development Goals
    This week, representatives from UN member states meet to discuss progress on six goals of the 2030 Agenda for Sustainable Development.  
  • Labor and Employment
    Women This Week: ILO Institutionalizes #MeToo
    Welcome to “Women Around the World: This Week,” a series that highlights noteworthy news related to women and U.S. foreign policy. This week’s post, covering June 21 to June 28, was compiled by Mallory Matheson and Rebecca Turkington.
  • Labor and Employment
    Beyond Unemployment
    In modern economies, people may have jobs, but they still harbor major concerns in a wide range of areas, including security, health and work-life balance, income and distribution, training, mobility, and opportunity. By focusing solely on the unemployment rate, policymakers are ignoring the many dimensions of employment that affect welfare.
  • Women and Economic Growth
    The Economic Gains of Gender Parity
    Kim K. Azzarelli, Jamille Bigio, and Richard Fry analyze factors underlying the global gender wage gap and discuss the benefits of gender parity, with Elmira Bayrasli moderating.
  • Global
    CEO Speaker Series With James Gorman
    James Gorman discusses his approach to setting a global strategy for Morgan Stanley and the importance of effective leadership and clear communication when running a large multinational company.
  • United States
    The National and Economic Security Imperative of Helping More Americans Adapt and Thrive
    By Penny Pritzker, chairman and founder of PSP Partners; and Edward Alden, Bernard L. Schwartz senior fellow at the Council on Foreign Relations. (Note the following excerpt is from a chapter written for a new Aspen Strategy Group book called Technology and National Security: Maintaining America’s Edge. You can find the full chapter and the book here.) The United States today faces twin challenges — building its global leadership in the next generation of transformative technologies and rebuilding economic opportunities for more of its citizens. The first cannot be done successfully without also doing the second. Innovation and competition are the great drivers of prosperity, but they have also created a growing gap between the economic winners and those struggling to get by. Unemployment in the United States has fallen below 4 percent, and the well-being of Americans has been improving as the economy continues to grow at a strong pace. Yet four in ten US households still report that they are unable to cover an unexpected $400 expense without borrowing money or selling something they own. More than a decade after the last recession, economic insecurity remains widespread. This continued economic insecurity poses a growing and fundamental threat to America’s economic competitiveness and national security. While technology and global competition have helped raise incomes and living standards around the world, they have also created huge new challenges in the labor markets of many of the advanced economies, from the disappearance of once well-paying manufacturing jobs to the growth of the gig economy and other contingent work that comes without traditional employment benefits. Americans need far better access to the education and retraining opportunities required to prosper in this rapidly changing economy, and government support systems must be updated so that working Americans can again have greater confidence about their futures. 
The reality is that for more than thirty years we have failed as a nation in this regard. In the United States, where the social safety net is especially porous and support for job retraining is weaker than in any other wealthy country, labor market disruption has already contributed to social and political upheaval. Donald Trump was elected president in 2016 on a platform that promised greater restrictions on both international trade and immigration to the United States, blaming both for the economic challenges facing many Americans. Since taking office, the president has approved the largest increase in tariffs on imports since the 1930s, has slashed refugee admissions to their lowest levels since the refugee program was created in 1980, and has taken a series of steps to reduce the entry of highly skilled immigrants to the United States. Such restrictions on trade and immigration will erode America’s technological and economic leadership. Immigrants today — many of them initially attracted by the high quality of American universities — are more than twice as likely to start a business as native-born citizens; from 1996 to 2011, the business start-up rate for immigrants increased by more than half, while the native-born start-up rate fell by 10 percent, to a three-decade low. Of the eighty-seven start-up companies that had reached a value of more than $1 billion by 2016, immigrants founded more than half, and over 70 percent had immigrants as part of the top management and product development teams. On trade, internationally engaged American companies — those that both export and invest abroad — are America’s most innovative companies, accounting for nearly three-quarters of private sector research and development. The success of these firms depends on markets that are open to both trade and investment. 
And while the United States has imposed few restrictions on the deployment of new technologies, some 75 percent of Americans today are worried about a world in which computers and robots do more of the work, fearing for their job prospects, their family’s future, and that inequality will worsen. Polls indicate that the public does not favor tariffs on imports, sharp restrictions on immigration, or regulations that curb technological innovation. But the public is wary about what technology and global competition mean for their jobs and their future. Public support for economic openness can no longer be assumed; it must be rebuilt. That requires rebuilding the connection between economic openness, innovation, and better work and life opportunities for Americans. The US education system must do a better job of preparing Americans for the world of work by expanding career-related offerings; better support is needed to allow mid-career workers, or those displaced by technology or trade competition, to return to school and retrain for new careers; and the benefits that are now available to most full-time workers — health care, sick leave, vacation pay — need to be available to everyone with a job. Improving and rebuilding the links among education and workforce training, good jobs, and greater economic security is vital to our future security and economic competitiveness. As technological change is accelerating, the United States needs to show the same level of public and private commitment to meeting this challenge as it showed when the country transitioned from an agrarian to an industrial economy just over 100 years ago. Meeting the twin challenges of technological leadership and rebuilding opportunity must be the primary goals for US economic policy. Given the seismic forces of innovation, automation, and globalization, the nature of work is fundamentally changing; we must help more Americans adapt, adjust, and thrive. 
America needs a more forward-looking, comprehensive economic competitiveness strategy that includes an innovation leadership agenda, modernization of our workforce training and education systems, immigration reform, and expanded multilateral trade. If the United States fails to meet these challenges, it will have neither the resources nor the political support needed to play a large global role. The United States won the twentieth century because it finally got the big challenges right — education, scientific excellence, innovation, immigration, and trade. Yet, in recent decades we have not done all that we can as a nation to adapt government policies and approaches to the rapid pace of economic and technological change. Too many Americans have been left behind by the rapid changes in the economy, without the necessary tools and resources to prosper. The reality is we can do better. With diminishing opportunities, it is not surprising that Americans have been susceptible to populist promises. The United States has been here before and risen to such challenges in the past. We must do so again as our national and economic security depend on it. (For the full paper, go to https://www.asgbooks.org/technology-national-security/)
  • India
    The Interim Indian Budget: Jobs Are the Bigger Issue
The Indian government released an “interim” budget today, a document designed to provide continuity for the next three months or so before the general election takes place and a newly constituted government takes office. This interim budget in many ways resembles a continuing resolution in the U.S. system, a “holdover-keep-things-running” function that typically receives almost no notice in the United States. (Unless, of course, one doesn’t pass and the government shuts down.) By contrast, the Indian media are treating this three-month interim budget just like a full-fledged budget, devoting attention to its meaning and its implications for industry, farmers, the middle class, and foreign investors. This is because the scale of the proposals in the budget suggests a roadmap for the entire year rather than a holding exercise. Should the next Indian government, once it is constituted at some point likely around late May, have a different focus, then it would be worth having a more extensive budget discussion at that time. But the question of the hour really has to be what to do about jobs. Against the backdrop of mounting bad news on the jobs front, it’s no mystery that the interim budget would offer support to farmers, tax relief to the middle class, and an opt-in pension plan for informal sector workers. These are palliative measures, however, that do not provide broader recommendations for how to address whatever is hindering job creation in the Indian economy. The National Sample Survey data on unemployment, whose release has been much delayed, leaked to Business Standard earlier this week, and the information should wake everyone up to the urgency of the problem. According to this report, the survey finds headline unemployment at an over-forty-year high of 6.1 percent.  
If unemployment in India is indeed as high as the reported National Sample Survey figure—keep in mind that the nongovernmental Centre for Monitoring Indian Economy unemployment estimate for December 2018 was even higher at 7.3 percent—then all parties should focus like a laser on what they can do to solve the problem. Answering the jobs question with the best possible policy proposal requires the best available data. Here’s an example. It has been politically impossible—across parties—to implement extensive structural reforms that would allow India’s manufacturing sector to flourish, as has been the case in East and Southeast Asia, and even in Bangladesh. In Bangladesh, the ready-made garment industry has been a pathway out of poverty for around four million people, mainly women. As China moves out of the garment industry, countries like Bangladesh and Vietnam are picking up the slack, and this could be an opportunity for India as well. How well have the changes to India’s textile sector that were initiated in 2016—changes that allowed for seasonal employment and eased labor laws that disincentivized hiring more than one hundred people—helped with job creation in the industry? It has become abundantly clear that demonetization and the complexity of the Goods and Services Tax rollout hurt small businesses. So public policy ought to focus on how to improve opportunities for small businesses and help them grow, because they are engines of job creation. That means I agree with the question posed by Takshashila Institution Cofounder and Director Nitin Pai: “What is your plan to create 20 million jobs every year?” That’s the question that ought to be everyone’s top priority. My book about India’s rise on the world stage, Our Time Has Come: How India Is Making Its Place in the World, was published by Oxford University Press in January 2018. Follow me on Twitter: @AyresAlyssa. Or like me on Facebook (fb.me/ayresalyssa) or Instagram (instagr.am/ayresalyssa).