>> Well, now that we have a better sense of how our election might work this year, let's move our exploration forward to consider the effects of technology on elections more broadly. We all know that casting a ballot is the most obvious form of political participation. But there's a very large realm of informal political participation, where citizens inform themselves, where they act together with others to communicate their interests, and where they engage in dialogue. We can call this the public sphere, and media such as newspapers, radio, and television play a very large role here. In our technological age, there is a digital public sphere where social media platforms and search engines are the main players. It's no surprise to hear that many people around the world acquire information from online sources, even when that information is produced by old-world media. The cost of producing and distributing content, whether text, images, or video, has in this new world gone to zero. It has allowed a massive democratization of voice. That's the upside of the new world we live in. Everyone is both a producer and a consumer, and can easily distribute information online. Digital technologies empower people to produce and share content and to act collectively, all with access to more knowledge than ever before, and across borders and distances previously considered insurmountable. Think, for example, about the Arab Spring, the Umbrella Revolution in Hong Kong, and many of the color revolutions across the world, as well as movements like Black Lives Matter, Me Too, the Yellow Vests in France, and the youth-led climate action in the US and Europe. None of these would have been possible without social media. It's a form, to put it simply, of hashtag activism, where social media is the primary or predominant form of communication. But on the other hand, these very same social media forces that connect individuals the world over and allow them to produce and share information lead to a superabundance of content, an ocean of information that threatens the very health of a functional public sphere. Nowadays, we need a search engine to find the information we want, and algorithms curate what we see in our news feeds. This flood of content has contributed to the rise of a so-called post-truth politics, in which citizens, lacking easy ways to sort good from bad information, are susceptible to misinformation, disinformation, and even straightforward propaganda. Social media and other large technology companies will often tell us they're not responsible for the content on their platforms, for the simple reason that they don't produce it or write it; the users who act on the platforms do. However, the business model of the dominant tech firms, which is about capturing and then reselling users' attention, can reinforce or exacerbate political polarization by locking people into filter bubbles and echo chambers where the like-minded speak to the like-minded. That ultimately increases the distance between political discourse and reality. I think it's fair to say that we all have an understanding of what a social media platform is. But we don't yet really understand what companies like Facebook, YouTube, Twitter, Instagram, and TikTok do to the information ecosystem in our societies. To explore these ideas, let's hear from Joshua Tucker and Joan Donovan. Joshua Tucker is a professor at NYU and one of the co-founders and co-directors of NYU's Center for Social Media and Politics. 
He's also the editor, with Nate Persily, whom we met earlier, of a great book on social media and democracy. Joan is the research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard. She leads the field in examining internet and technology studies, online extremism, media manipulation, and disinformation campaigns. Let's start with what she thinks about when she thinks of social media. >> When I talk about social media, I think about it as a three-part concept. First, I think about it as a tool. That is, some people are just going to use Facebook to keep in touch with relatives, and it's like a telephone in that sense. They're going to put their Facebook event page up or whatever, and invite people over for cupcakes. It would be really great, in the pandemic, to be able to invite people over for cupcakes, wouldn't it? That would be awesome at this stage. When I think about people using it that way, I think about it as a tool. But it could also be the case that you're an advertiser and you want to use it as a tool, and it helps you reach people who are really interested in your brand of cookies or whatever. But then I think about it as a tactic. When I think about movements, I'm thinking about it less as a tool and more as tactical strategy: which social media platform helps you get your message across, or are all of them equal? What we see most in the ways that different social movements operate, as well as the nefarious people that I study, like white supremacists and violent misogynists, is that they are multi-platform actors. They have to spread out across every platform. That redundancy and the overlapping networks they're creating are a buttress against some kinds of content moderation. Then the third thing I think about is social media as a territory, and that's where the governance question comes in. Which is to say, if you are in charge of overseeing that territory, you do have a responsibility for ensuring the safety and security of the people in it. Maybe it's not useful to think about these technologies as one-to-one communication mechanisms. Broadcast certainly has public interest obligations, so I think that when we get to governance and social media, we also have to ask: is this a broadcast technology? If so, what are those responsibilities? Then the other piece of governance that I'm often vexed about is these shifting obligations. Right now, social media companies are saying, "Hey, yeah, legislate us, whatever, it's fine, we'll just adapt, whatever you want to do." Meanwhile, they're pouring millions of dollars into lobbying to prevent any [LAUGHTER] kind of regulation. The things they tell us they're willing to accept are not the things they're actually trying to move inside the halls of Congress and across different nations. There is a strong impulse to regulate platforms at this stage, and it seems to be coming from both sides of the aisle. But we'll see down the line. I don't think we're going to get any movement on it until after the election. >> Josh, before you answer, I want to amplify a version of Marietje's question there with some of the material Joan just gave us, because I was also struck by your use of the word "tool." Joan just gave us this framework: it's a tool, it's a tactic, it's a territory. 
Joan also expressed that thought really succinctly: how the mere introduction of something seemingly so trivial, the Like button, is a design choice on social media platforms. When I like something, I'm feeding information to the platform about what it should show me next algorithmically. When we talk about filter bubbles, echo chambers, and polarization, if I'm feeding the platform information through my likes about what it should show me to keep me on the platform, the hypothesis is that it's going to feed me things that are close to what I already believe, because that's what I tend to like. Who wants to be shown news or updates, over and over again, that piss you off or make you upset? When I think about that fact, that the seemingly trivial design choice of the Like button, amongst lots of other things now, of course, helps to constitute this information ecosystem, it doesn't seem like a tool anymore. It's not just sitting there, neutral. There are some baked-in value choices by the platforms themselves that make it something different than a tool. Then, to govern it: if this is our civic public square, if this is where information now is distributed and consumed, and if it's all about ad revenue, where are the responsibilities? The governance question can't just be about content moderation on the platform decided by Mark Zuckerberg or whomever. There has to be external legitimacy, it seems to me. Push me beyond the tool, because I'm not happy with that. >> Maybe a better word is an arena: a place in which political actors contest for power, the same way political actors contest for power through elections and in other venues. I'm going to stick with my position that it's neither good nor bad for democracy, and I'm going to push back on you and play a little devil's advocate here for a moment, Rob. Trust me, I'm going to flip in 30 seconds and stop defending the social media companies. But the great horror of this product is that it's trying to give consumers a product that they like. Literally, you are coming after them for a Like button. Imagine they had put a Hate button on there instead, and then imagine the discussion we'd be having. Just keep in mind, the great criticism you are making here is that the social media information ecosystem has somehow become completely corrupted because the companies are trying to provide their consumers with a good experience. We don't talk about movie theaters being completely corrupted because they track ticket sales, see which movies do better, and then try to bring in movies that more people are going to come see. We don't talk about ratings on television shows having completely corrupted the entertainment industry. Maybe we do talk about that and I'm just in different rooms when people are talking about the dumbing down of entertainment; probably people do talk about that. But I do want to pick up on Marietje's point, and your point as well, about governance and unintended consequences. Absolutely, the number one thing that people are calling for in terms of governance is regulation of speech: that there should be some speech that's not permitted on these platforms. In particular, lately, what we're talking about is whether or not there should be governance around the speech of politicians who are running for office. 
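[Editor's note: to make the like-driven feedback loop Rob describes above concrete, here is a minimal toy sketch in Python. It is not any platform's actual ranking code; the one-dimensional "lean" score, the half-exploit/half-explore feed, and the profile update rule are all illustrative assumptions. The point is only that when likes update the profile that selects the next feed, the feed drifts toward the user's existing views.]

```python
import random

random.seed(0)

# Toy catalog: each item leans somewhere on a -1..+1 opinion axis,
# a stand-in for the rich feature vectors real rankers use.
CATALOG = [{"id": i, "lean": random.uniform(-1, 1)} for i in range(500)]

def build_feed(profile, k=10):
    """Half 'more of what you liked', half exploration (an assumption)."""
    closest = sorted(CATALOG, key=lambda it: abs(it["lean"] - profile))[: k // 2]
    explore = random.sample(CATALOG, k // 2)
    return closest + explore

def simulate(true_lean=0.8, rounds=15, step=0.5):
    profile = 0.0  # the platform's inferred profile; it knows nothing yet
    for r in range(1, rounds + 1):
        feed = build_feed(profile)
        for item in feed:
            if abs(item["lean"] - true_lean) < 0.25:  # user likes nearby views
                # The feedback loop: each like pulls the inferred profile
                # toward the liked item, so the next feed skews further.
                profile += step * (item["lean"] - profile)
        like_minded = sum(abs(it["lean"] - true_lean) < 0.25 for it in feed)
        print(f"round {r:2d}: inferred profile {profile:+.2f}, "
              f"{like_minded}/10 feed items already like-minded")

simulate()
```

[Running this, the inferred profile converges toward the user's true lean and the share of like-minded items in the feed rises, which is the filter-bubble dynamic in miniature.]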
>> The first thing I want to say is, I completely get where this is coming from. There are horrible things that happen online, and they don't happen randomly. As a white adult male with resources in this country, I'm much less likely to get attacked in these ways than lots of other people are. I watched what happened to [inaudible] in real time; it was devastating, it was horrible. I completely get where people are coming from with these calls. It is all the worst things about power. It's all the worst things about anonymity. Yet, at the end of the day, as you said in the middle of your question, do you want to allow Mark Zuckerberg to decide what gets to be said in political contests? Joan has heard me say this before. If you went up to your liberal friends and said, "You know what I really think we should do? I think we should give ExxonMobil the right to regulate how people talk about politics, what is acceptable speech and what's not," people would say you're crazy. That's a giant, powerful, multinational corporation with massive financial interests in something that's destroying the planet. But Facebook is ExxonMobil. It's a giant company. It's not producing oil, but it is a giant, powerful political actor. My background is in comparative politics; before I got into social media and politics, I studied the politics of post-communism. So when people say, "Well, Facebook just has to do something, it has to start weighing in on the scales and not let politicians tell lies on the platform," fair enough, but what if you woke up tomorrow and Mark Zuckerberg had been replaced by Donald Trump Jr.? That's what happened in Russia with VKontakte, a Facebook-like platform that was run by someone who wasn't cooperating with the Kremlin; he got pushed out of the country and replaced by a Kremlin crony. We have to be careful what we wish for here. Governance is tricky. What we don't want to do is make bad policy. What I would say at this point is that the huge barrier to making good policy is that we don't have enough information about what is going on on these platforms. If we want to make good public policy around regulation, the first regulation we need, let's call it the regulation to rule them all, is access to data, so we can understand what is going on on these platforms and see what the consequences are. Facebook is being used by billions of people. Small regulatory decisions could have massive impacts on the world. As Joan says, there are lots of people with very low levels of resources who use Facebook to organize. If you put in things that make it much easier to shut down content but have an appeal... [OVERLAPPING] >> I'm going to channel Donald Trump here, just yelling in the Zoom. I would add that there was democracy before Facebook, and these groups will figure it out if there is no Facebook. The point is that a lot of these groups have left, because they've been harassed, because they've been impersonated, because their voice has been tarnished. For some of these groups, their base doesn't even want to be there. The moment we're in now is qualitatively and quantitatively different from the heyday when people started really using this. 
If we want to talk about who's using Facebook right now to get organized, the most significant event organized this year on Facebook was the pro-gun protest in Virginia on MLK Day, where a bunch of gun enthusiasts and gun rights advocates showed up armed, 10,000 of them. That's who's using Facebook now to get organized. We don't want to conflate this with where everyone is organizing: some people still organize there, but the bigger networks and organizations have moved on and are now using newsletters. [OVERLAPPING] >> I would push back. Even if that's happening in the United States, Facebook is a tool that's used globally by people to organize. If you begin to make these changes in governance in the US, it opens up the possibility of making these changes in governance the world over, which may have unintended consequences. [OVERLAPPING] >> In other countries the context is different, and I would suggest that, yes, we need different policies in different countries, especially to deal with creeping authoritarianism. The problem isn't that small groups of people are using Facebook to get organized; by and large, that happens and nobody pays attention to it. It's when a regime pretends to be a social movement, or a hundred journalists, or an entire town, and starts to attack, and in some cases tries to commit genocide against, smaller groups of people. The way in which Facebook tilts the balance of power is something that Mark Zuckerberg alone is in charge of. It's not the case that Facebook is beholden to any board or any oversight. Facebook is a very difficult case to think with in this moment, especially as we know that people with a little bit of money, a little bit of resources, and a whole lot of power are using it to silence and oppress other people. I would say that Facebook now has less benefit than it did before for folks trying to get organized, and that is a global problem. At the same time, it's opened up a whole new bag of tricks for disinformers and media manipulators to reach people directly. That's where I'm just like, I don't know how to parry that with "well, just give us the data," because once we get the data, we still have no leverage to say what should happen. The hard part about the data question is that I come to Facebook as a user, not because I think some social scientist in Cambridge is going to analyze the pictures of my dinner I've been posting. That's not what I come there for. The way academics have tried to wrangle the data is also an issue here, because if we go a layer below in terms of governance, we get down to privacy, and then it all gets lost.