The Evolving Discourse of Social Media

A conversation with Renée DiResta


Transcript

Aaron Powell: A lot of the issues that seem to be hot button when we're talking about digital conversations, communities on social media, aren't new. There have always been communities of humans talking with each other and passing information around. And sometimes that information is good. Sometimes it's bad. Sometimes misinformation spreads through these communities. There have always been levels of toxicity and hostility and canceling each other and so on. But we don't get as worked up about these issues or see them as pressing or demand that something be done about them outside of digital spaces the way we do now. So what's different? What makes digital spaces flash points for worries about these standard features of human communication?

Renée DiResta: That's a great question. I would say the first thing that comes to mind is norms. We have, in a lot of ways—as you note, when people talk to each other in person, or even virtually as we're doing now—a certain set of norms, the ways that you've become culturally attuned to how to have a conversation with somebody. And even if you're doing that in a big group—lecturing at a town square, sitting in the proverbial public square, having conversations—there are expectations about how we comport ourselves. Somebody who comes in and is a heckler, that's really frowned upon. That's not a thing that we want to have. We want to have that discourse. We want to have that dialogue. And that's because when you're sitting there in the confines of a physical space, you're listening with your ears, you're looking at the person; there are ways in which you engage and ways in which you ascertain whether somebody is communicating with you in good faith, signals you look at to determine if they're lying to you, or to determine how trustworthy they are. And virtual communication removes the vast majority of that, so you're operating on a very, very different structure.

And the tired media-theory trope—but it's quite true—is that the structure really, really shapes the outputs and the style of engagement. And so social media became this vast, many-to-many environment where we could all talk to each other, where we could all say whatever we wanted to whomever we wanted. And there were certain aspects of it where the norms had not yet been established. And in some ways, a lot of the norms around how to have a discourse—you know, now it's a meme, right? How do you have the discourse?—but how to have conversations online became very much rooted in: are you dunking on your enemies? Are you owning your enemies? You want to get attention. You want to differentiate yourself from the horde of other users who are posting on the internet as well. So you have a system of incentives that is mediated in part by algorithms, not by interpersonal communication. You're performing not only for the people that you are trying to reach, your human audience, but also for an algorithmic overseer that's going to determine whether your content, your commentary, is seen by more people.

So there's a very different incentive structure and a very different set of norms that's evolved in this space, one that I think feels, at times, hostile and toxic in a way that in-person interaction does not: the same two people who fight on the internet might be able to have a perfectly civil conversation face to face. There's actually an effort looking at this called America in One Room. Could you bring people who were politically divergent into a physical space? Turns out they actually engage quite well. Put them back online and all of a sudden you are a member of a warring faction performing your identity for all of the observant bystanders who are taking cues on how to behave. And in many ways, the most toxic people, the most extreme viewpoints, got the most attention and came to serve as avatars of what it meant to be a good member of a particular identity or community. And I think that over the last, maybe, seven years or so this has solidified into some pretty terrible online norms. And that's where that tension is felt by a lot of people who don't necessarily know how to react to it or how to behave in that space.

Aaron Powell: Is this a result, then, of the digital mediation or intermediation between us? You and I are talking digitally right now, but it's this kind of weird halfway because I can see you and so you can read my body language and I can read yours, which is different than when we're interacting on Twitter or Threads or Bluesky or something like that. But that's somewhat distinct from the algorithm creating the incentives because we could imagine social media without the algorithm. This is the reverse chronological feed that people demand should be the standard for every social media platform. We could also imagine … I mean, in-person has its own sorts of algorithms. Like, as you were describing this: you become kind of more radical, more shrill, and that signals that you're more serious in your community, it gets you more attention. You know, this is ... the history of emerging religious faiths …

Renée DiResta: Yeah, absolutely. I think Cass Sunstein's done a bunch of stuff on this in the pre-internet era, too.

Aaron Powell: So is this just a matter of, if we could tweak the algorithm to emphasize more civil communication, that would fix it? Or is this basically an unfixable problem within digital spaces where, like, if you and I disagree, I might yell horrible things at you within the anonymity of an online space, but if I have to actually see your pained reactions to my words, I soften them? I'm not as much of a jerk in person as I might be online. And that seems unfixable unless we just all go to video chats.

Renée DiResta: There was this interesting experiment during the pandemic: Clubhouse, which I know still exists, but which was really kind of a big thing during the pandemic. I really enjoyed it, actually. I really enjoyed it a lot because you could hear people's intonation. There was no video, it was an audio chat, for those who weren't on it, but it felt like you could have a little more of a dialogue because you could hear the other person. And people who were good storytellers, good interviewers, would really thrive in that environment.

Anonymity is an interesting one because it's a double-edged sword. For many, many years now, you've seen people have various opinions on how to reform the internet, and ending anonymity is one that comes up constantly. You might recall that one of Facebook's differentiators was the idea that you were speaking to someone who was a validated, true-named account. That was actually what it set out to be. It turned out that people manipulate. In any system, you're going to have some group of people that are trying to manipulate it. But I think the problem of scale might be the real differentiator here: as you try to have larger and larger communities of people, you do see that kind of jockeying for position, as you noted. You do see that desire to rise to the top. Certain voices become more representative.

One platform that I really like a lot, that I think does this very well, is actually Reddit. And what I like about Reddit is it has persistent pseudonymity. So you're not deluged with random anons. You can kind of tell if somebody's been a member of the community for a while. In some communities, actually, if you post and you're brand new, there's a delay. You don't get to just spam the subreddit with 10,000 thoughts. So there's a friction there. There are also the upvote, downvote dynamics, where other members of the community are serving as reinforcement. So somebody who goes and leaves a terrible comment is going to be downvoted into oblivion. It'll eventually be kind of hidden from view as a non-constructive comment. You have moderation that's set by the moderators. So there are top-level platform rules that say, like, "There are certain things that we are going to ban; here they are," a very generic list up at the top. But then after that, you actually have very, very detailed rules, rules that shape the norms of the community, that say, "This is our community for posting pictures of cats, and if you post a picture of a dog, we're going to kick you out." And that's just it. That's the social contract that you enter into when you become a member of this community. And if you decide to be an asshole and post a dog and they kick you out, you can scream censorship if you want to, but that doesn't actually matter. You violated a norm that a community has set. This is the space that it wants to create. It's very clearly articulated. You know what you're getting into when you join.
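
To make those two frictions concrete, here is a minimal sketch in Python of score-based comment hiding and a new-account posting delay. The thresholds are hypothetical illustrations, not Reddit's actual values.

```python
from datetime import datetime, timedelta

HIDE_SCORE = -5                       # hide comments voted this far below zero (hypothetical)
MIN_ACCOUNT_AGE = timedelta(days=7)   # delay before new members may post (hypothetical)

def comment_is_hidden(upvotes: int, downvotes: int) -> bool:
    """A heavily downvoted comment is collapsed from view."""
    return (upvotes - downvotes) <= HIDE_SCORE

def may_post(account_created: datetime, now: datetime) -> bool:
    """Brand-new accounts must wait before posting to the community."""
    return (now - account_created) >= MIN_ACCOUNT_AGE

# Usage:
print(comment_is_hidden(upvotes=2, downvotes=11))             # True: collapsed
print(may_post(datetime(2024, 3, 1), datetime(2024, 3, 2)))   # False: account too new
```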

Some Facebook groups are like this also. You join the group, it posts a whole list of rules. I don't know if you've engaged in Facebook groups. The mom groups are a whole thing. They have a whole list of rules, and you accept the rules as a condition of entry. And then these volunteer moderators, who are also members of the community, are in there and will enforce the rules. It is not the kind of thing that you can do at mass scale, but when you have a few dozen, a few thousand people, you can create that kind of environment. And it's not necessarily anonymity that's the problem, because Reddit is very pseudonymous. It is a question of, "We have all chosen to opt into this rule set. We've come to an agreement that this is how we are going to engage in this space."

Aaron Powell: How is that mechanism different, other than maybe the size of the community, from, say, Twitter or Facebook? Because what you're describing is also how those places operate. There are moderators in the sense that there's Meta, or there were Twitter's trust and safety teams and their content moderation policies. And if you violated the community norms, you wouldn't get engagement or you might get deplatformed. And that happening is what has caused the current conflagration about social media. We're recording this a few days after the Supreme Court heard oral arguments in, I think it was, the NetChoice case, about whether you can kick conservatives off. And I think about it in the context of—

Renée DiResta: —Well, that's what it was ostensibly about, to be clear.

Aaron Powell: Right. But it seems like a lot of what we're seeing now is people not really upset about the cat community on Reddit kicking out someone who posts a dog picture, but the broader community on Twitter kicking off someone who posts a swastika and saying that's an unfair abridgment. What's the difference there?

Renée DiResta: So, you called it "the community on Twitter." And I guess what I would push back on is the idea that there is a Twitter community. Twitter is an infrastructure for many communities the way Reddit is an infrastructure for many communities. But you might think of Reddit as more of a bordered set of communities, more clearly delineated. You know that you're going into the cat subreddit and this is how the cat people want to do their thing. On Twitter, you are organized into networks, meaning you choose to follow and be followed back. You form your own communities. It's like an open crowd. With closed crowds, you have people who are there forming around a particular religion or a particular group. They're the Kiwanis Club, they're the cat group on Reddit, and there's a sense of, "We are here because we are of this community and we are adhering to these norms."

What you have on Twitter is an open crowd, where you have many different people with many different types of norms, and they all see each other. It is an infrastructure for forming communities, for having conversations, but it only has the top-level mod. On Reddit, you have the top-level mod and then you have the community mods, and the community mods are the ones making the cat rules. On Twitter, you have to try to create one set of rules for all of these different factions that have all assembled on your platform, that are all very tightly networked on your platform, and who are also constantly seeing the people that they hate in their field of view. And in fact, as people hate-read content, the algorithm does not understand that you just hate-read that, so you're going to see more of it.
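
A toy example of why that is: a typical engagement ranker counts dwell time, replies, and shares as positive signals and has no notion of whether the reader enjoyed or despised the post. This is a sketch with invented weights, not any platform's actual scoring.

```python
# Invented weights; the point is only that the signals are sentiment-blind.
def engagement_score(dwell_seconds: float, replies: int, shares: int) -> float:
    return 0.1 * dwell_seconds + 2.0 * replies + 3.0 * shares

# A long, furious read plus one angry quote-reply still raises the score,
# so the ranker serves the reader more of the same.
print(engagement_score(dwell_seconds=45, replies=1, shares=0))  # 6.5
```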

There's tons of social science on this at this point. Chris Bail's book comes to mind, Breaking the Social Media Prism. He's at the Duke Polarization Lab. I don't study polarization. I'm not an expert in polarization. But what they find time and time again is that a person who is on Twitter as, let's say, a liberal, sees these conservatives saying these things and feels very angry about it, and feels that it is her responsibility to respond, to correct the record, to tell them that they're full of shit, to go and fight back. Because Twitter is not the public square—Twitter is the gladiatorial arena. We are there to do a thing. The incentives of that platform are not to have nice, closed-crowd conversations about cats. It's actually to say, "Your norm is censoring my expression." The conversation around trans people is, I think, possibly one of the clearest examples: the sorts of norms that the platform tried to set about how you speak to people, what name you use, what pronouns you use, the rules that they tried to put in place, were in response to legitimate harassment in which individual people were targeted. And so they try to create these rules, but then you can actually, it turns out, rile up your faction more by arguing that the rules are foundationally illegitimate, that they are biased against you. And then your faction feels aggrieved, whatever the rule is.

You see this on Bluesky too. Whatever the rule is, your faction is now going to be unified in its hatred for that rule. A lot of it is actually ref-working: when you have that one central, top-down moderation regime, if you will, you are working the refs to try to make it the most favorable for you and your group. And that's because the thing that we haven't really touched on here is that we're not just talking about cats, right? We've got world leaders on the platform. We've got people in pursuit of political power. We have people in pursuit of massive wealth. And so the ability to be algorithmically rewarded versus throttled translates into material wins and losses in terms of either elections or financial "lucre," if you will. And that's why this idea of Twitter as this incredibly high-stakes arena is foundationally different from the way that people behave on Reddit, in the cat subreddit, where you're not constantly out there like, "Well, as a liberal, let me tell you!" You don't have to do that. That's not how you're performing. But on Twitter, you are.

Aaron Powell: I think there's also a further factor that adds to these problems on Twitter and other centralized social media platforms where everyone's thrown into the same space and then builds what you called an open crowd. When you're on Reddit, or—I cut my teeth in the era of web forums—you have this clear indicator that you are in a specific community. You can see I'm in this subreddit. It's labeled as such; that's the community I'm in. The other people I'm talking to are in that community. Or I've gone to this particular URL that is running this forum software and joined this particular forum, and it's this community. On Twitter and similar kinds of platforms, you're all thrown into the same space. And we know that conceptually—we know that there are several hundred million people we've been thrown into that space with—but you're seeing a very narrow slice of it, because you're seeing the people that you chose to follow and interacting with the people who chose to follow you. And so you've constructed what looks like a Reddit cat community in the sense that it's a small network around a shared set of interests, but you don't have the visual indicator that it is a subcommunity. Instead, what you think is, "What I am seeing is Twitter. The conversations I'm having and the norms and the kind of shared epistemology just are Twitter." When you get the context collapse and someone from outside of it pops in, you think, "This person is a fringe outsider," as opposed to, "They're just another community that I have bumped into." And the things that we know to be true in my community must be what everybody knows, because it's just what everybody I see on Twitter is talking about. When people push back on that, it feels less like they're in a different community and more like they're challenging the whole world. I think that contributes to the feeling that "the norms should reflect me, not just because I'm working the refs and I want them to reflect my subcommunity, but because this is just what all of Twitter is." And so of course the platform should support those norms, because it would be nonsense to support something that is not representative of Twitter as a whole.

Renée DiResta: I think you really saw that on Bluesky. I think you were kind of early there, as I recall. I was very early there. I had had a chat with the CEO. I'm at Stanford Internet Observatory, and we're very interested in new and emergent platforms, for several reasons. One: what is the sort of trust and safety framework that they envision for community governance? The second: whenever you have a new entrant into the information space, things kind of reshuffle. New communities form. And I was just interested in what changes. How do narratives move with the new entrant or the new technology?

I remember joining Bluesky when it was very early. And I didn't even know what to post. I feel like I've been online forever, and I got there and I was like, "Alright, people are posting AI-generated art." This was March of 2023, so about a year ago. And they were posting a lot of AI-generated art. And then there were a lot of nudes. I was like, "OK, I'm not really sure how I fit here. I don't really know what to do here." So I kind of lurked. I posted some AI-generated art. In some ways, I liked it, because it felt like a little bit of a reset. I didn't have to post about the stuff I normally talk about. I wasn't there to grow an audience or talk about my work or anything like that. I posted things like, "Here are these crafts I made with my kid." I found these random gardening people. This was before it had formally organized into feeds; I had a garden, and I would post my gardening stuff and have really nice conversations about it. And I felt like it hadn't yet factionalized, if you will. It was small enough that even though I couldn't quite figure out where to enter the conversation a lot of the time, it also felt like people weren't fighting about politics constantly, or fighting about whatever social culture war issue constantly.

But I do remember they had a couple of things where they screwed up some basics. Letting people register slurs in their usernames, things like that, things that were not good. But there were also these moments where you would see that it became a very left-leaning community, because it was an invitation-based network at the time. People invited their friends. And so it was a lot of left-leaning people who invited other leftists, anti-fascists, anarchists, and those communities kind of sprung up. And what was very interesting about it was that, even though the entire premise of the platform, its foundational reason for being, was that it was going to be composable moderation, with different communities eventually running servers and moderating themselves akin to Mastodon, people really didn't seem to actually want that, right? They wanted very specific moderation types and moderation rules. And they wanted them because they felt that Twitter—at this point, Elon owned Twitter—had rolled back some of those things. So in a sense, what they had lost, if you will, in the ref-working fight over on Twitter, you began to see manifest in the ref-working fights over on Bluesky, as people tried to say, "These are the rules that we want to establish for our community on this thing." So you do see, even as the network is forming, even in the early days, a recognition among a lot of users that they want to establish this as a friendly (I hate the word safe) space for their community. In a sense, they tried to turn Bluesky into what Parler and Truth Social were for the right.

And so it's going to be interesting to see, now that it's open to the public and is still moving into this decentralized world, how they're going to handle that. Because the way I see it, between Threads and Mastodon and Bluesky, the trend is towards decentralization. And when you have decentralization, there are actually no refs to work. Eventually there are no refs. Those other communities that you don't like will set their rules for their piece of the platform, and you choose to federate or defederate, which just determines whether or not you see it, but it is still out there. And I think that this is where, unfortunately, the kind of content moderation culture wars on the big centralized platforms are leading more towards decentralization and everybody moving off into their own worlds, as opposed to finding ways to bridge those gaps and say, "Okay, here is how we will create top-level, all-community norms that will achieve maximum happiness for the greatest number of people on the site." And I think in some ways that experiment is maybe being shown to be a loss; it's just not possible. And so this is where it's almost like this collective retreat into smaller spaces.
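
The federate-or-defederate point can be shown in a few lines: blocking an instance only filters its posts out of your server's view; the posts still exist on the other server. A minimal sketch, with made-up instance names:

```python
DEFEDERATED = {"hated.example"}  # instances this server refuses to federate with

posts = [
    {"author": "alice@friendly.example", "text": "garden update"},
    {"author": "bob@hated.example", "text": "inflammatory take"},
]

def visible_posts(posts: list[dict]) -> list[dict]:
    """Drop posts from defederated instances; they are hidden here, not deleted anywhere."""
    return [p for p in posts if p["author"].split("@")[1] not in DEFEDERATED]

for p in visible_posts(posts):
    print(p["author"], "-", p["text"])  # only alice's post appears in this view
```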

Aaron Powell: Yeah, I mean, I am a big fan of decentralization, but the dynamics you just described have been really fascinating to watch play out, because it gets to those conceptual confusions. I think a lot of these problems are human ones. We communicate a lot. We're very used to communicating. It's our jam. But it's only recently that we have had these new platforms, and these platforms introduce a lot of conceptual weirdness to the way that communities function, the way that communities interact, the distribution of what we say. You feel like you're having a private conversation, but it's really a public conversation. And it's not quite the way that you and I could be sitting at a coffee house having a conversation and it's kind of private, but there could be other people listening in. It's that hundreds of millions of people, anyone in the world who wants to listen in on our conversation, can just click on it and do it. And as you were talking about this, it put me in mind of a fight going on right now. Bluesky's opened up, there's Mastodon and the fediverse that Threads will eventually join, and there's a guy building a bridge between Bluesky and Mastodon so that a Bluesky user could follow someone on Mastodon and see their posts within Bluesky. And there's been this huge fight in the Mastodon community because, on the one hand, they want to use this platform where anyone can follow them, but on the other hand, they don't want anyone who's using Bluesky to be able to listen to them, even though the conversations they're having are public. And it feels like a lot of navigating this is a tension in interests. On the one hand, we want our tight-knit community with our norms, where we can have a good conversation and feel like we belong. On the other hand, we want engagement and a large audience and a place to push links to our newsletter and to promote ourselves. We want to be influencers with the highest follower count. And those things aren't really compatible.

Renée DiResta: Well, it's an interesting point. I don't know that we all necessarily want to be influencers. I think that is an interesting dynamic in and of itself. I'm trying to remember the stat I saw—oh, it was TikTok. You know, we have what's called the 90/9/1 problem: 90% of the people are simply lurkers, and 1% create the majority of the content on the platform. I'm trying to remember what the 9% do; I think they're in there contributing sometimes, but they're not trying to be the sort of influential 1%. And I think on TikTok, it was something like 25% who were creating the vast majority of the content, which is actually quite large. It indicates a pretty big creator dynamic. I was actually surprised by how many were creating.

But the question of, do you want to be an influencer who is known by people outside of your faction? There's a very small, I'll use the word elite, group of people who think that way, who run newsletters and things like that. Then there are the people who want to have what you might call local influence. So the way that I've described it is: you have the influencer, then you have the crowd. And these people want local influence within the crowd. They want to be seen as authoritative in their small community, but they don't necessarily want that massive visibility and reach. They're not looking to shape the national conversation. They're not activists on a particular issue. Or if they are activists, they're content to be amplifiers. Many, many people see their role, particularly on Twitter, as boosting their side. They know that they have to put out content. This has sort of been taught to them since maybe the 2016 presidential election—well, the 2015 campaign. They know that they have to do the work to amplify their viewpoint. And so they're there and they're performing that role. They will put in their bios things like "retweeted by Charlie Kirk," or whatever. They have these sort of local icons. And so they're indicating that they are locally important, but they're not necessarily running off to start a Substack and trying to become a conservative or liberal influencer. So I think there is a divide there.

You do see a lot of people who showed up to Threads. And I thought it was interesting because Chris Cox, the chief product officer at Meta, expressed, "We just want a place for sane conversation." "Sane moderation," I think, is how he put it. You just saw these people who found Twitter exhausting, who no longer wanted to be in that crowd dynamic, who just found the whole thing tiresome. And the people who are resistant to Threads, in some ways, are the people who have the massive followings on Twitter, because they don't want to leave that behind. They did a lot of work to amass that, and so they have that large-scale visibility. So you do see some entrenchment, I think, among people who have worked to really build a platform, because they either want to monetize a large audience or they want to be able to influence a large audience, versus what we might call the vast majority of normies, who just want to be in a place where they can have a conversation without feeling like they're about to become the main character of something. And I think that fear of becoming the main character, of saying the wrong thing, the context collapse of having somebody who runs a nutpicking account grab your tweet, screenshot it, retweet it, blast it all over the internet—people are afraid of that. People really don't want that to happen. Creating spaces where you're not going to get massive amplification, where we don't have to have trending algorithms pushing things at us at all times, is something that people are looking for. And I think there is some evidence that places like Mastodon or Bluesky are really thinking about how you have that community feel, that community moderation, without the thing that people are afraid of: the visibility, the fear of having somebody go and pick up their stuff and turn it into fodder for harassment, of having people begin to threaten you, and all of the other things that go along with visibility on the internet. And I don't know what the solution to that is. I did set my Bluesky to private for viewers off Bluesky. I just didn't see the point in leaving it public. If people want to see me publicly, they can see me publicly on Threads, and I feel fine with that, or Mastodon. But you don't need to be public in all places at all times, in my opinion.

Aaron Powell: Yeah, it makes me think of one of the really interesting things that I have noticed. When I was in elementary school, everybody wanted to grow up to be either a sports star or a movie star. Those were the things. For my kids, the thing that all the elementary school and middle school kids talk about is wanting to grow up to be a TikToker or a YouTuber; that's their version of celebrity and their ideal career path. But at the same time, the interactions that they have with each other online seem to be trending away from public platforms and social media. I was surprised that Snapchat had come back, but Snapchat is huge again. Or it's just small group chats in iMessage. They're not using Twitter or its various analogues. They're on TikTok, but it seems like very few of them are actually creating anything or really have a desire to do that, versus just kind of imagining the lifestyle of one of these influencers. So I think you're right. It feels like, I don't need to have these conversations in public anymore. I can just have my small friend group.

But all of the energy in building out these platforms is in building things designed from the beginning to be public. Bluesky is radically public in the sense that you can't make your account private. And even if you turn off visibility outside of Bluesky, the way that Bluesky stores its data, anybody can query your data. There's absolutely no way to have private conversations on it the way you can at least make a private account on Threads or Twitter and so on. So it just seems like all of our technological energy is going into building these things designed to be public. As you said about Bluesky, it was designed to build a certain protocol and was just a reference implementation of that protocol, but everybody who used it fundamentally wanted it to be something else. Is a way out of this to just resolve this tension and say: if you don't want to have public conversations, don't have them on platforms that are public?
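
The "anybody can query your data" point is quite literal: AT Protocol repositories are world-readable, so public posts can be fetched without logging in. A rough sketch using the protocol's listRecords method; the host and handle here are placeholder assumptions, not a vetted recipe:

```python
import requests

HOST = "https://bsky.social"        # a PDS entryway; the right host may vary by account
HANDLE = "someone.bsky.social"      # hypothetical account

# Fetch a user's latest public posts with no authentication at all.
resp = requests.get(
    f"{HOST}/xrpc/com.atproto.repo.listRecords",
    params={"repo": HANDLE, "collection": "app.bsky.feed.post", "limit": 5},
    timeout=10,
)
for record in resp.json().get("records", []):
    print(record["value"].get("text", ""))  # post text, retrieved while logged out
```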

Renée DiResta: You know, I moved into the WhatsApp-groupification kind of trend that happened among a lot of folks who work on controversial topics, or even just don't want constant visibility. You want a place to be wrong, or to have a debate, or to actually learn something, as opposed to broadcast, right? So the move into these various WhatsApp groups of friends and friends of friends I found actually really constructive. I had a bunch that were extremely politically diverse, where I was the only liberal in the group, sometimes the only woman in the group, you know, and it would just be a place to have debates and conversations. And some of those really did collapse as the vitriol and polarization increased on the outside, and it would kind of make its way into the group in terms of what was shared and how we reacted to either the main-character drama or whatever the latest culture war was. Others persisted. It was kind of interesting to me to see which directions things went. I think Venkatesh Rao called it the "cozyweb," the places you could go when you wanted to ask a question or have a debate or, you know … I was always kind of sad that we couldn't do that in public. There's this, like, transitive property of bad people, right? Where you talk to so-and-so, and the mere act of engagement means that it's a liability for both of you in a weird way. Like, "How are you talking to her? Whoa." You know? And there's of course the equivalent on the other side. And so I found it sort of depressing, actually. I like debating. I like engaging. I like arguing. And yet it would just turn into people going through your mentions.

I remember one example. I once had a one-sentence conversation with a man I had never heard of. He replied to me, I liked the tweet, and then all of a sudden it turned into, "Renee liked a tweet by a Nazi." And I'm like, "Who the fuck is that?" I don't know who he is, you know? And I'm not famous, I'm not important in that way, but just this bullshit, this sense of, "Oh, well, she liked that tweet, she follows this person, she reads this Substack." It just turned into such a caustic, nasty environment that, I think, you might be surprised at how many people are actually texting with members of the other side and just don't want to do it in public. And I think that's actually bad. I think that is the worst part of what Twitter created: the sense that you constantly had to be arguing, that that was what it meant to be a good member of a particular identity or political party or what have you. And it's actually terrible. I think the open crowds at this point are terrible.

Aaron Powell: I remember you could download block lists that were basically, "Here's a bad person's account on Twitter," usually some figure in the far right, and you could just auto-block everyone who follows them. And I remember being involved in conversations about that, because these major figures in far-right circles are followed by a lot of researchers and journalists who study or cover the far right. So you're going to block a lot of far-right weirdos, but you're also going to block a lot of people who are actively engaged in combating far-right ideologies. And the response was just, "So what? They shouldn't actually be following these people. They shouldn't be engaging with them. They should be finding other ways to see what they're up to if they need to, rather than giving them the follower counts." And it was deeply weird and, yes, caustic to the way that we should interact with each other and with knowledge, and to paying attention to what's happening in the world.
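
The mechanism those tools used was blunt: expand a seed list of accounts into the set of everyone who follows them, then block the whole set. A sketch of that logic, with a hypothetical stand-in for the followers API, shows why researchers and journalists get swept up:

```python
SEED_ACCOUNTS = {"far_right_figure"}  # hypothetical seed list

def fetch_followers(account: str) -> set[str]:
    """Stand-in for a real followers endpoint; returns made-up data."""
    return {"true_believer_1", "extremism_researcher", "journalist_on_beat"}

def build_blocklist(seeds: set[str]) -> set[str]:
    blocked: set[str] = set()
    for account in seeds:
        blocked |= fetch_followers(account)  # no distinction between fan and researcher
    return blocked

print(build_blocklist(SEED_ACCOUNTS))
# Researchers and journalists who follow the account get blocked too.
```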

It's died down a little bit, but in the last week there have been big fights over Taylor Lorenz's interview of the Libs of TikTok woman, and over the very fact of interviewing her—even though it was an interview that made it very clear that she's just not terribly bright, not thoughtful, just kind of a bundle of grievances and base urges. The very fact of interviewing her was beyond the pale. And it seems like there's this kind of performative non-engagement or performative ignorance that is a part of these cultures: the proper way to respond to bad ideas is to just pretend you don't even hear them.

Renée DiResta: I think it's a very early-2010s attitude. There was this selective amplification argument. There's a really interesting paper—I think it's danah boyd and Joan Donovan, from back in the early 2010s—and one of the questions was how media should cover these figures. At the time, those figures still did not have very massive followings. So there was a really interesting question about platforming, which referred to mainstream media—which was still very much where that center of attention, that center of gravity, was—and how mainstream media should cover these rising niche figures. And they were still niche figures at the time. I think Jack Posobiec had maybe 50,000 followers when Pizzagate started, compared to the two million he has today. And it's a very, very different dynamic.

So, I would say, as influencers rose and that powerful communication system became ascendant, we still in some ways talk about this idea of platforming as if it's still mainstream media elevating a person who otherwise would be an unknown. And that's just not true. Fourteen years later, that is just not true. So there's this legacy mindset that recognizes that in the 2010s, that might have been the way to do it: if you're going to cover it, cover it carefully, with all the different news-literacy ways of thinking about how you profile or cover a controversial figure who has some influence and reach without elevating that influence and reach. But when you've got somebody who's got two million followers, they have the influence and reach. It's done. The ship has sailed, right? And so thinking about it in terms of platforming is completely wrong. You have to be engaging the ideas. You have to be counter-speaking. And my residual frustration with, particularly, the center left is that it continues to believe that that is not necessary. It doesn't see these two systems as equal, which I do. I really do at this point. Maybe I'm wrong about that, but I just wrote a whole book about it. People can critique it. But I think that as the centralized media era is also waning, as things have also decentralized on that front, you have got to be engaging with the content and with the ideas.

I remember in 2018, I wrote this article. I think it was actually called "Freedom of Speech Is Not Freedom of Reach," or whatever Wired called it, but that was where that phrase came from. Aza Raskin and I wrote this article because we were writing about recommender systems and curation. This was in 2018, as this kind of grievance was emerging that moderation was inherently "censorship" or "biased." How could you think about the best path forward, when the assumption was that you would simply take something down and then the idea would go away? Even in 2018, it was very, very clear that (a) that was wrong and (b) that was not actually even possible.

So then we started to talk about, how does curation figure into this? How do you have the Nazis or the anti-vaxxers on a platform and not amplify them? Do you choose to have both? Maybe you have the anti-vaxxers but not the Nazis. And you see the platforms struggling with these same questions. You begin to see explicitly hateful ideologies come down. Okay, but then you have these kinds of conspiracy-theory groups, and then there begins to be this big debate about what to do about them. And I think in that regard, there was this mindset that because there were, like, three centralized platforms, if you took an account down, it would go away and the ideas would go away. And that is just not what happened. What actually happened is a lot of it went to Telegram and got much more extreme. And then it came back. And this is where I think that our understanding of "amplification" and "platforming" is rooted in an ecosystem that no longer exists, or that, at a minimum, is very significantly waning. We need to be thinking much more about countering those ideas and how to do that ethically—but also quite clearly.

Aaron Powell: So we are moving to this decentralized world where you can pick your particular server with your particular rules, or you can compose your own moderation system on Bluesky, but you can interact with anyone in the broader ecosystem of these servers. There might be big players: if/when Threads enters the Mastodon fediverse, it will unquestionably be the biggest player in it. But if you don't like what Threads is doing, you can just move your account to a different server without losing all your followers. So, as you said, it feels like a lot of these arguments we're having now are based on a technology stack and ecosystem that still exists but is clearly waning. Looking ahead, then, assuming that we are accurately describing the new emerging social media landscape of this decentralized setup, what lessons should we be taking from the last 10, 15 years of our social media experience in order to try to make this new one better?

Renée DiResta: That's a great question. I think the initial network establishment is a really interesting question. Bluesky didn't have a way for you to import your old follower graph. Mastodon kind of tried to; there was an effort in November of 2022 or so where you could use these tools, and they were semi-reliable. But there's that question of how you think about what people see. And I would say recommendations are the thing that needs to be significantly rethought. Recommendations and trending. Those are the two things I always come back to.

Content moderation is an end-stage process. You are already assuming that something has been created and it is somehow bad or wrong, and we are dealing with the problems at the end stage, as opposed to asking: are there ways to design systems better earlier, upstream? And I think groups like New Public are doing this. Ethan Zuckerman is a big person in this space. Eli Pariser is at New Public. And the question is, if you design it from the ground up, what do you want to see? And there are some real questions around how you decide who to recommend to someone. Are there better ways to think about what the incentive is?

There's kind of a flywheel effect with a lot of this. Even in the very, very early days of Twitter, when Twitter began to create suggested follower lists, what those lists do is essentially reinforce the flywheel. You have some reach, some power, some follower count, and you get more, because the platform thinks, "OK, this is a person who I should suggest, because this is a person who a lot of people follow." And then that becomes a self-fulfilling, self-reinforcing situation. And that's not necessarily the best way to do it.
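
That rich-get-richer dynamic is easy to simulate. In this sketch, each new user follows an account chosen in proportion to its current follower count, so a tiny initial edge compounds. The starting counts and step count are arbitrary.

```python
import random

def simulate(seed: int) -> dict[str, int]:
    rng = random.Random(seed)
    followers = {"a": 1, "b": 1, "c": 2}  # "c" starts with one extra follower
    for _ in range(10_000):              # each step, one new user follows someone
        accounts = list(followers)
        weights = [followers[acct] for acct in accounts]  # popularity-weighted suggestion
        followers[rng.choices(accounts, weights=weights)[0]] += 1
    return followers

# "c" typically ends up with around half of all followers from a one-follower
# head start; rerun with other seeds to watch the early lead get locked in.
print(simulate(seed=1))
```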

You remember Klout? I feel like we're probably about the same age, right? OK. So Klout, for those listening who don't know, sort of sat on top of Twitter, and you received Klout scores for how influential you were or how much of an expert you were. Not writ large; it actually recognized that expertise was local. Maybe you were an expert in dinosaurs, and you could have clout in dinosaurs. And so, if somebody wanted to find the dinosaur guy, there was a way to do it. And people made fun of it. I definitely made fun of it. But it was an idea before its time, I think, in that—you might remember during COVID, you saw Twitter actually scrambling to find doctors to give blue checks to, to say, "Okay, who are these randos talking about this disease, and how many of them have any medical background whatsoever? Maybe we should be blue-checking and elevating the front-line physicians." So it kind of scrambled to go and find them and do that. But even something as basic as that—recommender systems that are a little more local, that say, "These are the topics I'm interested in," instead of just showing you the people with the largest follower count, who have built that follower count maybe by being sensational or being grievance mongers or what have you—maybe there are different ways to surface that, to establish those networks.
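
A Klout-style "local expertise" ranking can be sketched by scoring influence per topic rather than by global follower count, so the dinosaur guy surfaces for dinosaur queries even next to a much bigger account. The data here is invented for illustration.

```python
# (author, topic, engagement_count) tuples; all values are made up.
engagements = [
    ("dino_dave", "dinosaurs", 400),
    ("dino_dave", "politics", 3),
    ("big_celebrity", "dinosaurs", 20),
    ("big_celebrity", "politics", 9000),
]

def local_influence(topic: str) -> list[tuple[str, int]]:
    """Rank authors by engagement within one topic, ignoring global popularity."""
    scores: dict[str, int] = {}
    for author, t, count in engagements:
        if t == topic:
            scores[author] = scores.get(author, 0) + count
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(local_influence("dinosaurs"))  # dino_dave outranks the celebrity here
```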

And then I think, on the curation front, there really has to be much more control in the hands of users. I think there are a lot of people who, having never actually played with the levers of what a curation system can do, think that they want reverse chronological. They don't necessarily recognize that that creates its own incentives, which is for people to be very, very frequent posters. So in a lot of ways, it becomes very spammy quite quickly. But if you let people have tools where they can kind of push one thing up and push another thing down—"I want to see more Black women in my feed. OK, here we go."—that changes things. This is what Ethan Zuckerman's Gobo tool actually did. I don't know if it was as granular as Black women; I feel like definitely women was in there. But you could play with these different levers and see, "Oh, look at how my feed changes when the algorithmic curation changes." And that, even beyond the ability to shape the feed, gives people some visibility into, "Oh, this is why I'm seeing so much of this stuff and not this other stuff." It helps people realize, even in that moment: this is the power that the algorithms have over what is being pushed into my field of view; what I am reacting to, how I am feeling, is in part based on this kind of stuff. And here is where I have both more control and, at a minimum, can understand a little more about how the system works.
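
A Gobo-style lever can be as simple as multiplying a post's base relevance score by a per-topic weight the user controls. A minimal sketch, with invented topics and weights:

```python
posts = [
    {"text": "gardening tips", "topics": {"gardening"}, "base": 0.5},
    {"text": "culture war flare-up", "topics": {"politics"}, "base": 0.9},
]

user_levers = {"gardening": 2.0, "politics": 0.2}  # sliders the user pushes up or down

def rank(posts: list[dict], levers: dict[str, float]) -> list[dict]:
    """Order the feed by base relevance scaled by the user's topic weights."""
    def score(p: dict) -> float:
        weight = max((levers.get(t, 1.0) for t in p["topics"]), default=1.0)
        return p["base"] * weight
    return sorted(posts, key=score, reverse=True)

for p in rank(posts, user_levers):
    print(p["text"])  # gardening now outranks the louder political post
```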

I remember in the olden days, like 2018 on Twitter, having conversations with people—this was when the "I'm shadowbanned" narrative really began to take off—and I would say, "Why do you think you're shadowbanned?" These were accounts with, like, a hundred followers. These were not the kinds of accounts that would even have risen to the level of the platform being aware of their existence. And they would say things to me like, "My friends don't see all of my posts. The platform is shadowbanning me." They foundationally did not understand how a curated feed actually worked, and so they felt that it was some sort of reflection of the platform not liking their posts or their content. That made them feel very aggrieved, which really opened the door to this persistent belief that the algorithms are out to get you.
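
That misunderstanding comes down to a ranking cutoff: a curated feed scores far more candidate posts than it has slots, so a small account's post can simply miss the cut with no targeting involved. A sketch with invented scores:

```python
FEED_SLOTS = 3  # the feed shows fewer posts than the candidate pool contains

candidates = [  # (post, relevance score); scores are made up
    ("viral thread", 0.95),
    ("friend's brunch photo", 0.40),
    ("your post, 100 followers", 0.30),
    ("news story", 0.80),
]

# Keep only the highest-scoring posts; everything else silently drops out.
feed = sorted(candidates, key=lambda c: -c[1])[:FEED_SLOTS]
print([name for name, _ in feed])
# Your post misses the cutoff, with no shadowban or targeting involved.
```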

Digital expression is weird. When we move our communities and communications into digital spaces such as social media, the result is an uncertain landscape of new incentives, mechanisms of influence, vectors of information and disinformation, and evolving norms, all of which have profound effects on our personal lives, our culture, and our politics.

Few people have put as much thought into how these platforms function, or dysfunction, as social ecosystems as Renée DiResta, Research Manager at the Stanford Internet Observatory. In today’s conversation, we dig into what makes social media distinct, how communities form and interact online, and what evolving technologies mean for the future of digital expression.

Produced by Landry Ayres. Podcast art by Sergio R. M. Duarte. Music by Kevin MacLeod.