
Intro:


[music plays]


Niki: I’m Niki Christoff and welcome to Tech’ed Up.


Today I’m chatting with Bloomberg reporter Emily Birnbaum. She’s in the studio to break down the big Supreme Court case that has the potential to change everything about how the internet operates. I personally attended oral arguments for Gonzalez v. Google, and, spoiler alert: the Justices were in a mood. Especially Justice Thomas, which was a complete surprise.


Nerd alert: we’re digging into Section 230.


Transcript:


Niki: Today in the Tech’ed Up studio, I am delighted to welcome Emily Birnbaum. [Emily: Hello]  Emily, welcome.


Emily: Thanks so much for having me!


Niki: So, you are the only guest who's been in the studio twice.


Emily: That's crazy.


Niki: I know! And amazing! Thank you for coming in and taking the time. You are a reporter.  You're busy. 


You're at Bloomberg covering tech policy, and listen, we're gonna talk today about an incredibly important Supreme Court case. Actually, there are two Supreme Court cases. We’ll touch on one cuz it's a little smaller, but you have thoughts- Gonzalez v. Google. People listening, we have a very heavy DC audience, a lot of lawyers, but they might hear Section 230 and content moderation and just start, like, zoning out. Like the teacher from Peanuts. So, we wanna dig into it in an interesting way.


Emily: Yeah. I actually try not to say the words content moderation, [Niki: laughs] because I think that people in the tech world feel so certain that it's a widely understood phrase, and I don't think it is. I think it's way more exact to say online speech. This has been one of my, like, big missions.


Niki: I love this! It's rebranding. [Emily: Yeah]


Okay. We're gonna talk about this, but beforehand. So, you're on Twitter. [Emily: Sure] You're active on Twitter. [Emily: Yeah] And you had a tweet that I thought was completely hilarious- this has nothing to do with tech, by the way! [Emily: Yeah] We're gonna do it anyway!


Why are so many women in their twenties coupled up, while so many men in their twenties are single?


Emily: So many single men!


Niki: Why?!


Emily: It's a crisis. It's a crisis of masculinity. This is one of my favorite articles I've read in a really long time. Basically, the results of this poll show that, y’know, over 60% of women or something say they're coupled up, and way fewer men say that they're in a relationship, and the article is just interrogating it; they cannot figure it out. They say, “Why? How could it be that young women are coupled up and men are not?” and [Niki: [chuckling] It ends with seasoned researchers.] Seasoned researchers are appalled; they're upset. They can't figure this out. [Niki: baffled] Baffled!


Niki: And, and tell them what the answer is, Emily, as a public service announcement. Let 'em know.


Emily: Yeah. I'm so glad that you've given me the platform. They're dating each other. [Niki: laughs] They, they mention it offhandedly in the article. They're like, “Yeah. A lot of, like, Gen Z identifies as bisexual, so maybe that's part of it. Or maybe women are dating older men a lot.” [Niki: laughs]


Niki: Women are dating each other!


Emily: Women are dating each other. This is a new world. [Niki: it’s math!] I'm, I'm happy to be here.


Niki: [chuckling] Welcome. [Emily: Thanks]


[both chuckling]


Niki: Welcome, Emily. Thank you for clarifying. [Emily: laughs] I saw your tweet and I was like, “Wow, seasoned researchers cannot figure out what's going on, and it's just math.”


Emily: I would, I feel bad for the seasoned researchers. [Niki: [chuckling]: It was amazing]  The world changes so fast!


Niki: We're putting your Twitter link in the show notes. I always sound like an influencer when I say that.


Emily: That's nice. Yeah, I think it's nice. I mean, ostensibly, part of my job is to tweet things that people see. So…


Niki: Yeah, it was a good one, it was a good one. Okay. Now, we're gonna dig into tech. So, let's talk Section 230.


I'm gonna give you the floor to explain it a little bit to people, so they understand kind of where it comes from and why it's so important.


Emily: Yeah. Okay. Section 230. It's a provision in a 1996 communications law that essentially protects social media platforms from facing lawsuits over the online speech that they're hosting.


So, y’know, you can't sue Twitter for something that your grandma posts on Twitter or, y’know, along those lines. Basically, the idea of it was to incentivize online companies, both to host lots of speech, y’know, encourage free speech online, and there's an element of it that also incentivizes them to do what they want with that speech.


Y’know, you can take it down, you can leave it up;  it's up to you. And that was really important in the early internet when there were like forums and people were just starting to sort through how you apply laws to this new [in funny voice] “webspace.” And this law has increasingly become the subject of more and more controversy as we have watched the internet become a central part of everyone's daily lives.

And with that comes hate speech, misinformation, allegations that the social media companies are censoring us. So, all of this, like, anxiety and tension around the big tech companies, has coalesced around Section 230.


Niki: So, I've been in tech for a hundred million years [Emily: laughs], and one of the things surrounding this issue is a sort of revisionist history. People say, “These are the 26 words,” this one section of the Communications Decency Act- which, by the way, was largely held unconstitutional; this clause was severed from it [Emily: Yeah] and has repeatedly been upheld by courts. So, there's a lot of precedent. And people say it's the 26 words that created the internet, and I think in some ways that's such a disservice, because the internet was here before this law. [Emily: Right]


So, we'll talk about the case, but there's this sort of narrative of like, “Well, we didn't even have the internet, and Congress wrote this thing, and now we have [chuckling] algorithms.”  And actually, it's what you just said.


There were chat rooms. My stepmom was in one. She had motherboards, floppy disks. [Emily: Yeah] She's, like, with her pot of Folgers doing whatever computer people did. One of the things she was doing was moderating chat rooms.


And the law protects platforms, but it also protects users. [Emily: Yeah] So, that's in there, too. And I think that was intended to protect individuals trying to solve what is, fundamentally, one of the riddles of the internet: like, “How do you organize this information?” You have to have a way to organize it. So, anyways, I think it’s…


Emily: [interrupts] That's so interesting. What was she moderating?


Niki: Literally, we just were like, “Oh, she's, she's in the back room doing computer stuff,” and who knows what she was doing?


I mean, she was doing a lot of art, she was an English teacher, but she was like, this was her hobby. 

[Emily: That's interesting] You know, at the time, she had to kind of build her own computer. I mean, who knows? [Emily: Well] We didn't know cuz we were not paying attention. [Emily: Yeah] We didn't know [chuckling] it was gonna catch on!


Emily: That's crazy!


Niki: But definitely she was part of this world, right? And thinking through it, and as an individual, she would be engaged in chat rooms, and she would be kind of host, hosting and moderating things, which is exactly what this is intended to protect.


Okay, there's, you have to have a platform and somebody has to make some decisions. It protects you from taking down things if you want to take them down and leaving them up if you wanna leave them up. I think that's the crux of it.


Emily: Yeah. The shield and the sword. I love that. That's Ron Wyden's thing. He's really good at catchphrases. Senator. [Niki: Senator Wyden] Yeah. Senator Wyden, who helped to write the law.


Niki: Right! [Emily: Yeah] So, okay, let's talk about this case. So, the reason the case is so important is now that we're all in mass anxiety, which I include myself as having mass anxiety about what's happening on the interwebs, [Emily: chuckles] there's a case that has come all the way to the highest court in the land, Gonzalez v. Google.


Do you wanna quickly go through what the case is about?


Emily: Yeah, so at the center of the case is the victim of a terrorist attack in Paris in 2015. She was the only American killed in that string of attacks, and, basically, her family, with the help of law firms and other people who are, y’know, pushing for legal changes, sued Google, saying they should be held liable for enabling the rise of ISIS. Basically, that the Islamic State has used YouTube to reach and radicalize new people.


YouTube has not only been hosting the videos by ISIS- and this is the most important part- but they've also been recommending them to users proactively, without the users asking for them. So, basically, the question is, “Okay, Section 230. It does protect hosts from liability for what users post. It protects YouTube's right to host those videos. But does it protect YouTube when they are pushing that content towards you?”


Which is a pretty interesting question, even though it's very controversial among, y’know, sort of absolutists who say we should not do anything to pare back this law.


Niki: Right. So, as you've said, the crux of the issue we usually hear is: if you're Yelp, Craigslist, ZipRecruiter, by definition, you have other people posting. That's how the whole app works. Reddit, it's all other people's content. They would disappear if they were held liable for everything everyone said [Emily: Yeah] on their app.


Emily: Yeah. My favorite example- I did a story running down, like, five different online companies and how they would be affected if the Supreme Court handed down a pretty broad ruling on this issue.

And Spotify is the example that I'm most interested in, because if you think about Spotify, like, the reason why it grew so popular is its algorithmically recommended playlists. So basically, they build these daily playlists for you and you're like, “Oh my God, I didn't even know I wanted to hear that song.”


But they also do it with podcasts. And podcasts are very, y’know; they're increasingly a subject of a lot of controversy. You can think about, like, Joe Rogan spreading Covid misinformation. So, if the Supreme Court says, “Yeah, companies are held liable when they recommend certain kinds of content to you,” then Spotify could face this situation where they're being held liable for recommending, for instance, a Joe Rogan podcast.


That's not the best example because we have the First Amendment. And so, probably even if there was a case that was allowed to move forward, it probably wouldn't go that far. But, still, it's this, like, messy set of legal questions.


And as you know, like, if you're a company, you're allergic to liability, like, you just do not wanna be facing a whole tsunami of lawsuits. So, you're gonna proactively try to remove content and stop doing certain things just to avoid the lawsuits, right?


Niki: I, I am- we'll just spend one second on the First Amendment, but, people, okay!


Speaking of things people are baffled about!  [both laugh] Bad faith First Amendment arguments in, like, the last two years is just [Emily: Yeah]. Y’know, whatever Prince Harry says, [chuckling] like, our constitution says the opposite of that.


Emily: Wow.


Niki: Oh, do you not want me to drag Prince Harry on this podcast?


Emily: No! I, no, I, I desperately want you to! [Niki: laughs] Like,  I actually really would like you to just, I, I don't know what he said about the First Amendment?


Niki:  He doesn't understand it, but the main thing is what you just said.


[both laugh]


Emily: We'll do another show on that.


[cross talk]


Niki: [chuckling] Prince Harry's, like, misunderstanding of our constitution. [Emily: Yeah]

There's this bad faith argument that, like, our free speech is being curtailed by the companies. The companies can do whatever they want. Who cannot curtail our free speech is the government. [Emily: Right]


Which is what you just said. When you start to have rulings that have a chilling effect on what people will put out, or where they're not allowed to post things that are legal, [Emily: Yeah] that's when you start to have a good faith First Amendment issue.


Emily: Yeah. I mean, one of the most important things I think about with 230 is that it's basically just, like, an obstacle in the path of a lawsuit. So, if someone wants to bring a lawsuit about particular kinds of content, you can raise a Section 230 objection and get it dismissed, so the lawsuit doesn't move forward. You don't have to spend hundreds of thousands of dollars. Hopefully, it just gets dismissed and then you move on. But in a world where Section 230 is pared back in some way, or even overhauled altogether, then those lawsuits move forward.


But there's actually a whole other set of questions about liability. Like, “Okay, could Facebook be held liable for allowing people to post Covid misinformation?” Again, just going back to that, probably not. I mean, that's free speech. But, a more interesting question is, like, “Okay, could Amazon be held liable for, y’know, pushing products that are faulty or misrepresented?” Like, that's an actual area of the law.

So, I might be getting too in the weeds, but


Niki: No! I think it's a good point. So, just to recap that in a nutshell: there are things that are illegal.

[Emily: Yeah. Right] It's why in the early days of my time at Google, we, we took down anything that was a copyright infringement. I mean, dealing with that was the biggest part of our legal team's work for a while, because it was not legal to host copyright-infringing content. [Emily: Yeah]


Then, we had a conversation, and I will say I was totally wrong. We had a conversation about taking down revenge porn. So, non-consensual photos, mostly of women, sexual photos used to essentially ruin their lives, posting them online. Not illegal!  Starting to become illegal now, but at the time, 15, 16, 17 years ago, that was not illegal content.


Emily: Wow.


Niki: So, we had an internal debate. Do we take it down? And there was this big group of attorneys saying, “If we start to take things down that are not illegal, that Congress hasn't made illegal, where does it end? Because then we're deciding.” And I was in the camp of “take it down! This sucks! It's crummy!” And I now think it was a mistake.


Emily: Whoa.


Niki: Because fast-forward, you've got private companies trying to decide what should be allowed and what shouldn't be allowed. And it's much easier to say, “We're just taking down things that Congress or state government says we can't put up.”


Emily: Yeah, I'm so interested in that, actually. Like, during your time at Google and, like, across the tech industry, like, how often did you think about Section 230? Or, like, how often did it come up in your work?


Niki:  It came down to, we used to call it, “takedowns.” [Emily: Okay.] Right. [Emily: Okay.] So, we would say takedowns. “Okay, we got a takedown request for this thing that's an image. What do we do with it?”


And, when I was there- early days- anything went. We had a product called Blogger. You could say absolutely anything on Blogger. It was like Tumblr. You could just do absolutely anything. But as we became bigger and got more scrutiny, y’know, we, we really tried to hold ourselves to a value system of doing the right thing.


So, there's content that violates laws, and then there's content that's just icky, unhelpful, gross. And we would have these debates, and it just evolved over time, because the internet got bigger, and bigger, and bigger, and the things people became uncomfortable with were novel and new and showing up more. What do we do?


So, we would have huge debates. And it's country by country. So, in Thailand, [Emily: Yeah] you cannot post criticisms of the monarch. They come down, because it's illegal there. In Germany, they have very strong laws on Nazi content. They come down. But that content stays up in the US, where we don't have laws about that. So, [Emily: Yeah] it was this constant navigating of, y’know, 70 countries and everybody's jurisdictions, and we really tried to stick with what the government said was illegal.


Now, one thing I'll say about Google is Google has the resources to do that. [Emily: Right] If you're a smaller company, you don't. So, I don't even remember talking about Section 230, per se. We'd just say takedowns: what's our takedown policy? [Emily: Yeah]


And it evolved over time and then it created this world where you have folks on the right, especially, saying, “You guys are censoring me by taking things down.” Which again, they don't have a right to have their speech on a private platform, but it's just created an unsolvable riddle for the platforms. Like, there's no right answer.


Emily: Yeah, that's interesting. So, like, cuz it's, like, when you were there, Section 230 made up the baseline of all of the conversations.


So, you could have conversations about takedowns or keeping up or, like, whatever, because Section 230 was this, like, protective shield from those decisions.


Niki: Right. And we barely had social media, so we didn't really have-


Emily: Oh, wow!  That's, yeah. Yeah, yeah, yeah. Of course.


Niki: We, of course, created- [chuckling] Google briefly had a social media product that went nowhere. YouTube- we purchased YouTube when I was there, so it was in its infancy. So, we didn't really have the issues that we have now. But over the years- I was there for eight years- around the time I left, we were dealing specifically with ISIS, which will bring us back to this case.


Emily: Oh, wow.


Niki: Yeah, which was, okay, we take down ISIS videos, but what if the video is a BBC recording of a journalist being beheaded or held hostage? [Emily: Yeah] Like, that's news. [Emily: Right] So, how do you tell the difference?


The, the machine learning and the image recognition cannot tell the difference between an actual ISIS video and BBC reporting on it. And that was when there was quite a lot of news around it, and really, there was no way! You had to have humans make that decision. And the amount of content is huge. [Emily: Yeah]


So anyway, it just evolved over time and then we've ended up in this, like, mess where we're, you have nine Supreme Court Justices looking back and trying to say, “Well, what did Congress mean and what do we do from a policy perspective?”


So, let's talk about the case!


Emily: Yeah, yeah, yeah, yeah! Well, I'd love to hear your experience on the day of cuz you were actually in the courtroom.


My editors decided I was useless as soon as I set foot in the courtroom, because you can't have electronics, and we live in an age where we do blasts based on things Supreme Court Justices say


[chuckles].


So I couldn't do that, but I hear it was crazy in the room, so.  [Niki: it was, really] Like, what was the vibe?


Niki: Yeah. Okay! The vibe. So, so for people who don't live in DC, how the Supreme Court works is: all the oral arguments are broadcast publicly. Anyone can listen and really the only reason to go in person is to see the demeanor of the Justices.


Emily: Right.


Niki: So, the courtroom itself, I think this is interesting, is really, really small. It's incredibly small, and there's a space directly behind counsel where lawyers admitted to the Supreme Court Bar, which I am, can sit. And you have to line up. And if it's a really interesting case, it's seat-limited, so you have to get there, get in line, and hope you get a seat, and then you can sit through the arguments. Then there's a space for the public, but it's also pretty small. And then a space for the press. And then the lawyers.


And when I tell you, like, I could see the whites of Chief Justice Roberts's eyes [Emily: Woah]- like, you're so close to the Justices, it's really cool. But you can't take anything in, and it takes forever to get through security. So, you lock everything up; you can only bring a writing utensil and a pad of paper, and nothing else. No watches, no phones. So, to your point, you can't do your job during that.


Emily: I know. In a, in a different era, y’know, the Supreme Court reporters, like, go in, feel the vibe, leave. They can write their story to run tomorrow. It's just a different time.


But what was your read on their, like, body language, the expressions on their faces? Cuz, listening to the audio, they sounded, like, baffled during the argument. I think every single one of them at some point said, “Wait, I'm confused by what you're saying.” And they in particular, I think, responded pretty poorly to the Gonzalez family's lawyer, Eric Schnapper.


But I, I'm so curious what it was like, in person.


Niki: Yeah. So, I want you to talk about the lawyers, cuz I think you know more about their background. [Emily: Yeah] I just went because, like, if you live in Washington and it's a cool case for a company you worked for, you should go! [Emily: Yeah]


One of my friends, Kate Sheerin, this is her second Supreme Court case and she works at Google, so I wanted to go see her. She was wearing these amazing Proenza Schouler combat boots. We love it!


Emily: Oh, that's epic.


Niki: The Supreme Court needs more of that.


Okay. So, you're standing in this long line. I would say one in ten people were women. It skewed much older because a lot of these people are policy experts who've been working on this since the nineties. 


[Emily: Yeah]


Several people- the two gentlemen I was standing next to- are retired and have been working in this space, but they were just so interested in what was happening. And so, the vibe in the line, the lawyers' line, as we're all waiting at dawn to go in, was, they said, “Justice Thomas is so mad about taking down conservative speech. He's already said Section 230 is too broad. He's, he's definitely gonna be a problem for us.”


Y’know, one of the guys said, “You can't [predict]. It's a black box. Who knows how they're gonna be?” [Emily: mm-hmm] We had conversations about, “It's a political court now,” all these things, so we're chit-chatting.


Then we get in. We sit down. Justice Thomas, out of the gate, first question! First question was sooo in favor of Google. [Emily: Yeah]  He just said, “This algorithm sounds like it applies neutrally to,” he said, “rice pilaf,” which was a little weird.


Emily: I got lost on that one.


Niki: It was a little weird, but he's like, “Say you're interested in cooking.” Anyway, who knows? It wasn't that clear. But what he was saying is, “The algorithm is the algorithm. [Emily: Yeah] And if it's not discriminating, I don't know what the issue here is. It's just a tool to sort content.”


And that shocked the, the lawyer sitting next to me because he told me, “We know how Thomas is gonna rule.” [Emily: Yep!]  Wrong!


I did not think the lawyers were confused or the Justices were confused. I think they absolutely understand what Section 230 does. They absolutely understand [chuckling] recommendation engines and what they were confused by was what the plaintiff was trying to argue because the argument was soo painfully weak. [Emily: Yeah]


So the demeanor was, you had Justice Jackson with her hands on her forehead, leaning forward trying to get the plaintiff's attorney to argue [chuckling] what was in the supporting briefs.


Emily: She had her hands on her forehead? [Niki: Oh, yes!]  Wow! That's vivid. Oh my God.


Niki: Yeah. She was, like, “Come on, you can do it!”


Justices Kavanaugh and Kagan sit next to each other, and, when you've got those two Justices saying the same thing, it's notable. They both said, like, “This is for Congress to decide. We have a statute. It's really, really clear.”


There are politics around this- they have different politics- but they both were saying, “This needs to go back to Congress.” [Emily: Yeah] Which I thought was interesting.


Justice Alito seemed irked [laughs], and I thought he was also someone who would be likely to rule against Google, but I don't think that now. But anyway, the demeanors are so interesting to watch. And then, Justice Gorsuch was on the phone, so we couldn't tell.


Emily: Right. Yeah, yeah, yeah. That's so interesting! I, I mean, the arguments didn't go the way I thought they would go either. I mean, the Supreme Court was under an immense amount of pressure to weigh in on Section 230.


Like, there haven't been splits in the lower courts, which is usually what leads to the Supreme Court taking up a case. But there have been various federal judges who have questioned the contours of Section 230, said it's been interpreted in an overly broad way, y’know. And also, just, like, politically, there's so much conversation about 230 and what to do about it.


So, it seemed like they took up this case because they wanted to say something about 230, and probably something that, y’know, the tech industry wouldn't like, because Clarence Thomas had said previously, “Oh, maybe we should treat social media as public utilities,” which is, like, a nightmare for the social media companies.


But they were really skeptical. And, y’know, I worked on this coverage with our Supreme Court reporter, who's been doing it for 25 years. His name is Greg Stohr and he's amazing. And I was just trying to ask, like, “How typical was this dynamic, where they were frustrated by the plaintiff's attorney? They, they really seemed like they wanted to almost help him sharpen his argument.” [laughs]


Niki: I wanted to help him! I was sitting two feet behind him. I wanted to tap him out and be like, “I will help you with this.” [Emily: Yes] But talk about the dynamics of the attorneys a little bit.


Emily: Yeah. So, Google, we'll start there, has Lisa Blatt, who is known as just one of the favorite lawyers that argues before the Supreme Court. Like, the Supreme Court Justices have a rapport with her. They know her. She has argued before the Supreme Court more than any other woman alive. She's kind of a wild card, like, she's known to, like, say crazy stuff. [Niki: She's funny] Like, y’know, piss people off sometimes. But she's an extremely experienced, well-known, well-respected litigator.


And then, on the other side, representing the Gonzalez family, is Eric Schnapper, who- like, I talked to him, and I thought he was so brilliant. He's clearly, like, a really brilliant thinker. He is best known for his civil rights work. He hasn't argued before the Supreme Court in a long time, but, basically, the reason why the Gonzalez family ended up choosing him is there are only so many people who are allowed to argue before the Supreme Court, and tons of them were conflicted out, because either they've worked for the tech companies directly, or a firm that they've worked for has, like, conflicts.


Like, y’know, it's interesting, because Google's power and influence actually did affect who was able to argue for the plaintiffs in this case. And it kind of came down to Eric Schnapper, who… Um, yeah, I mean, his argument seemed a little weak. He, he referenced stuff that he didn't really talk about a lot in the brief. Like, for instance, he made a big point about thumbnails- basically that, like, thumbnails aren't protected by Section 230 because they're co-created by YouTube and the creator. Y’know, YouTube, like, has some role in making those thumbnails that you click on when, like, y’know, a video's recommended.


Niki: Like a little photo, I mean, there was [Emily: so little photos] so much discussion of thumbnails.


Emily: We just kept talking about thumbnails.


Niki: [laughs] Sorry,  I just cackled. It was so bizarre!


Emily: It, yeah, it was a, it was bizarre.


Niki: I think he was nervous. He doesn't know tech very well, and he's a legend in what he does know, which is civil rights law.


Emily: Yeah. So, I basically, I wrote an article about how this case could impact online advertising, because, y’know, what is a targeted advertisement? It's something that is pushed out to you without you asking for it, through algorithmic recommendation. Y’know, like, that's what a targeted ad is.


And so, if the court hands down a ruling that is really broad, that really would impact Section 230, then targeted advertising could be at risk. Like, you don't wanna be held liable for all of the, y’know, millions of ads on the internet. Basically, this would mainly affect Google and Facebook, cuz they're the two who serve up the most ads.


So, I talked to him about it, and I was asking him like, “Is that an impact that you've thought about with this case? Like, y’know, what do you make of that? That's a huge sweeping implication that would affect, y’know, the heart of the Internet's business model.”


And basically, he was like, “Well, like, I assume, like, Google and Facebook must be, like, filtering out concerning ads. [Niki: chuckles] Y’know, they must have, like, way fewer ads than they do, like, videos and posts, so it must be easier to moderate ads.”


Which- like, y’know, then I talked to a couple of, like, former trust and safety people at various tech companies, people who have worked on filtering out ads, and they say, “No, the same content moderation issues apply to ads as to other content.”


Niki: Exactly. There's tons of ads!  It's really hard to filter ads.


Emily: Yeah!  Right.


Niki: And it's the exact same issue. You're right!


Emily: Anyways, just so, just, like, that was a moment where I was, like, “Oh, like, he is really a lawyer looking at the legal issues and not a tech expert.” And, and ultimately, I just think that, like, made the case more difficult.


Niki: You're right. It's a strategy in this town to conflict people out. I mean, not just in this case, but all the time. Microsoft- I guess they're still kind of in their heyday, low key- but back when they were having a lot of issues, they would just conflict everybody out of every lobbying firm, so you couldn't hire lobbyists.


Emily: Yeah.


Niki: And they wouldn't even do anything with them. They'd just retain them.


Emily: Right, right, exactly. Just have 'em sit around, like, twiddling their thumbs. And yeah, so, like, I think that's important context for understanding the argument. I also think a lot of people, Section 230 experts, just don't think that this is the best vehicle.


Like, even people who wanna reform the law, they, they think this, this case is not very strong because everything on the internet is recommended to you by an algorithm basically at this point.


Like, y’know, like, think about, like, Etsy: they're recommending things to you. Like, Nextdoor. Y’know, like, all of these companies that have spoken out about it. [Niki: Zillow!] Zillow- like, they're showing you stuff proactively because there's too much speech on the internet. Y’know, they have to sort through it and tell you what you might be interested in. So, people say that the argument that recommendation makes you fall outside of Section 230 is just a little bit weak.


Niki: Yeah. Okay. Predictions.


Emily: I hate predicting anything. I think, especially with the Supreme Court, I think sometimes arguments can sound one way and the opinion can come down totally differently.


Maybe once they've sat with it more, you know, read more. But, at this point, it seems like if they end up ruling on it, it will be in a really narrowly tailored way and probably tend towards Google's side. [Niki: Yeah]


Um, there's, like, kind of a sleeper issue that I wanted to talk about, which is that there is another case the Supreme Court heard oral arguments on the same week, the day after, which is Twitter v. Taamneh.


That is a similar case. Basically, it is about whether Twitter should be held liable for failing to take down terrorist content. Y’know, “They knew that the content was on their platform and they didn't do enough to stop it” is the main argument. So, the Supreme Court kind of has an off-ramp from Gonzalez.


They could look at Twitter v. Taamneh, which they're very skeptical of- they're very much on Twitter's side- and say, y’know, “We just don't think social media companies should have liability for hosting terrorist content. We just think that, like, Section 230 protects all this. And anyways, the legal issues here are not very strong.”


Like, the, the allegation is that it violates the Anti-Terrorism Act. We don't have to get into it, but, y’know, they could say, “Well, this doesn't violate the Anti-Terrorism Act, so we don't have to deal with the Section 230 issue in Gonzalez v. Google. We can just kick this back to the lower courts cuz we've decided the issue at the center of it is irrelevant.”


Niki: Yeah. I mean, that's a possibility. They're expected to rule by summertime [Emily: Yeah] on this case. So, we'll know. I think my main, my main observation was I thought, “Oh my gosh, this, because Congress is so stuck, it's so sclerotic, they're not gonna get anything done. This will be a moment where policy could be made by this court, and this court is making policy. It is a political court.”


But one thing that's different- y’know, you look at the Dobbs decision, abortion decisions- that's the Justices interpreting, like, the penumbra of a right that is not explicit. It's not a statute coming from Congress. They have a lot more- I mean, I have a lot of thoughts on that, but not for this podcast! [Emily: chuckles] But in this case, repeatedly they said, like, “Congress was really clear. There's a statute here. Congress can rewrite it. They need to look at the policy externalities.”


So, I think Google wins. I think Justice Jackson has a dissent where she says, “This is only intended to protect you if you choose to take things down.”


Emily: Yeah, yeah. Right. She had a lot of sympathy towards this idea that, that in order to maintain Section 230 privileges, essentially, you have to be taking bad stuff down. That's the, like, core of the argument, this, like, good faith kind of argument and that was, like, one of the more interesting parts of the case.


Niki: Yeah, I thought it was interesting. [Emily: Yeah] So I think she likely dissents if they rule.


And then the last thing I'll say is, Justice Gorsuch, who, again, was calling in- he wasn't feeling well- so you couldn't totally tell, brought up AI. And we actually have a guest coming on in a couple of weeks who's an expert, a policy expert, in AI.


What happens when it's not a human creating the content?


Emily: Yeah. Right?!


Niki: [chuckling] Which only started happening a couple of weeks ago!


Emily: I know! I was watching a lot of real tech people argue about this issue on Twitter and then also on a separate panel. Basically, I guess it's insulting to imply- cuz some people say, “Oh well, that would still be protected by 230, cuz basically, like, chatbots are just assembling speech on the internet and then regurgitating it.” And apparently, that's insulting to, like, the [silly voice] “innovators and creators” of AI, [Niki: chuckles] because they say they're doing something totally different than that. They're not just, like, spitting stuff out. [Niki: Whoa!] So, right, it's a live issue, is what I gleaned. [chuckles]

Niki: It's a live issue! I thought it was amazing that Justice Gorsuch brought it up. [Emily: Yeah] Because clearly, he's thinking about it. Which goes to your point- people were, like, “The Justices sounded confused,” and one of the things Justice Kagan said was, like, “Well, we're not experts on the internet.”


I did not think they were confused. I think they were frustrated with the way the arguments were going. [Emily: Yeah] They spent three hours- I think it was supposed to be 70 minutes- so it went forever. [Emily: Yeah] Well, stay tuned. It's a huge- I'm gonna put a link to your Twitter.


Also, Emily writes great pieces. Check her work out! [Emily: chuckles embarrassedly]


Niki: Thank you again for coming on and being our, our first repeat guest in the studio. I appreciate it.


Emily: Thank you! This is my most favorite place and podcast, so I appreciate it.


Outro:


Niki: Tune in for our next episode with Silicon Valley engineer and entrepreneur Tracy Chou. She’s the CEO of Block Party and we talk about her app and how it can empower consumers and make social media just a little bit less of a hellscape.


As always, thanks for listening and subscribing to the pod.
