The Supreme Court heard oral arguments Tuesday in the first of two cases this week examining the extent of legal protections granted to tech companies whose platforms allow third-party users to post content.
Gonzalez v. Google centers on the recommendation algorithms YouTube uses to promote content. Those algorithms go beyond merely hosting user-uploaded videos: they arrange and surface content so that it is promoted to users in specific ways. YouTube and its parent company, Google, are being sued for allegedly suggesting videos made by ISIS that were used to recruit new members.
Section 230 of the Communications Decency Act shields sites such as YouTube, Google, Facebook, and Twitter from legal claims based on content posted by their users. The justices and the parties struggled to pin down whether presenting content amounts to the platform's own speech, and whether it matters if the content is surfaced through explicit recommendations or through an algorithm.
At times the arguments and issues grew murky, leaving the justices confused.
“We’re a court. We really don’t know about these things. These are not like the nine greatest experts on the internet,” Justice Elena Kagan said, drawing laughter from the room.
Justice Samuel Alito had just told Eric Schnapper, the attorney arguing against Google, that he was “completely lost” by the argument Schnapper was making. Schnapper was discussing the thumbnail images and links to other videos that YouTube displays in search results. The videos themselves, he acknowledged, are created by users; the thumbnails, however, are jointly created by YouTube and the user, which he argued places them outside YouTube’s Section 230 protection.
Later, Lisa Blatt, a Google attorney, dismissed the argument by noting it was not part of the plaintiff’s complaint in the case.
Justice Ketanji Brown Jackson said that she, like Alito, was “thoroughly confused” by the argument, because she believed the relevant issue was Section 230 immunity, not what could trigger liability. Schnapper responded that it comes down to how certain actions and practices are treated under the law.
Schnapper told Jackson that the contention his side was advancing is that many of the things loosely referred to as recommendations fall outside the statute. That question was the central focus of the day’s arguments.
“I guess the question here is, how do you go from a neutral algorithm to aiding and abetting?” Justice Sonia Sotomayor asked at one point.
Schnapper argued that YouTube’s algorithmic selection of videos for display is a form of speech by YouTube, independent of the content of the videos themselves.
Malcolm Stewart, the U.S. Deputy Solicitor General, seemed to take a less restrictive approach. Stewart, among others, raised a hypothetical in which a person walks into a bookstore and asks for a book about Roger Maris. The clerk directs the customer to the table where the book sits; those directions are speech about the book, not the book’s contents.
Sotomayor, however, resisted the notion that the store, or in this instance YouTube and other tech companies, should be held responsible for such speech.
Stewart replied that even if additional lawsuits were filed, they would not succeed unless they alleged that the company’s speech violated some other law. YouTube would not violate antiterrorism laws, for example, if it used neutral algorithms that treat ISIS videos the same way they treat cat videos.
If Google or another company explicitly stated that it recommended a video, however, that statement would be unprotected speech, because it actively promotes the content.
Blatt countered that the term “recommendation” does not even appear on YouTube, and that recommending content would not mean the platform is engaging in unprotected activity.
Blatt explained that search engines take a user’s information into account when presenting results, including their language, search history, and location. Someone searching for “football” in the U.S., she pointed out, would see different results than someone searching in Europe, where the term refers to soccer.
Blatt also argued that it does not matter whether an algorithm is neutral or pro-ISIS, because either way the claim would fall under Section 230’s exception for criminal activity.
“So, if your material support is in collusion with ISIS …” she said.
On Wednesday, the Supreme Court will hear further arguments touching on Section 230 when it takes up Twitter v. Taamneh. That case examines whether Twitter, or any other social media platform, can be held responsible for aiding ISIS by providing it a platform.