Third Circuit Finds TikTok Could Be Liable for Algorithmic Feed
TikTok engages in “expressive activity” when its algorithm curates content. Accordingly, the platform can’t claim Section 230 immunity from liability when that content harms users, the Third U.S. Circuit Court of Appeals ruled Tuesday (docket 22-3061). A three-judge panel remanded
a district court decision dismissing a lawsuit from the mother of a 10-year-old TikTok user who unintentionally hanged herself after watching a “Blackout Challenge” video on the platform. The U.S. District Court for the Eastern District of Pennsylvania had dismissed the case, holding that TikTok was immune under Communications Decency Act Section 230. The Third Circuit reversed in part, vacated in part and remanded the case to the district court.

Judge Patty Schwarz wrote the opinion, citing U.S. Supreme Court findings about “expressive activity” in Moody v. NetChoice (see 2402270072). SCOTUS found that “expressive activity includes presenting a curated compilation of speech originally created by others.” Schwarz noted the court’s holding that a platform’s algorithm reflects “editorial judgments” about compiling third-party speech and amounts to an “expressive product” that the First Amendment protects. This protected speech can also be considered “first-party” speech under Section 230, Schwarz said.

According to the filing, 10-year-old Nylah Anderson viewed the Blackout Challenge on her “For You Page,” which TikTok curates for each individual user. Had Anderson found the challenge through TikTok’s search function, TikTok could have been viewed as more of a “repository of third-party content than an affirmative promoter of such content,” Schwarz wrote. Tawainna Anderson, the child’s mother who filed the suit, claims the company knew about the challenge, allowed users to post videos of themselves participating in it and promoted those videos to children via its algorithm. The company didn’t comment.