
Technology,
Constitutional Law

Sep. 17, 2025

Charlie Kirk and the cancer of social media

In the wake of Charlie Kirk's assassination, Utah Governor Cox and other lawmakers are spotlighting social media's harmful algorithms. But Section 230 immunity remains a central shield that allows these platforms to avoid accountability, fueling the cycle of political violence.

Douglas E. Mirell

Partner
Nolan Heimann LLP

16000 Ventura Blvd #1200
Encino, CA 91436

Phone: (818) 564-7991

Email: dmirell@nolanheimann.com

UC Davis School of Law

Doug's practice focuses on privacy, defamation, publicity rights, copyright, trademark and First Amendment litigation.



In the wake of the horrific assassination of Charlie Kirk, Utah's Republican Governor Spencer Cox stands out as a voice of reason and sanity. At a Sept. 12 press conference announcing an arrest in Kirk's killing, Cox was quick to identify social media as a "cancer on our society right now."

In a Sept. 14 appearance on NBC's "Meet the Press," Cox expanded on his condemnation: "I believe that social media has played a direct role in every single assassination attempt that we have seen over the last five, six years." He continued, "There is no question in my mind -- 'cancer' probably isn't a strong enough word. What we have done, especially to our kids, it took us decades to realize how evil these algorithms are."

While others have focused on concocting "leftist" ideological rationales for this murder, Utah has attempted to address its social media underpinnings through the state's passage of social media laws, including one that holds social media companies responsible for minors' mental health problems arising from the use of curated algorithms. Though Utah's latest legislative efforts may founder on the shoals of the First Amendment -- in much the same way that its earlier efforts in 2023 and 2024 to mandate social media safety features for minors were paused over constitutional concerns -- the impetus for seeking a legislative remedy should be commended.

Likewise laudable is Tennessee Republican Sen. Marsha Blackburn's introduction of S. 1748, the "Kids Online Safety Act" -- a measure with broad bipartisan support that seeks to implement tools and safeguards to protect users under the age of 17 across a variety of online platforms, including social media, video games, messaging applications, and video streaming services. While that proposed legislation, introduced on Mar. 14, 2025, will likewise face free speech challenges, its proponents may be encouraged by the U.S. Supreme Court's June 27 decision in Free Speech Coalition, Inc. v. Paxton, 145 S.Ct. 2291 (2025), which, by a 6-3 vote, upheld a Texas law requiring certain commercial websites that publish sexually explicit content to verify that visitors are 18 or older.

That said, the Supreme Court seems to have sent mixed signals on Aug. 14 when it turned down a request from NetChoice, a tech industry group that represents social media companies like YouTube and Meta (which owns Facebook and Instagram), to temporarily bar the State of Mississippi from enforcing a state law requiring minors to obtain their parents' consent before they can create social media accounts. In a brief unsigned order, the Court allowed Mississippi to enforce the law against major social media sites while lower court litigation continues. Somewhat paradoxically, however, while agreeing with the decision to leave the law in place, Justice Brett Kavanaugh wrote that "the Mississippi law is likely unconstitutional."

What has thus far gone unmentioned, amidst the cacophony of speculation about the motives of Kirk's killer and the role that social media may have played, is the most significant reason that inflammatory websites and social media platforms have been allowed to proliferate and flourish without any meaningful constraints. That reason is attributable to Congress itself, which, now nearly 30 years ago, chose to protect the infant internet by enacting Section 230 of the Communications Decency Act.

Aptly characterized by Jeff Kosseff as the "26 words that created the internet," Section 230 provides, in relevant part: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In a nutshell, this allows social media and other platforms to claim more-or-less blanket immunity from liability for anything that third-party users choose to post. Indeed, our own California Supreme Court has held that this immunity extends to sites like Yelp, which can refuse with impunity to remove reviews even after a trial court has entered a judgment holding those reviews to be defamatory. See Hassell v. Bird, 5 Cal.5th 522 (2018).

In past years, legislators as ideologically diverse as Sens. Adam Schiff (D-Cal.) and Ted Cruz (R-Tex.) have decried the overbreadth and consequences of this legislative grant of immunity, which finds no counterpart in the non-internet world. Indeed, the U.S. Supreme Court's seminal decision in New York Times Co. v. Sullivan, 376 U.S. 254 (1964), never even hinted at the possibility that the user-generated content at issue in that litigation -- a paid advertisement by civil rights activists -- was not a proper subject of a defamation claim against the newspaper that published the ad.

Thus, repealing Section 230 would have the salutary effect of placing social media platforms on the same legal footing as their print and broadcast counterparts. Of course, doing so would fundamentally alter the ways in which social media outlets operate since, in order to avoid liability, those platforms would have to actually monitor and take down defamatory and privacy-invasive postings in much the same way they are already statutorily compelled to do under the Digital Millennium Copyright Act, 17 U.S.C. Section 512.

Repealing Section 230 would also raise serious questions about the ongoing failure of social media sites to enforce their own terms of service -- such as the "Community Guidelines" promulgated by Meta, YouTube and TikTok, or the "Rules" that X (formerly Twitter) and Reddit post on their sites. Diligent enforcement of those terms of service -- many of which prohibit hateful speech and behavior, threats of harm, harassment, bullying, and misinformation -- could prevent the algorithm-induced evils that Governor Cox so presciently identified. That, at least, would be a tangible outcome that could reduce the likelihood of another abominable assassination and other forms of political violence.
