Content Regulation/Section 230
Information in our society increasingly flows over online platforms like social media and search engines. U.S.-based companies have become dominant in this space, in large part because of the liability shield provided by Section 230 of the Communications Decency Act. That 1996 law allows platforms to host user-generated content and to take good-faith actions to moderate that content without fear of civil liability. But the internet of today bears little resemblance to that of 1996, and Section 230 has become one of the primary targets for internet policy reform.
WBK attorneys advise clients on content moderation issues amid debates over Section 230 reform and the effects of algorithmic content curation, debates that engage all three branches of government as well as regulatory agencies. We served as amicus curiae counsel at the U.S. Supreme Court in two recent matters: Gonzalez v. Google, the first case presenting to the Court the question whether Section 230 applies to the algorithmic recommendations that are crucial to social media operations (the Court dismissed the case without addressing the question); and the consolidated cases in which the tech industry is challenging recently enacted state laws that would prescribe social media content moderation practices. Congress continues to consider proposed legislation regarding Section 230, and numerous cases continue to work their way through the courts.
The impact of content regulation decisions could reach far beyond online platforms, affecting traditional media as well as the telecom networks over which user-generated content travels. In this fluid environment, WBK attorneys help clients understand the key issues and the evolving policy crosscurrents that inform the debate.