Regulating Speech Online: Free Speech Values in Constitutional Frames


Regulating speech online has become a key concern for lawmakers in several countries. But national and supranational regulatory efforts are being met with significant criticism, particularly in transatlantic perspective. Critiques, however, should not fall into the trap of merely relitigating old debates over the permissibility and extent of regulating speech. This Article suggests that the normative balance between speech protection and speech regulation as a constitutional matter has been struck in different ways around the world, and this fundamental balance is unlikely to be upset by new speech mediums. To illustrate, this Article uses a German statute, NetzDG, and its reception in the United States as a case study.

Contemporary U.S. legal discourse on online speech regulation has developed two crucial blind spots. First, in focusing on the domestic understanding of free speech, U.S. legal discourse tightly embraces an outlier position in comparative speech regulation. Second, within First Amendment scholarship, the domestic literature heavily emphasizes the marketplace of ideas, displacing other theories of free speech protection. This emphasis spills over into analyses of online speech. This Article specifically addresses these blind spots and argues that the combined narrative of free speech near-absolutism and the marketplace theory of speech protection makes a fruitful comparative dialogue difficult. It ends by sketching the contours of a normative approach for evaluating regulatory efforts in light of different constitutional frameworks.

Table of Contents

Introduction

I. Online Speech Regulation in Context

A. Europe

B. Germany

1. Legislative History and Enactment of NetzDG

2. Proposed Revisions of NetzDG

3. Federal Ministry of Justice and Consumer Protection (BMJV) Assessment of NetzDG

C. United States

II. Internal and External Sites of Conflict

A. Constitutionality

B. Regulation and Governance

C. Compatibility

III. Toward a Normative View

A. Shared Underlying Concerns

B. First Amendment or Bust?

C. The Slippery Slope

Conclusion


After Twitter deplatformed Donald Trump in January 2021, German Chancellor Angela Merkel reportedly communicated her criticism of the decision along with the suggestion that a law governing online speech akin to a German law, the Netzwerkdurchsetzungsgesetz (“NetzDG”),[2] be passed in the United States.[3] Regulating speech online has become a key concern for lawmakers in many countries. Criticisms of laws or legislative proposals to regulate online speech are plentiful, but a common mistake is merely to reiterate well-trodden critiques of speech regulation, including prohibitions of hate speech.[4] Drawing out the old debate over whether to regulate speech in the first place, however, is unhelpful in designing or assessing new regulatory regimes for online speech and obscures deeper theoretical concerns raised by the nature of online speech.

The normative balance between speech protection and speech regulation[5] as a constitutional matter has been struck in different ways around the world, both on the national and supranational levels. I suggest that this fundamental balance is unlikely to be upset by new speech mediums, such as Twitter, Facebook, YouTube, Instagram, and various other social media sites. Though there may be some contestation and renegotiation at the margins, the entire universe of online speech is unlikely to be governed by a different set of rules than offline speech.[6]

Seeking complementary free speech regimes online and offline is as much a matter of constitutional doctrine as of constitutional culture and historical and political context. National or supranational legal systems that have struck the balance in favor of hate speech regulation, for example, will likely seek such regulation to be mirrored online as well. Thus, NetzDG aims to better enforce Germany’s existing hate speech prohibitions and other criminal code provisions on social media platforms.[7] And any emergent theory of online speech regulation must therefore acknowledge that, for better[8] or worse,[9] it will not likely be the contemporary American understanding of free speech—an outlier in its protection of hate speech and other forms of expression impermissible elsewhere—that will govern online speech around the world. This comparative point, however, is often lost in domestic discussions of online speech regulation where a First Amendment baseline is commonly assumed.[10]

Contemporary U.S. legal discourse on online speech regulation has developed two crucial blind spots. First, in focusing on the domestic understandings of free speech, U.S. discourse tightly embraces an outlier position in comparative speech regulation while remaining largely oblivious to alternative frameworks of constitutional speech protection.[11] Second, within First Amendment scholarship, the domestic literature has “virtually canonized” the marketplace of ideas,[12] and this heavy emphasis spills over into analyses of online speech, casting aside other theories of speech protection.[13] This Article argues that the combined narrative of free speech near-absolutism and the marketplace theory of speech protection makes a fruitful comparative dialogue difficult.

Suggesting a path to a deeper normative understanding that better enables comparative dialogue, this Article puts three distinct bodies of scholarship into conversation with each other: first, U.S. law and technology scholarship concerned with questions of speech regulation and the role of social media platforms in moderating content primarily from a domestic perspective;[14] second, comparative freedom of speech scholarship, which has traditionally been concerned with the design of and values underlying various constitutional frameworks of speech protection; and third, First Amendment scholarship on different theories of justifying free speech protection beyond the marketplace of ideas.

Starting from the premise of largely settled arrangements regarding the scope and limits of speech protection that differ among constitutional regimes—the focus will be on the United States, Europe, and Germany[15]—this Article proceeds in three parts. Part I provides a descriptive-analytical account of supranational and national efforts to regulate online speech, focusing on the German NetzDG as a case study, and their reception in the United States. Whereas European efforts are occurring in a context in which the constitutional balance has been struck in favor of permitting some forms of speech regulation, primarily based on historical justifications, the enacted and proposed laws are received in the United States in a context that historically is extremely speech permissive. This fundamental difference has resulted in deep skepticism toward European regulatory efforts that directly impact U.S. social media companies.[16]

Part II identifies sites of conflict, both within national regimes and across them. It first examines NetzDG within the German constitutional framework. Then, Part II analyzes the doctrinal and theoretical dimensions of online speech regulation in general, and NetzDG in particular, through the lens of Jack Balkin’s distinction between “old school” and “new school” speech regulation.[17] Balkin notes that “[d]uring the early age of the Internet, people imagined that territorial governments would lose much of their power to control speech. . . . It did not turn out precisely that way, in part because nation states developed the techniques of new school speech regulation.”[18] The relevant question, thus, is not whether the constitutional balance itself is struck correctly, but rather what the regulatory regimes look like and how they interact with free speech values. Finally, it puts the German regulatory framework into conversation with the currently predominant American approach that has been widely adopted by the largest platforms as their guiding standard.[19] Although the rhetoric of American-style free speech values remains dominant, platforms are actually moving toward a more aggressive governance approach by imposing their terms of service.[20] Notably, Twitter “permanently suspended” President Trump’s @realDonaldTrump account on January 8, 2021.[21]

Part III sketches the contours of a normative framework for thinking about online speech regulation in comparative constitutional perspective with the aim of aiding both theory building and policy design, refocusing First Amendment theory around democratic self-governance rather than the marketplace of ideas. As a result of this realignment, a shared normative baseline may be identified in transatlantic perspective: because speech protection is fundamentally tied to democracy, regulating online speech should enable democratic self-governance.

This third Part then addresses two prominent objections to online speech regulation. The first objection I will call the “First Amendment or bust”-objection—an argument sounding in American exceptionalism suggesting that only contemporary U.S.-style free speech protection is truly sufficient. The second is a version of the “slippery slope”-objection, suggesting that while European-style speech regulation might be acceptable, nondemocratic regimes are using similar types of regulation to further curtail free speech to nondemocratic ends. These objections are mutually reinforcing because the tenet of content-neutrality as a key doctrinal feature of the First Amendment and the normative emphasis on the marketplace theory lead to a situation in which speech is rendered largely irregulable regardless of context. Taken to its logical end, then, speech regulation cannot distinguish between democratic and nondemocratic contexts and content. Both objections therefore counsel in favor of exploring normative bases for speech protection—and, on the flip side, permissible speech regulation—beyond the marketplace theory.[22] And both objections would benefit from deeper engagement with comparative insights.

As Balkin notes, “[t]he problems of free speech in any era are shaped by the communications technology available for people to use and by the ways that people actually use that technology.”[23] Whether regulating the printing press or the Internet, regulatory frameworks must grapple with technological innovation. At the same time, free speech values are likely more enduring than the lifespan of any single technology to which they are applied. And since “regulators around the world are currently writing laws to change the regulatory landscape for online speech,” a better understanding of the normative values animating these efforts is particularly pressing.[24]

I. Online Speech Regulation in Context

Throughout the twentieth century, constitutional designers have contemplated the proper balance between speech protection and permissible speech regulation.[25] The European postwar consensus as reflected in national constitutions as well as on the supranational level has struck a balance that markedly differs from that in the United States.[26]

Much has been written about both the U.S. domestic[27] and comparative dimensions of hate speech regulation, including comparative studies with a particular focus on the United States and Germany.[28] This larger discussion exploring the appropriate scope of protection and limits placed upon speech provides the backdrop to current developments. In fact, the first-order constitutional questions are perhaps best regarded as largely settled. Current efforts to regulate online speech in accordance with the constitutional balance are occurring within these largely settled frameworks. Throughout this Article, I also consider the constitutional balance to be an expression of the cultural value given to speech and its acceptable limits irrespective of the doctrinal applicability of constitutional speech provisions to online speech.[29]

This Part briefly sketches legislative initiatives and judicial decisions on the European level to provide the supranational context. It then introduces NetzDG and traces its reception in the United States. In so doing, it also seeks to untangle genuine engagement with online speech regulation on the one hand,[30] and general opposition to European-style speech regulation—or suggestions that “the EU does not really even have a concept of protected speech”[31]—on the other.

A. Europe

Freedom of expression is protected under the Charter of Fundamental Rights of the European Union[32] as well as the European Convention on Human Rights (ECHR).[33] These regional instruments may be even more relevant to the question of appropriate limits on free speech than the oft-cited International Covenant on Civil and Political Rights (ICCPR).[34]

Two cases decided by the European Court of Human Rights encapsulate the approach under the ECHR to platform liability for hate speech and defamation, respectively. In Delfi AS v. Estonia, decided in the final instance by the Grand Chamber in 2015, the court held that, consistent with the ECHR, an Estonian website could be held liable for defamatory content in its comments section despite the content’s subsequent removal.[35] In MTE v. Hungary, by contrast, the court held that for the comments deemed defamatory in that case, the ECHR barred liability.[36] Taken together, these cases provide a useful glimpse at the values the ECHR considers to underlie online speech protection and its limits.

The European Union (EU) has been at the forefront of several initiatives protecting online privacy—most notably, the right to be forgotten[37] and the General Data Protection Regulation (GDPR)[38]—and regulating online speech.[39] European efforts to regulate online speech started around 2015.[40] After determining that perpetrators of terrorist attacks in Europe had been aided by access to social media platforms, the EU started targeting online extremism in earnest.[41] Consequently, “European lawmakers warned companies that they would face onerous criminal and civil penalties unless online extremism was eliminated.”[42] In May 2016, the European Commission and several tech companies agreed on a “Code of Conduct on Countering Illegal Hate Speech Online.”[43] In addition, several member states moved ahead with direct regulatory efforts.[44] The European Court of Justice weighed in on the question of removing defamatory statements, found to violate Austrian law, in a case that addressed the geographic scope of the Austrian court’s take down order.[45]

B. Germany

The German Basic Law protects freedom of expression in Article 5(1) subject to a constitutional limitations clause, Article 5(2).[46] Analyzing a constitutional claim under this provision involves a multistep process, culminating in proportionality analysis.[47] The Basic Law offers strong protection of speech, and particularly political speech.[48] In a long line of cases dating back to the early years of the Basic Law, the German Federal Constitutional Court has expounded on the importance of free speech and the limits imposed, including by criminal prohibitions of hate speech, defamation, and incitement to hatred.[49] Indeed, “when viewed comparatively, the German Court’s record in defense of freedom of speech, particularly in recent years, easily rivals that of the world’s most advanced constitutional democracies.”[50]

In 2017, the German federal legislature (Bundestag) adopted NetzDG, a law governing the implementation of existing law on social media platforms.[51] The law’s goal is to enable better enforcement of German criminal code (Strafgesetzbuch, StGB) provisions and other speech regulation in online social media networks such as Facebook, YouTube, and Twitter.[52] The law fully entered into effect on January 1, 2018.

Importantly, the aim of the statute is to enable better enforcement of existing law online.[53] To that end, NetzDG expressly enumerates the criminal code provisions applicable to the online context, the violation of which could result in sanctions.[54] NetzDG moreover creates reporting requirements for platforms that receive more than 100 complaints of unlawful postings per calendar year (section 2),[55] establishes a process for the handling of complaints about unlawful content (section 3),[56] and contains provisions on regulatory fines (section 4).[57]

1. Legislative History and Enactment of NetzDG

The legislative history highlights some of the underlying concerns and reveals the extent to which current debates were already foreshadowed during the drafting process.[58] During the initial parliamentary debate in May 2017, the opposition parties in the Bundestag voiced grave concerns, prompting the government coalition parties (CDU/CSU and SPD) to concede that revisions of the proposal were necessary.[59] Notably, however, critics—including opposition speakers from the Greens and Die Linke—agreed with the premise that speech online must be treated in a manner that reflects speech regulation offline.[60] The points of disagreement primarily concerned the enforcement role of platforms and the lack of procedural safeguards.[61] Expert testimony in committee, however, was more divided, and several experts argued that the law as proposed would be unconstitutional. After revisions in committee, the Bundestag adopted the governing coalition’s proposal in June 2017.[62]

This admittedly truncated discussion shows that from its drafting, NetzDG has been controversial.[63] But the law’s controversial elements are not the same as those critics in the United States initially seized upon.[64] I will return to the question of NetzDG’s constitutionality under the German Basic Law.[65]

2. Proposed Revisions of NetzDG

Efforts to revise NetzDG have resulted in legislative complications.[66] Responding to an attack by a right-wing extremist on a synagogue in the city of Halle in fall 2019—timed to coincide with Yom Kippur and only partially thwarted by the assailant’s inability to gain entry into the building[67]—the federal government contemplated, among other measures, stricter reporting requirements for extremist online content.[68] To that end, and also to implement the EU Audiovisual Media Services Directive,[69] the federal government adopted draft revisions to NetzDG in April 2020.[70] The legislative proposal was introduced at the Bundestag in May 2020 and referred to committee.[71] Meanwhile, the Bundestag adopted the law against right-wing extremism and hate crimes (Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität, GBRH) in June 2020.[72] This law also amends NetzDG. These two concurrent and partially overlapping reform efforts created some uncertainty, not least because the Bundestag’s parliamentary research service has concluded that the GBRH in relevant parts is likely unconstitutional[73] and German President Frank-Walter Steinmeier announced that he would, at least temporarily, withhold his signature pending a review by his office to determine whether the law is “obviously” unconstitutional.[74]

The most problematic provisions concern the new reporting requirements under NetzDG, which create an obligation for platforms to notify federal law enforcement (Bundeskriminalamt, BKA) of certain illegal content that was flagged so that the BKA may initiate criminal proceedings.[75] This NetzDG revision is connected to changes of the criminal code and the code of criminal procedure.[76] In particular, the proposal seeks to combat threats of physical violence against individuals engaged in public political discourse, whose active participation in democracy is chilled by such threats. In addition, the proposal targets child pornography.[77] Under the proposal, the BKA would undertake its own evaluation of flagged content, which includes obtaining the relevant IP address. The BKA would then hand over the investigation to the appropriate law enforcement agency, most likely state law enforcement.[78]

After the Bundestag adopted GBRH with the votes of the government coalition parties Christian Democrats (CDU/CSU) and Social Democrats (SPD), the Green party launched an inquiry with the Bundestag’s parliamentary research service to assess the law’s constitutionality. The report, which applied a recent decision of the Federal Constitutional Court concerning data access by law enforcement to GBRH,[79] concluded that core provisions of the law are likely unconstitutional. In particular, it determined that the provisions compelling providers to transmit IP addresses to federal law enforcement are incompatible with the Federal Constitutional Court’s recent decision because they are too vague.[80] In December 2020, the government coalition parties in the Bundestag submitted several legislative revisions pursuant to the Federal Constitutional Court’s decision.[81]

Further changes proposed by the federal government not contained in GBRH would enhance user protections via various procedural changes and create a more efficient dispute resolution process with respect to postings or takedowns.[82] The changes would also create private dispute resolution mechanisms by providing the foundations for accrediting private dispute resolution entities.[83] This mechanism, however, would not displace litigation and thus would not create compelled privatization of dispute resolution.[84]

The proposed changes also would make it easier for individuals to obtain data on criminal offenses (e.g., defamatory content) from the social media platforms.[85] With respect to illegal content that harms individuals, the current iteration of the law permits platforms to disclose data such as the author of the defamatory content upon court order, but it requires a separate application and court order to compel platforms to disclose.[86] The revised law would enable the initial court order to include a decision on the obligation to disclose.[87]

Also, the revisions would simplify the flagging of illegal content by requiring a flagging mechanism accessible from the illegal post itself, rather than requiring users to click elsewhere or to copy and enter the text to flag it.[88] In addition, increased transparency requirements would demand that reporting include the target groups of hate speech to provide information on which groups are frequent targets of hate speech and whether there are discernible structures or coordination of messages among users who post illegal content.[89] Reporting would also include more information on how platforms handle contested takedowns (e.g., how many “put backs” occur). The revisions would also demand increased transparency on the extent to which artificial intelligence is used for identifying illegal content.[90]

3. Federal Ministry of Justice and Consumer Protection (BMJV) Assessment of NetzDG

Pursuant to a requirement that NetzDG be evaluated within three years after taking effect and the report be submitted to the federal legislature, BMJV published its evaluation report in early September 2020.[91] The report was accompanied by an independent legal assessment,[92] which concluded that NetzDG has proven successful and is in accordance with European developments.[93] However, the assessment did identify a number of shortcomings and offered potential fixes. Among the most pertinent issues, the assessment identified the need to expand user rights to protest removals, although it did not detect systematic overblocking, that is, the blocking of more content than legally required.[94]

The BMJV report concluded that overall, NetzDG met its original goals and only required a few discrete changes.[95] These changes would include improving user-friendliness in reporting, expanding BMJV competences, adjustments with respect to service of process, and more robust user rights to counter the risk of overblocking.[96] Unsurprisingly, these changes are largely contained in the federal government’s proposals pending in the Bundestag.[97]


Ultimately, NetzDG will likely be amended. With respect to its political future, the right-wing party Alternative für Deutschland (AfD) demands abolishing NetzDG entirely, whereas the other parties represented in the Bundestag are discussing various aspects of reform in the context of debating the federal government’s draft revisions.

C. United States

In the United States, European regulatory efforts with respect to online speech in general, and NetzDG in particular, were received with deep skepticism.[98] There are a few exceptions praising the positive user experience resulting from regulation.[99] As one commentator reported, “I had enough of the trolling and toxic taunts on Twitter. So I reset my location and ‘moved’ to Germany, the safest social media state in the world.”[100] Many observers in the United States, however, seemed to have the opposite reaction.[101]

Unsurprisingly, German and European regulatory efforts largely contradict the dominant contemporary understanding of American free speech values and the First Amendment.[102] This tension in itself is not particularly noteworthy, given longstanding American free speech exceptionalism.[103] One point of criticism of regulatory efforts concerns “definitional ambiguity.” For example, Danielle Citron remarks with respect to the EU’s Code of Conduct:

Consider the Code’s definition of “illegal hate speech”: speech inciting violence or hatred against a group or a member of such a group based on race, religion, national, or ethnic origin. Inciting hatred against a group is an ambiguous concept. It could be interpreted to cover speech widely understood as hateful, such as describing members of a religious group as vermin responsible for crime and disease. But it could also be understood as covering speech that many would characterize as newsworthy.[104]

This mirrors general critiques of hate speech prohibitions that exist in several European countries, including Germany. But there is a solid body of cases interpreting these provisions. Judicial opinions and academic scholarship over the last half century have provided ample gloss on “hate speech,” despite U.S. concerns.[105]

While German and European online speech regulations are in conflict with predominant free speech values in the United States, it is important to note that as a matter of doctrine, speech regulations imposed by private platforms on online speech do not violate the First Amendment: “[a]s private actors, online platforms operate free from First Amendment concerns.”[106] In this space, private actors may be more restrictive than government actors.[107] In other words, speech that cannot be prohibited by the state under the First Amendment can be prohibited by private platforms online.[108] U.S. platforms are already engaging in speech regulation more closely aligned with the European model. Indeed, as Jennifer Daskal noted, “over time, what was once an unwavering devotion to free speech shifted.”[109] Even U.S. companies, ostensibly committed to U.S. free speech values, are now taking “increasingly robust steps to control content.”[110] Thus, online speech governance likely has escaped the grip of the First Amendment already.[111] The recent actions platforms have taken against former President Trump, most notably Twitter’s decision to suspend the @realDonaldTrump account, vividly illustrate this point.[112] In fact, the January 6, 2021, attack on the Capitol and the social media platforms’ reaction to associated online activities marked a turning point in the debate on regulating online speech in the United States.[113] This moment of reckoning may also mark an opportunity to move beyond resistance and toward more comparative engagement.[114]

II. Internal and External Sites of Conflict

The constitutional regimes that developed from the mid-twentieth century onward did not contemplate online speech. But it seems fair to assume that the constitutional balances that were struck created a reasonable expectation that the values underlying speech protection also guide online regulation. And attempts at regulating online speech first must be examined within the constitutional context in which they originate. This Part first traces German assessments of NetzDG’s constitutionality. Ultimately, NetzDG is perhaps best characterized as an imperfect attempt at attaining a constitutionally permissible goal. Arguably, on a militant democracy reasoning, this type of intervention may even be required in some instances.[115] Indeed, the German legal literature recognizes this dichotomy between laudable goals and deficient implementation.[116] But NetzDG is considered constitutionally problematic in Germany for reasons other than those primarily discussed in the United States.

This Part then more closely examines the argument that the constitutional framework might be considered orthogonal to the most important emergent regulatory questions. As Balkin explains, this is due to “a revolution in the infrastructure of free expression. That infrastructure, largely held in private hands, is the central battleground over free speech in the digital era.”[117] Under Balkin’s theory of “new school” speech regulation, a decisive factor is whether private actors, rather than the state, function as regulators. However, the theory that “free speech is a triangle,” developed in and for the U.S. context, has important limits in other constitutional regimes.[118]

Finally, this Part turns to questions of intersystemic compatibility. Whereas speech regulation offline is well-established in European constitutional contexts and largely disfavored in the United States, the nature of online speech brings these regimes into direct and unavoidable conflict. As Noah Feldman remarked with respect to the U.S. and European speech traditions, “[a]t one time, the different views existed in splendid isolation. Now, the internet and globalization are bringing them into conflict.”[119] Scholars have offered different approaches to address this conflict, and this Part will end by mapping the features contained in these proposals that most likely will successfully mitigate the conflict.

A. Constitutionality

German scholars have identified a range of constitutional infirmities plaguing NetzDG.[120] The affected platforms, most notably Facebook, decided to forego challenging the law before the Federal Constitutional Court.[121] Other constitutional challenges thus far have failed.[122]

One concern is overdeterrence. In an effort to avoid potential fines, platforms may err on the side of deleting content, creating a chilling effect. Some scholars have suggested that this chilling effect in itself constitutes a violation of Article 5 of the Basic Law.[123] The most likely outcome of any complaint will be that the content remains deleted because users are relatively unlikely to challenge the deletion in court.[124] Thus, whereas the individual’s interest in their concrete expression might be minimal—merely one post on Facebook or Twitter—there is an overwhelming societal interest in free expression. If the deletion of content is unwarranted, the problem is not just that the individual’s expression is curtailed, but that society loses a contribution to public discourse. This also means that speech will no longer reach the outer bounds of permissible speech. Platforms will start deleting content they consider too close to the line, which, in turn, moves the line, thus narrowing the scope of what is permissible speech online, risking that legal content will be swept up in the process.[125] This narrowing of permissible online expression, in turn, is considered problematic as a form of speech suppression. But even though overblocking constitutes a major free speech concern,[126] the data so far does not support claims of overblocking under NetzDG.[127]

A related concern is that NetzDG is not properly understood as a general law in the sense of Article 5(2) of the Basic Law that may permissibly limit free speech. The limitations clause, the argument goes, does not mean that any law capable of limiting free speech is permissible.[128] Rather, there is a reciprocal relationship between the limitations clause and the right itself (known as the “interdependence doctrine,” or Wechselwirkung).[129] In other words, any law that limits free speech must take into account the constitutional importance of free speech, and some argue that NetzDG is too restrictive of speech in light of that importance.[130]

With respect to the deletion of obviously illegal content within twenty-four hours, scholars have pointed out that, aside from practical problems,[131] this requirement conflicts with Article 14 of the European e-Commerce Directive, a provision on intermediary liability which requires that “the provider, upon obtaining such knowledge or awareness [of illegality], acts expeditiously to remove or to disable access to the information.”[132] Arguably, this more flexible wording is incompatible with the static, fixed twenty-four-hour requirement under NetzDG.[133] If, on the other hand, the content is not obviously illegal, the platform has seven days to remove it, but the platform itself still must decide what is obvious and what is not.

On the issue of private enforcement, some commentators suggest that the state places excessive responsibility on private parties, particularly in determining the illegality of content.[134] One alternative suggested in the German literature would instead strengthen state law enforcement since direct state action would be more speech protective.[135] Another alternative could be to alert users who flag content as illegal to the possibilities of filing a criminal complaint or seeking an injunction.[136]

Most critics of NetzDG clearly acknowledge that the law has a legitimate purpose, and none of the constitutional concerns suggest that the limits on free speech that NetzDG seeks to implement are themselves unconstitutional. Whereas a range of proposals suggest that NetzDG might be better tailored, the constitutional balance itself is not in doubt.

B. Regulation and Governance

Moving on from the first order constitutional question whether speech regulation is permissible at all, Balkin distinguishes between “old-school” and “new-school” speech regulation: “Traditional or ‘old-school’ techniques of speech regulation have generally employed criminal penalties, civil damages, and injunctions to regulate individual speakers and publishers.”[137] While these regulatory techniques continue to exist, “they are joined by ‘new-school techniques’ of speech regulation.”[138] The new techniques “regulate speech through control over digital networks and auxiliary services like search engines, payment systems, and advertisers; instead of focusing directly on publishers and speakers, they are aimed at the owners of digital infrastructure.”[139] Recall, for example, that Apple and Google eliminated Parler from their respective app stores following Parler’s refusal to engage in content moderation.[140] Three features are characteristic of new-school speech regulation: collateral censorship, public/private cooperation and co-optation, and digital prior restraint.[141]

While scholars in the United States are problematizing the private role of platforms, it is worth re-examining the constitutional backdrop against which an increasingly privatized regime of governance is emerging. Remember that in the United States, “[b]ecause platform owners are private actors, constitutional law permits them to engage in content-based regulation that would be prohibited under the First Amendment if they were treated as state actors.”[142] But the interactions among private actors, state actors, and the constitution are not the same in all countries’ systems.[143] Thus, NetzDG displays features of “new-school” speech regulation, but their salience is somewhat different.

In an important comparative wrinkle, the German system merges the “old-school” and “new-school” modes by constitutional default. This merger occurs because constitutional rights may indirectly bind private parties in their interactions. The prevailing view articulated by the Federal Constitutional Court since the seminal 1958 Lüth decision is that the fundamental rights—in that case, freedom of expression—reach into private law.[144] Though there is some debate as to its extent, the fundamental rights are interpreted to have indirect horizontal, or third-party, applicability (mittelbare Drittwirkung).[145] The significance of delegating state regulatory power to private parties is thus considerably diminished in the German context.

Although this framework means critiques along those lines are of limited value when transposed to the German constitutional context, this does not mean that “outsourcing” state functions to private parties is unproblematic, as the Federal Constitutional Court has acknowledged.[146] However, under NetzDG’s regulatory approach, the combination of state and private involvement has a different salience, which also has implications for framing the discussion. Whereas the move from speech “regulation” to “governance” mirrors the shift from public to private in the U.S. context,[147] its descriptive force does not equally apply in the German context where “regulation” remains the primary mode. And the absence of direct state involvement in online speech regulation also explains Chancellor Merkel’s negative reaction to Twitter’s decision to deplatform @realDonaldTrump.[148]

C. Compatibility

National constitutional regimes take divergent approaches to speech regulation, but online platforms operate across these regimes. That the United States takes a fundamentally different approach to speech regulation than much of the rest of the world is not surprising, however, nor should this observation continue to dominate scholarly and policy debates. As Bloch-Wehba rightly notes, “the Internet’s global reach heightens substantive disagreements among nations about the scope of speech, privacy, and property protections.”[149] Thus, the question is how to account for these differences in a context in which regulation necessarily applies across national boundaries.

A related concern cautions against an ever-more fractured internet (or “splinternet”[150]). Representative of this argument, Balkin asserts:

Nothing would then prevent other countries—pursuing their own speech regulation policies—from requiring global filtering, blocking, or delinking of speech that these countries wish to regulate or censor. By promoting its parochial interests, each country will restrict access for end-users around the world. The result will be a race to the bottom (or to the top, depending on how you look at it). Currently the Internet is mostly governed by the values of the least censorious regime—that of the United States. If nation states can enforce global filtering, blocking, and delinking, the Internet will eventually be governed by the most censorious regime. This will undermine the global public good of a free Internet.[151]

The underlying assumption here is that “a free Internet” is normatively desirable. But even if that were the case, it is not necessarily the least regulated approach that yields the normatively most desirable result. Indeed, as already noted, increasingly robust content moderation in the United States suggests as much.

Whereas early reactions in U.S. legal discourse primarily focused on rejecting European-style regulation as incompatible with the American understanding of free speech, some later engagement became more nuanced as scholars suggested several approaches to address the inevitable conflict.

Illustrating the range of approaches, Bloch-Wehba draws on global governance principles;[152] Daskal takes a conflicts approach that starts from comity principles and proposes the use of technical solutions such as geoblocking where feasible;[153] and Evelyn Aswad offers an approach based on international human rights.[154] These approaches all appear plausible, but those most likely to succeed will acknowledge pluralism in the normative assessment of free speech rather than attempt to shoehorn one approach into the other constitutional frame.

III. Toward a Normative View

The reflexive posture, from a contemporary American understanding of free speech, is to be generally skeptical toward any kind of speech regulation.[155] But the questions that should guide the discussion around online speech and its limits are more complicated.[156] This Part first examines normative concerns underlying online speech protection and animating speech regulation. I do not suggest that European-style speech regulation in general, or NetzDG in particular, simply ought to be transplanted into a U.S. context.[157] Indeed, this Article stops short of advocating for a fundamental rethinking of First Amendment doctrine for online speech.[158] Rather, I suggest that normative engagement likely would lead to a more fruitful discussion around online speech regulation within the currently existing constitutional frames. Normative engagement would acknowledge that developments in the United States, driven by private actors outside of the constitutional framework, are converging on the values underlying constitutional frameworks in Europe. And as private platforms’ decisions become increasingly important for democratic public discourse,[159] platforms themselves may learn from constitutional systems that have engaged in “content moderation” to establish the boundaries of political discourse all along.[160]

This Part then turns to two major objections to online speech regulation: the “First Amendment or bust” objection and the “slippery slope” objection. The first objection contends that only the contemporary American understanding of free speech, framed by First Amendment doctrine and with a strong emphasis on the marketplace of ideas, is sufficiently speech protective. But there is neither a single justification for speech protection[161] nor a single American speech tradition.[162] In addition to the marketplace of ideas, there are theories of autonomy and democratic self-governance with deep roots in American free speech thought.[163] These alternative speech traditions likely provide a more useful lens for comparative dialogue. The second objection is concerned with regulation originating in democratic regimes that is then appropriated by nondemocratic regimes to nondemocratic ends. This seems especially troubling because current First Amendment doctrine places great emphasis on content neutrality: speech regulation cannot distinguish between democratic and nondemocratic contexts and content. But connecting speech protection back to its role in democracy addresses both objections.

A. Shared Underlying Concerns

In the recent past, we have seen increasing doubt about the democracy-enhancing character of the internet in general and social media platforms in particular.[164] Mis- and disinformation, propaganda, and hate speech proliferate at scale.[165] Unregulated online speech challenges democratic self-government in novel ways.[166] The attack on the Capitol is a particularly vivid example. The threat that online speech poses to democracy, in short, has become obvious and tangible.[167] Yet, there is also wide agreement both on the value of online speech and on certain categories of speech that are particularly problematic.[168]

NetzDG is explicitly aimed at protecting democratic public discourse.[169] The purpose of enforcing criminal code provisions online is to protect individuals who actively participate in the project of democratic self-governance from threats or harassment in the same way they would be protected offline.[170] The focus of this regime is on the role of participants in democratic public discourse, rather than the acontextual, reflexive protection of speech. As more extremist and abusive content was disseminated online, improved regulation became necessary. This protective regulation aligned with the understanding, developed before the advent and rise of social media, that free speech has limits.[171] In light of critiques, it is thus worth emphasizing that NetzDG is not a crackdown on speech but an implementation of a speech-protective constitutional framework that also permits regulation. From the German perspective, however, “[p]rohibiting certain types of speech is not self-serving, and therefore, merely referencing the criminal prohibitions is not a genuine justification.”[172] Instead, the limits on speech must reflect the values underlying the historically motivated decision to impose certain limits on public discourse. But to justify such limits, “speech falling short of criminally prohibited speech is to be permitted and criminal prohibitions are to be kept to a minimum.”[173]

Similarly, extremist content and propaganda online have become a pressing concern in the United States. Whereas the marketplace theory of speech protection would counsel in favor of counterspeech,[174] democratic self-government theory would likely counsel in favor of content moderation.[175] But while the marketplace is increasingly unable to stem speech that poses a direct threat to democracy,[176] concerns about speech suppression being undemocratic collide with calls for speech regulation to enable democracy.[177]

Considering certain speech outside the bounds of public discourse in order to protect democracy, however, is explicitly part of the constitutional framework elsewhere.[178] While NetzDG itself may be in part misguided in its current iteration, this type of legislation is not only permissible but might even be constitutionally required. The Basic Law goes further than simply protecting free speech, subject to a limitations clause. It also imposes a regime of “militant democracy” (wehrhafte Demokratie).[179] Fundamentally, “[f]reedom and democracy are paramount values of the ‘free, democratic, basic order’ and their defense is the paramount duty of public officials and citizens alike.”[180] Militant democracy is democratic self-defense: “Beyond the liberal protections it secures, the Basic Law contains a number of provisions that are meant to ensure that the enemies of democracy will never again be able to exploit the freedoms inherent in democracy.”[181] Among other elements, the Basic Law creates a constitutional requirement to guard against certain forms of propaganda and hate speech.[182] The militant democracy framework has been interpreted by the Federal Constitutional Court in various contexts, including that of neo-Nazi demonstrations.[183] Here, too, the Federal Constitutional Court has an eye to preserving a generally speech protective regime subject to certain limitations.[184] To reiterate, this does not mean that NetzDG is constitutionally required. But this type of law is of a piece with the larger approach to protect democratic public discourse and defend democracy itself.

Current First Amendment doctrine would not permit such democratic self-defense.[185] Nevertheless, the militant democracy theme is evident in contemporary U.S. legal commentary.[186] The question animating militant democracy is emerging in the opinion pages of U.S. newspapers: “How can anyone argue that democracy’s own core principles require us to let them tear it apart for as long as they want?”[187] In a warning that also resonates in militant democracy reasoning, Frank Pasquale likewise contends, “[i]f we do not move aggressively to protect our democracy from lies, conspiracy theories, extremism, and threats of violence, we will lose it.”[188]

State regulation and private governance mechanisms equally ought to seek to protect democratic public discourse to enable democratic self-governance. The irony, of course, is that in the United States, the democracy-securing function online is in the hands of private companies. Absence of the state from speech regulation, on the theory that this protects democracy, thus results in a democratic deficit. But reorienting the discussion around shared concerns of democratic self-government, rather than continued reliance on the marketplace of ideas, allows for deeper engagement with these questions. And, as the following discussion will demonstrate, this tradition also has deep roots in the United States.

B. First Amendment or Bust?

One objection to the position favoring comparative engagement articulated throughout this Article is that only U.S.-style free speech values guarantee the existence of the internet as we know it. We must choose the most speech-permissive regime in order to ensure a truly universal internet.[189] This position in turn means that national regulation must either be disallowed entirely or made compatible with the American concept of free speech. The marketplace theory and free speech near-absolutism in the United States have come to dominate the discussion surrounding online speech.[190]

But there are different understandings with deep historical roots that are sometimes glossed over in contemporary debates, resulting in an either-or posture. Take, for example, Feldman’s observation: “The underlying philosophical difference here is about the right of the individual to self-expression. Americans value that classic liberal right very highly—so highly that we tolerate speech that might make others less equal. Europeans value the democratic collective and the capacity of all citizens to participate fully in it—so much that they are willing to limit individual rights.”[191] But there is not a single free speech tradition in the United States. The focus on participants in democratic public discourse displayed by NetzDG parallels a classic strand displayed in American free speech theory, including a Meiklejohnian understanding.[192]

There are, of course, debates within democratic self-government theory about the viability of different strands. According to Robert Post, “the Meiklejohnian approach interprets the First Amendment primarily as a shield against the ‘mutilation of the thinking process of the community,’ whereas the participatory approach understands the First Amendment instead as safeguarding the ability of individual citizens to participate in the formation of public opinion.”[193] Post argues that the Meiklejohnian perspective has been “decisively reject[ed]” by proponents of participatory self-government.[194] Nonetheless, as Post points out, there are certain contexts—such as federal regulation of the broadcast media—that build on the Meiklejohnian theory.[195] There, the specific role of the “broadcast licensees as trustees for the speech of others” allowed an approach that Post deems “compatible with the participatory approach.”[196] The upshot is that a focus on the role of speech in democratic self-government, and on participants in public discourse, is not at all foreign to U.S. free speech theory.

The full breadth of American free speech theory, however, often gets lost in current discussions of online speech. It is worth remembering that a Madisonian democratic deliberation tradition exists alongside the Holmesian marketplace tradition.[197] Explaining both traditions’ continued relevance, Cass Sunstein noted in 1995 that

[I]t is foreseeable that free markets in communications will be a mixed blessing. They could create a kind of accelerating “race to the bottom,” in which many or most people see low-quality programming involving trumped-up scandals or sensationalistic anecdotes calling for little in terms of quality or quantity of attention. It is easily imaginable that well-functioning markets in communications will bring about a situation in which many of those interested in politics merely fortify their own unreflective judgments, and are exposed to little or nothing in the way of competing views. It is easily imaginable that the content of the most widely viewed programming will be affected by the desires of advertisers, in such a way as to produce shows that represent a bland, watered-down version of conventional morality, and that do not engage serious issues in a serious way for fear of offending some group in the audience.[198]

From today’s perspective, this description indeed looks familiar. And given the specific threats to democratic public discourse that unmoderated online speech poses, it might be worth refocusing the discussion around the speech tradition that is most responsive to the threat. In keeping with American free speech thought, democratic public discourse ought to be at the center of the debate.[199]

C. The Slippery Slope

The “slippery slope” objection contends that European-style speech regulation might be acceptable in the European context, but the next regulatory framework might be imposed by a nondemocratic regime.[200] While NetzDG may be appropriate (though perhaps not at present appropriately tailored) for the German constitutional framework, its existence will be taken as an excuse by nondemocratic regimes to enforce their own, nondemocratic goals. This creates the beginning of a slippery slope, in which “a particular act, seemingly innocuous when taken in isolation, may yet lead to a future host of similar but increasingly pernicious events.”[201] Indeed, as Frederick Schauer has noted, “[a]lthough the first amendment has no monopoly on slippery slope arguments, these arguments appear commonly in discussions about freedom of speech.”[202] The key concern in this area is “that permitting one restriction on communication, a restriction not by itself troubling and perhaps even desirable, will increase the likelihood that other, increasingly invidious restrictions will follow.”[203] With respect to NetzDG, the transfer from a democratic to a nondemocratic context is at the core of the slippery slope concern.[204]

How is a slippery slope argument best addressed? As a first step, we must identify the “factors increasing the likelihood not only of slippage, but of slippage in the particular direction that takes us from the instant case to the danger case.”[205] In the case of NetzDG, the increased risk results from the absence, in the danger case, of the constitutional democracy in which the regulatory regime is embedded. There is a line to be drawn between the “instant case” (NetzDG) and the “danger case” (a NetzDG-like law in a nondemocratic regime). Content and viewpoint neutrality as core elements of free speech do not work across all speech contexts. And because context matters for slippery slope purposes, the Austrian Kommunikationsplattformen-Gesetz, for example, is not usually cited as problematic.[206]

Moreover, taking a cue from First Amendment doctrine, contemporary discussions of free speech in the United States tend to over-emphasize content neutrality.[207] Even within the First Amendment, the doctrine of content neutrality applied across all areas of speech is incoherent.[208] Applied to online speech, without the constraints of the First Amendment, a distinction between democratic and anti-democratic content is not only permissible but vital.

Conclusion
In early 2021, the debate over online speech regulation in the United States reached a turning point. Pasquale noted that “[w]e can no longer afford the indifference to truth that has become a hallmark of free-speech-absolutist ideology. . . . The Capitol riot was a watershed, revealing to the nation trends that had worried experts for years.”[209] Through the lens of the German NetzDG, this Article has illustrated efforts to align free speech regulation online and offline in a constitutional framework that has struck the balance between speech protection and speech regulation in a different way than the First Amendment.

Considering the normative values underlying the constitutional balance allows for engagement even in an area where the First Amendment does not apply directly, but rather influences the understanding of free speech. U.S. platforms, ostensibly committed to U.S. free speech values, are already engaging in speech regulation more closely aligned with the European model. Reform efforts in the United States will likely draw on comparative insights, making a deeper understanding of the comparative dimension of online speech regulation particularly salient.

  1. * Associate Professor of Law and Political Science, Northeastern University; Affiliate Fellow, Information Society Project, Yale Law School. Many thanks to Susanne Baer, Jack Balkin, danah boyd, Hannah Bloch-Wehba, Robyn Caplan, Rebecca Crootof, Ignacio Cofone, Giovanni De Gregorio, Nik Guggenberger, Hiba Hafiz, Rebecca Hamilton, Woody Hartzog, Amélie Heldt, Thomas Kadri, Ido Kilovaty, Nanette Levinson, Frank Pasquale, Karl-Nikolaus Peifer, Jackie Ross, Clare Ryan, Blaine Saito, Kim Scheppele, Andrew Selbst, Jessica Silbey, Michael Tolley, and audiences at Northeastern University School of Law, Washington & Lee School of Law, Yale Law School, the Data & Society Research Institute, the 2018 Annual Meeting of the American Political Science Association, the Boston Area Junior Faculty Roundtable, the 2019 ASCL/YCC Global Conference at McGill University Faculty of Law, and the 2021 Penn/Illinois/Princeton comparative law works-in-progress workshop.

  2. . Gesetz zur Verbesserung der Rechtsdurchsetzung in den sozialen Netzwerken [Netzwerkdurchsetzungsgesetz—NetzDG] [Act to Improve Enforcement of Law in the Social Networks], Sept. 1, 2017, Bundesgesetzblatt, Teil I [BGBl I] at 3352 (Ger.), translation at [].

  3. . Emily Bazelon, Why is Big Tech Policing Speech? Because the Government Isn’t, N.Y. Times (Jan. 26, 2021), [] (“The deplatforming troubled her because it came from a private company; instead, she said through a spokesman, the United States should have a law restricting online incitement, like the one Germany passed in 2017 [i.e., NetzDG] to prevent the dissemination of hate speech and fake news stories.”).

  4. . See, e.g., Germany Is Silencing “Hate Speech”, but Cannot Define It, Economist (Jan. 13, 2018), [] [hereinafter Germany is Silencing “Hate Speech”].

  5. . Like Jack Balkin, I prefer the term “speech regulation” over “censorship.” See Jack M. Balkin, Old-School/New-School Speech Regulation, 127 Harv. L. Rev. 2296, 2299 (2014) [hereinafter Balkin, Speech Regulation]. He gives three reasons for this choice, of which the first is most important for purposes of this Article. Balkin notes that “people generally consider ‘censorship’ as presumptively impermissible, but not all regulation of speech is unjustified.” Id. This is particularly true where constitutional provisions for speech protection explicitly permit regulation, as is the case in most constitutional regimes outside of the United States.

  6. . This assessment differs somewhat from the early days of the Internet when several commentators envisioned a truly separate sphere governed independent of the laws of nation states. See, e.g., David R. Johnson & David Post, Law and Borders—The Rise of Law in Cyberspace, 48 Stan. L. Rev. 1367 (1996); David G. Post, Governing Cyberspace, 43 Wayne L. Rev. 155 (1996); Joel R. Reidenberg, Governing Networks and Rule-Making in Cyberspace, 45 Emory L.J. 911 (1996); Lawrence Lessig, The Zones of Cyberspace, 48 Stan. L. Rev. 1403, 1406 (1996). But see Jack Goldsmith & Tim Wu, Who Controls the Internet?: Illusions of a Borderless World (2006).

    Summarizing this history of cyber-exceptionalism, Hannah Bloch-Wehba notes that in their attempt to escape “substantive disagreements among nations about the scope of speech, privacy, and property protections,” the cyber-exceptionalists sought to transcend the jurisdiction of “territorially-based sovereign[s],” insisting on online communities’ self-governance by which they “predicted that the Internet could escape these disagreements.” Hannah Bloch-Wehba, Global Platform Governance: Private Power in the Shadow of the State, 72 SMU L. Rev. 27, 39 (2019). However, Bloch-Wehba concludes, “[t]his prediction could not have been more wrong.” Id.

  7. . See, e.g., Anthony Faiola & Stephanie Kirchner, How Do You Stop Fake News? In Germany, With a Law., Wash. Post (Apr. 5, 2017), [] (quoting German Justice Minister Heiko Maas’s statement: “There must be just as little room for illegal hate speech on social networks as there is on the street.”).

  8. . See, e.g., Steven H. Shiffrin, What’s Wrong With the First Amendment? 1, 7 (2016) (arguing that the First Amendment “overprotects speech” and examining approaches to free speech in “other countries that are not infected with free speech idolatry”).

  9. . See generally Kyle Langvardt, Regulating Online Content Moderation, 106 Geo. L.J. 1353 (2018).

  10. . Bloch-Wehba explains:

    Arguments that the First Amendment provides the appropriate benchmark wrongly assume that the U.S. domestic context is the most relevant one. . . . But falling back on the First Amendment as the appropriate legal standard essentially doubles down on American unilateralism online. Far from offering an achievable solution to governments’ increasingly overlapping and conflicting demands in the areas of speech and privacy governance, overreliance on U.S. legal standards replicates the worst features of American exceptionalism; it uncritically assumes not only that American law does govern, but also that it is normatively preferable and should supply the baseline standard for a de facto global regulation.

    Bloch-Wehba, supra note 5, at 65–66.

  11. . See, e.g., Frederick Schauer, The Exceptional First Amendment, in American Exceptionalism and Human Rights 29 (Michael Ignatieff ed., 2005). See also Norman Dorsen, Michel Rosenfeld, András Sajó, Susanne Baer & Susanna Mancini, Comparative Constitutionalism: Cases and Materials (3d ed. 2016) at 1002 (discussing the differences in treatment of speech “between the U.S. and the rest of the world”).

  12. . William P. Marshall, In Defense of the Search for Truth as a First Amendment Justification, 30 Ga. L. Rev. 1, 1 (1995).

  13. . See, e.g., Dawn Carla Nunziato, The Marketplace of Ideas Online, 94 Notre Dame L. Rev. 1519 (2019). This phenomenon is also evident in the comparative literature. See, e.g., Giovanni De Gregorio, Democratising Online Content Moderation: A Constitutional Framework, 36 Comput. L. & Sec. Rev. 1, 4–5 (2020) (characterizing Justice Holmes’s dissent in Abrams v. United States, 250 U.S. 616 (1919), as “the constitutional essence of freedom of expression in the United States as enshrined in the First Amendment.”).

  14. . See Bloch-Wehba, supra note 5, at 32 (noting that “most scholarly accounts have examined the substantive impact of these private governance arrangements on user speech through a domestic . . . lens”). See generally Danielle Keats Citron, Extremist Speech, Compelled Conformity, and Censorship Creep, 93 Notre Dame L. Rev. 1035 (2018); Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598 (2018); Evelyn Douek, Governing Online Speech: From “Posts-As-Trumps” to Proportionality and Probability, 121 Colum. L. Rev. 759 (2021).

  15. . As a methodological matter, this small-N approach is defensible because the transatlantic debate thus far has dominated this area of the law. The large markets in the U.S. and the EU, and the regulatory approaches taken there, continue to fundamentally shape policy debates. Moreover, the constitutional treatment of speech is distinct in these otherwise similar constitutional democracies making comparative analysis on this point particularly fruitful. See also Claudia E. Haupt, Regulating Hate Speech: Damned If You Do and Damned If You Don’t—Lessons Learned from Comparing the German and U.S. Approaches, 19 B.U. Int’l L.J. 299, 301–02 (2005) [hereinafter Haupt, Regulating Hate Speech] (discussing comparisons between the United States and Germany). But see Rebecca J. Hamilton, Governing the Global Public Square, 62 Harv. Int’l L.J. 117 (2021).

  16. . See generally Citron, supra note 14.

  17. . See generally Balkin, Speech Regulation, supra note 4.

  18. . Jack M. Balkin, Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation, 51 U.C. Davis L. Rev. 1149, 1187 (2018) [hereinafter Balkin, Algorithmic Society].

  19. . See Citron, supra note 14, at 1036 (“American free speech values guided policy decisions in Silicon Valley long after the showdown with Senator Lieberman. Social media companies routinely looked to First Amendment doctrine in crafting speech policies.”); Klonick, supra note 14, at 1621 (noting the role of American lawyers in content moderation policy); Nunziato, supra note 13, at 1522 (noting that although online platforms’ self-regulation efforts in the United States “are not governed by the First Amendment, they are nonetheless inspired by First Amendment values”).

  20. . Jennifer Daskal, Speech Across Borders, 105 Va. L. Rev. 1605, 1611–12 (2019).

  21. . Twitter Inc., Permanent Suspension of @realDonaldTrump, Twitter (Jan. 8, 2021), [].

  22. . Some scholars do acknowledge the role of regulation even under a marketplace framework. See, e.g., Nunziato, supra note 13, at 1527 (“While government intervention in speech markets is and should continue to be subject to far more searching scrutiny than intervention in economic markets, modern First Amendment jurisprudence does not render the government powerless to provide narrowly tailored remedies directed to fixing the flaws in today’s marketplace of ideas. Of course, our hands are not as free as in other countries, which, like Germany, have the power to enact regulations directed to addressing the flaws in the online marketplace of ideas unfettered by the First Amendment’s constraints.”).

  23. . Balkin, Algorithmic Society, supra note 18, at 1151.

  24. . Douek, supra note 14, at 767.

  25. . See, e.g., Dorsen, Rosenfeld, Sajó, Baer & Mancini, supra note 10, at 1001.

  26. . At the same time, of course, the United States had a hand in the design of the German Basic Law. See, e.g., David P. Currie, The Constitution of the Federal Republic of Germany 8–10 (1994).

  27. . The U.S. hate speech literature is vast. Classic contributions include, for example, Richard Delgado, Words That Wound: A Tort Action for Racial Insults, Epithets, and Name-Calling, 17 Harv. C.R.-C.L. L. Rev. 133 (1982); Kent Greenawalt, Insults and Epithets: Are They Protected Speech?, 42 Rutgers L. Rev. 287 (1990); Kenneth Lasson, Racial Defamation as Free Speech: Abusing the First Amendment, 17 Colum. Hum. Rts. L. Rev. 11 (1985); Charles R. Lawrence III, If He Hollers Let Him Go: Regulating Racist Speech on Campus, 1990 Duke L.J. 431; David Partlett, From Red Lion Square to Skokie to the Fatal Shore: Racial Defamation and Freedom of Speech, 22 Vand. J. Transnat’l L. 431 (1989); Robert C. Post, Racist Speech, Democracy, and the First Amendment, 32 Wm. & Mary L. Rev. 267 (1991); Rodney A. Smolla, Rethinking First Amendment Assumptions about Racist and Sexist Speech, 46 Wash. & Lee L. Rev. 171 (1990); Nadine Strossen, Regulating Racist Speech on Campus: A Modest Proposal, 1990 Duke L.J. 484. More recent key contributions are Jeremy Waldron, The Harm in Hate Speech (2012); James Weinstein, Hate Speech Bans, Democracy, and Political Legitimacy, 32 Const. Comment. 527 (2017).

  28. . See Haupt, Regulating Hate Speech, supra note 15, at 300 n.2 (providing an overview of the literature).

  29. . Cf. Jack M. Balkin, Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society, 79 N.Y.U. L. Rev. 1 (2004); Jack M. Balkin, Cultural Democracy and the First Amendment, 110 Nw. U. L. Rev. 1053 (2016).

  30. . See, e.g., Alexander Brown, What Is So Special About Online (as Compared to Offline) Hate Speech?, 18 Ethnicities 297, 308 (2018) (exploring “whether or not the constantly evolving nature and variety of cyberhate (as compared to offline hate speech) poses an especially serious practical (as opposed to constitutional) challenge for the regulation of hate speech”); Jerome A. Barron, Internet Access, Hate Speech and the First Amendment, 18 First Amend. L. Rev. 1 (2020).

  31. . Citron, supra note 14, at 1039 n.21 (“European countries have a far different approach to free expression than the United States. As James Weinstein put it to me, the EU does not really even have a concept of protected speech.”) (citation omitted).

  32. . “Article 11 Freedom of Expression and Information[:] 1.  Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. 2.  The freedom and pluralism of the media shall be respected.” Charter of Fundamental Rights of the European Union, Dec. 18, 2000, 2000 O.J. (C 364) 1, 11.

  33. . Article 10 Freedom of expression[:]

    1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises. 2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.

    Convention for the Protection of Human Rights and Fundamental Freedoms, as amended by Protocols 11 and 14, supplemented by Protocols 1, 4, 6, 7, 12 and 13, Nov. 4, 1950, 213 U.N.T.S. 221.

    See, e.g., De Gregorio, supra note 13, at 5 (providing a brief overview of the European framework of speech protection).

  34. . Citron, supra note 14, at 1038 n.16 (citing Art. 19 of the ICCPR for the proposition that “Unlike in the United States, in the European Union, there isn’t a heavy presumption against speech restrictions”); Evelyn Aswad, The Role of U.S. Technology Companies as Enforcers of Europe’s New Internet Hate Speech Ban, 1 Colum. Hum. Rts. L. Rev. 1, 5 (2016) (asserting that “the International Covenant on Civil and Political Rights . . . is the most relevant to the issue of hate speech”).

  35. . Delfi AS v. Estonia, App. No. 64569/09, 2015 Eur. Ct. H.R. 58.

  36. . Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, App. No. 22947/13, 2016 Eur. Ct. H.R. 21–22.

  37. . See, e.g., Robert C. Post, Data Privacy and Dignitary Privacy: Google Spain, the Right to Be Forgotten, and the Construction of the Public Sphere, 67 Duke L.J. 981 (2018); Ignacio Cofone, Google v. Spain: A Right to Be Forgotten?, 15 Chi.-Kent J. Int’l & Comp. L. 1 (2015). See also generally The Right to Be Forgotten: A Canadian and Comparative Perspective (Ignacio N. Cofone ed., 2020).

  38. . See, e.g., Margot E. Kaminski, Binary Governance: Lessons from the GDPR’s Approach to Algorithmic Accountability, 92 S. Cal. L. Rev. 1529 (2019); Daskal, supra note 20, at 1616–21 (discussing the right to be forgotten and GDPR).

  39. . See generally Giovanni De Gregorio, The Rise of Digital Constitutionalism in the European Union, 19 Int’l J. Const. L. 41 (2020).

  40. . Bloch-Wehba, supra note 5, at 31. See also De Gregorio, supra note 13, at 9–11 (discussing EU regulatory efforts).

  41. . Citron, supra note 14, at 1040; Bloch-Wehba, supra note 5, at 44.

  42. . Citron, supra note 14, at 1040.

  43. . See Brown, supra note 30, at 15; Citron, supra note 14, at 1041–42; Bloch-Wehba, supra note 5, at 5–6 (discussing the Code of Conduct).

  44. . See, e.g., Daskal, supra note 20, passim (discussing France and Austria).

  45. . Case C-18/18, Glawischnig-Piesczek v. Facebook Ireland, Ltd., ECLI:EU:C:2019:821 (Oct. 3, 2019) (holding that courts of Member States may issue global takedown orders for unlawful content).

  46. . Article 5 [Freedom of expression, arts and sciences]:

    (1) Every person shall have the right freely to express and disseminate his opinions in speech, writing and pictures and to inform himself without hindrance from generally accessible sources. Freedom of the press and freedom of reporting by means of broadcasts and films shall be guaranteed. There shall be no censorship. (2) These rights shall find their limits in the provisions of general laws, in provisions for the protection of young persons, and in the right to personal honour.

    Grundgesetz für die Bundesrepublik Deutschland [GG] [Basic Law], May 23, 1949, as amended (Ger.), art. 5, translation at

  47. . The analysis proceeds as follows, as I’ve explained elsewhere:

    A potential violation of a constitutional right is subject to a multilevel analysis with three basic stages of inquiry. First, whether the matter is subject to the definitional coverage of the right (Schutzbereich); second, whether there is a possible limit posed by regulation or prohibition (Schranken); and third, whether the limitation is proportional (Verhältnismäßigkeit). . . . If [the] definitional coverage in fact extends to the activity in question, the activity is in principle protected; it may, however, be subject to regulation. This regulation must be an encroachment on the right which is allowed under an explicit or implicit limitation clause to the right. If, then, the state action is covered by such a limitation clause, the limitation to the right has to be found proportional. This proportionality test is comprised of three elements: the suitability of the means used to further a legitimate end (Geeignetheit), the absence of an equally yet less restrictive action (Notwendigkeit), and finally, the presence of an appropriate relationship between the goal to be achieved and the extent of the intrusion upon the protected right (Verhältnismäßigkeit im engeren Sinne).

    See Haupt, Regulating Hate Speech, supra note 15, at 321–22.

  48. . Donald P. Kommers & Russell A. Miller, The Constitutional Jurisprudence of the Federal Republic of Germany 536 (3d ed. 2012).

  49. . See Haupt, Regulating Hate Speech, supra note 15, at 323–33 (discussing cases); Kommers & Miller, supra note 48, at 441–536.

  50. . Kommers & Miller, supra note 48, at 537.

  51. . See, e.g., Melissa Eddy & Mark Scott, Delete Hate Speech or Pay Up, Germany Tells Social Media Companies, N.Y. Times (June 30, 2017), [].

  52. . The provisions of the criminal code constitute legitimate limitations upon the right to free speech. See Haupt, Regulating Hate Speech, supra note 15, at 322.

  53. . Amélie P. Heldt, Reading Between the Lines and the Numbers: An Analysis of the First NetzDG Reports, 8 Internet Pol’y Rev. 1, 3 (2019) (emphasizing “that no new criminal offences against online hate speech were created or added to [the German Penal Code, also known as] the StGB”); Stefan Theil, The German NetzDG: A Risk Worth Taking?, Verfassungsblog (Feb. 8, 2018), [] (“The obligations to delete illegal content are based on well-established limits to freedom of expression, to which NetzDG chiefly adds a more robust enforcement mechanism.”).

  54. . See Netzwerkdurchsetzungsgesetz—NetzDG, supra note 1, at § 1 (3) (enumerating various offenses including: §§ 86 [dissemination of propaganda material of unconstitutional organizations], 86a [using symbols of unconstitutional organizations], 89a [preparation of a serious violent offence endangering the state], 91 [encouraging the commission of a serious violent offence endangering the state], 100a [treasonous forgery], 111 [public incitement to crime], 126 [breach of the public peace by threatening to commit offences], 129–129b [forming criminal organizations, forming terrorist organizations, including application of these provisions abroad], 130 [incitement to hatred], 131 [dissemination of depictions of violence], 140 [rewarding and approving of offences], 166 [defamation of religions, religious and ideological associations], 184b in connection with 184d [distribution, acquisition and possession of child pornography in connection with distribution of pornographic performances by broadcasting, media services or telecommunication services], 185–187 [insult, defamation, intentional defamation], 201a [violation of intimate privacy by taking photographs], 241 [threatening commission of a felony] or 269 [forgery of data intended to provide proof] of the StGB).

  55. . Id. at § 2.

  56. . Id. at § 3.

  57. . Id. at § 4. The Federal Ministry of Justice and Consumer Protection also provides an English translation of the guidelines on regulatory fines: Network Enforcement Act Regulatory Fining Guidelines, BMJV (Mar. 22, 2018), [].

  58. . Kontroverse um Gesetzentwurf Gegen Hasskriminalität im Internet [Controversy over Draft Law Against Hate Crime on the Internet], Deutscher Bundestag (May 19, 2017), []; see also Patrick Zurth, The German NetzDG as Role Model or Cautionary Tale? Implications for the Debate on Social Media Liability, 31 Fordham Intell. Prop. Media & Ent. L.J. 1084, 1103–05 (2021) (providing a short overview of the legislative background).

  59. . Kontroverse, supra note 58.

  60. . Id.

  61. . Zurth, supra note 58, at 1103–04.

  62. . Id. at 1103–04; Thomas Wischmeyer, ‘What is Illegal Offline is Also Illegal Online’–The German Network Enforcement Act 2017, in Fundamental Rights Protection Online: The Future Regulation of Intermediaries 28, 35 (Bilyana Petkova & Tuomas Ojanen, eds., 2019).

  63. . See, e.g., Heldt, supra note 53, at 4 (“[The NetzDG] has been under attack ever since its first draft was made public.”); Ryan Kraski, Combating Fake News in Social Media: U.S. and German Legal Approaches, 91 St. John’s L. Rev. 923, 954 (2017) (observing that NetzDG “has naturally been met with heavy criticism and is certain to face scrutiny in the courts”); Nunziato, supra note 13, at 1534 (“Not surprisingly, NetzDG has been the subject of intense debate in Germany in the months since its passage . . . .”); Zurth, supra note 58, at 1104 (“The proposed law triggered a major debate.”).

  64. . Cf. Heldt, supra note 53, at 4 (noting that deleting content pursuant to NetzDG is not a primary concern “because such content is illegal”); Mathias Hong, The German Network Enforcement Act and the Presumption in Favour of Freedom of Speech, Verfassungsblog (Jan. 22, 2018), [] (“There is nothing wrong with removing and blocking unlawful content.”). See also Zurth, supra note 58, at 1153 (concluding that the debate about NetzDG in the United States is “based on misconceptions, biases, and exaggerated assumed impacts”).

  65. . See infra Part II.A.

  66. . This Article takes into account legislative developments until December 31, 2020.

  67. . Christopher F. Schuetze & Melissa Eddy, Only a Locked Door Stopped a Massacre at a German Synagogue, N.Y. Times (Oct. 21, 2019), [].

  68. . Maßnahmenpaket zur Bekämpfung des Rechtsextremismus und der Hasskriminalität [Legislation Against Right-Wing Extremism and Hate Crimes], Oct. 30, 2019 (Ger.), [].

  69. . Council Directive 2010/13, 2010 O.J. (L 95) 1 [hereinafter Council Directive 2010].

  70. . Gesetzentwurf der Bundesregierung, Entwurf eines Gesetzes zur Änderung des Netzwerkdurchsetzungsgesetzes [NetzDG Draft Revisions], Apr. 27, 2020, BT-Drs. 19/18792 [hereinafter NetzDG Revisions] [].

  71. . Regierung Will Das Netzwerkdurchsetzungsgesetz ändern [Government Seeks to Amend the Network Enforcement Act], Deutscher Bundestag (June 5, 2020), [].

  72. . Gesetz Gegen Rechtsextremismus und Hasskriminalität Beschlossen [Law Against Right-Wing Extremism and Hate Crime Passed], Deutscher Bundestag (June 18, 2020), [].

  73. . Mögliche Auswirkungen des Beschlusses des Bundesverfassungsgerichts vom 27. Mai 2020, 1 BvR 1873/13 – Bestandsdatenauskunft II – auf das Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität (BT-Drs. 19/17741 und 19/20163) und das Netzwerkdurchsetzungsgesetzänderungsgesetz, [Possible Effects of the Decision of the Federal Constitutional Court] Wissenschaftliche Dienste Deutscher Bundestag, WD 10–3000–037/20 (Sept. 16, 2020) [hereinafter Research Service GBRH Report], [].

  74. . Georg Mascolo & Ronen Steinke, Bedenken in Bellevue [Concerns in Bellevue], Süddeutsche Zeitung (Sept. 17, 2020), []. The President’s competence to independently assess the constitutionality of laws prior to signing them is contested, but the current understanding is that the President can only reject “obviously” unconstitutional laws. See also Kommers & Miller, supra note 48, at 154.

  75. . See NetzDG Revisions, supra note 70.

  76. . See id.

  77. . See id.

  78. . See id.

  79. . BVerfG, 1 BvR 1873/13, May 27, 2020 [hereinafter Bestandsdatenauskunft II], [] (holding several provisions of the Telecommunications Act (Telekommunikationsgesetz, TKG) permitting data access by law enforcement fail proportionality analysis because they lack precise boundaries regarding permitted use of the data); see also BVerfG, 1 BvR 1299/05, Jan. 24, 2012 [hereinafter Bestandsdatenauskunft I], [].

  80. . Research Service GBRH Report, supra note 73, at 36.

  81. . Gesetzentwurf [Bill], Deutscher Bundestag: Drucksachen [BT] 19/25294 (Ger.),

  82. . NetzDG Revisions, supra note 70, at 9.

  83. . Id. at 8–10.

  84. . This is fundamentally different from the fully privatized dispute resolution approach taken, for example, by Facebook’s oversight board. See Ben Smith, Trump Wants Back on Facebook. This Star-Studded Jury Might Let Him, N.Y. Times (July 12, 2021), []. See generally Klonick, supra note 14; Kate Klonick, The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression, 129 Yale L.J. 2418 (2020).

  85. . See Council Directive 2010, supra note 69.

  86. . NetzDG Revisions, supra note 70, at 8.

  87. . Id.

  88. . Id.

  89. . Id.

  90. . See id.

  91. . Bericht der Bundesregierung zur Evaluierung des Gesetzes zur Verbesserung der Rechtsdurchsetzung in Sozialen Netzwerken [Netzwerkdurchsetzungsgesetz – NetzDG] [Report of the Federal Government Evaluating the Network Enforcement Act] (Ger.) [hereinafter NetzDG Evaluation BMJV],

  92. . Martin Eifert, Michael von Landenberg-Roberg, Sebastian Theß & Nora Wienfort, Evaluation des NetzDG im Auftrag des BMJV [Evaluation of the NetzDG Commissioned by the BMJV] (2020),

  93. . Id. at 151.

  94. . Id. at 152.

  95. . NetzDG Evaluation BMJV, supra note 91, at 44–45.

  96. . Id.

  97. . See supra notes 70–88 and accompanying text.

  98. . See supra note 64 and accompanying text.

  99. . See, e.g., Virginia Heffernan, Ich Bin Ein Tweeter, Wired (Feb. 5, 2018, 9:15 AM), [] (reporting on resetting Twitter location to Germany and praising the benefits of hate speech regulation online under NetzDG as having positive effects on the user experience).

  100. . Id.

  101. . See, e.g., Diana Lee, Germany’s NetzDG and the Threat to Online Free Speech, Case Disclosed (Oct. 10, 2017), [] (arguing that “[b]eyond incentivizing overenforcement, the NetzDG further threatens online free speech due, in part, to the breadth of Germany’s defamation law”); Jonathan Turley, Germany Moves to Impose Crippling Fines on Social Networks for “Fake News,” Res Ipsa Loquitur (Dec. 28, 2016), [] (alleging that German legislative efforts contribute “to the erosion of free speech in the West”); Germany is Silencing “Hate Speech,” supra note 3. See also supra note 64 and accompanying text.

  102. . See, e.g., Nunziato, supra note 13, at 1537 (“Separate and apart from the debate in Germany over NetzDG and its implementation, such an approach to fixing the flaws in the online marketplace of ideas could never pass constitutional muster in the United States.”).

  103. . See, e.g., Schauer, supra note 10, at 29; Noah Feldman, Free Speech in Europe Isn’t What Americans Think, Bloomberg (Mar. 19, 2017, 9:33 AM), [] (“Hate speech can only be banned in the U.S. if it is intended to incite imminent violence and is actually likely to do so. This permissive U.S. attitude is highly unusual. Europeans don’t consider hate speech to be valuable public discourse, and reserve the right to ban it. They consider hate speech to degrade from equal citizenship and participation. Racism isn’t an idea; it’s a form of discrimination.”).

  104. . Citron, supra note 14, at 1052.

  105. . See Haupt, Regulating Hate Speech, supra note 15, at 300 n.2 (providing a selection of comparative hate speech literature).

  106. . Citron, supra note 14, at 1036 n.7 (citing Gitlow v. New York, 268 U.S. 652, 666 (1925)).

  107. . See, e.g., Douek, supra note 14, at 767–68 (asserting that “content moderation will always go beyond what governments can constitutionally provide for. The First Amendment would not permit laws requiring removal of content like the Christchurch Massacre livestream, violent animal crush videos, or graphic pornography, for example, but few would disagree that platforms should have some license to moderate this content to protect their services from becoming unusable. How far this license should extend may be contested, but it is relatively uncontroversial that private actors can restrict more speech than governments.”).

  108. . Balkin, Algorithmic Society, supra note 18, at 1198; Daskal, supra note 20, at 1661 (noting that in the “United States, companies are permitted to, and in fact do, restrict a range of speech that is protected under the First Amendment”); Douek, supra note 14, at 768.

  109. . Daskal, supra note 20, at 1638.

  110. . Id.

  111. . See Douek, supra note 14, at 762 (“Instead of thinking about content moderation through an individualistic lens typical of constitutional jurisprudence, platforms, regulators and the public at large need to recognize that the First Amendment-inflected approach to online speech governance that dominated the early internet no longer holds. Instead, platforms are now firmly in the business of balancing societal interest and choosing between error costs on a systemic basis.”).

  112. . See Adam Liptak, Can Twitter Legally Bar Trump? The First Amendment Says Yes, N.Y. Times (Jan. 9, 2021), [].

  113. . See, e.g., Kara Swisher, Big Tech Has Helped Trash America, N.Y. Times (Jan. 15, 2021), []; David Golumbia, Trump’s Twitter Ban Is a Step Toward Ending the Hijacking of the First Amendment, Boston Globe (Jan. 9, 2021, 1:05 PM); Sonja West & Genevieve Lakier, The Court, the Constitution, and the Deplatforming of Trump, Slate (Jan. 13, 2021, 5:14 PM), []; Genevieve Lakier, The Great Free-Speech Reversal, Atlantic (Jan. 27, 2021, 9:33 AM), [].

  114. . See, e.g., Daphne Keller, For Platform Regulation Congress Should Use a European Cheat Sheet, The Hill (Jan. 15, 2021, 1:00 PM), [] (“After years of hard work, European lawmakers have come up with some good ideas. U.S. lawmakers who are serious about changing CDA 230 should look at them.”). See generally Vicki C. Jackson, Constitutional Engagement in a Transnational Era (2010); Vicki C. Jackson, Constitutional Comparisons: Convergence, Resistance, Engagement, 119 Harv. L. Rev. 109 (2005).

  115. . See infra notes 160–165 and accompanying text (discussing militant democracy).

  116. . Cf. Nikolaus Guggenberger, Das Netzwerkdurchsetzungsgesetz – Schön Gedacht, Schlecht Gemacht [The Network Enforcement Act – Well Conceived, Badly Executed], 50 Zeitschrift für Rechtspolitik 98 (2017) (arguing that despite NetzDG’s laudable goals, the law contains significant shortcomings).

  117. . Balkin, Speech Regulation, supra note 4, at 2296.

  118. . Jack M. Balkin, Free Speech Is a Triangle, 118 Colum. L. Rev. 2011 (2018).

  119. . Feldman, supra note 103.

  120. . Zurth, supra note 58, at 1107 n.126 (citing German scholarship); Wischmeyer, supra note 62, at 43.

  121. . Zurth, supra note 58, at 1108.

  122. . See, e.g., id.

  123. . Theil, supra note 53 (“[A]rguments alleging unconstitutionality rely primarily on the unsubstantiated contention that NetzDG will promote an overly aggressive deletion policy (so-called ‘overblocking’) that will have a ‘chilling effect’ on freedom of expression for users of social media platforms: reducing their readiness to make use of their rights. If overblocking does take place as a result of NetzDG, then this would indeed be would be [sic] problematic under the German Basic Law.”).

  124. . Guggenberger, supra note 116, at 100.

  125. . Cf. Theil, supra note 53.

  126. . See generally Hong, supra note 64; Zurth, supra note 58, at 1107.

  127. . Zurth, supra note 58, at 1148 n.389; Heldt, supra note 53; Wischmeyer, supra note 62, at 55 (“The reports can neither confirm nor refute the ‘censorship’ or the ‘over-blocking’ claim. They only demonstrate that some of the fears associated with the law have been clearly exaggerated.”). See also supra notes 91–96 and accompanying text (discussing evaluations of NetzDG).

  128. . See, e.g., Hong, supra note 64.

  129. . Haupt, Regulating Hate Speech, supra note 15, at 324–25.

  130. . See, e.g., Guggenberger, supra note 116, at 100.

  131. . See, e.g., Bernhard Rohleder, Opinion: Germany Set Out to Delete Hate Speech Online. Instead, It Made Things Worse, Wash. Post (Feb. 20, 2018), (suggesting that the requirement “puts companies under tremendous time pressure to check reported content”).

  132. . Council Directive 2000/31, of the European Parliament and of the Council of 8 June 2000 on Certain Legal Aspects of Information Society Services, in Particular Electronic Commerce, in the Internal Market, 2000 O.J. (L 178) 1 (EC).

  133. . Zurth, supra note 58, at 1137 n.323 (citing competing views); Wischmeyer, supra note 62, at 44–45.

  134. . See, e.g., Rohleder, supra note 131 (arguing that “the most problematic issue is that the new law tasks private companies, not judges, with the responsibility to decide whether questionable content is in fact unlawful. In other words, the state has privatized one of its key duties: enforcing the law.”).

  135. . Guggenberger, supra note 116, at 101. Cf. Hong, supra note 64 (stating that “[w]hat is prevented or deleted on Facebook, Youtube or Twitter is often censored or restricted more effectively than would be possible by a state ban.”).

  136. . Guggenberger, supra note 116, at 101.

  137. . Balkin, Speech Regulation, supra note 4, at 2298 (noting that “[t]he landmark decisions in [New York Times v. Sullivan] and Pentagon Papers responded to old-school speech regulation: in both cases, the state had used penalties and injunctions directed at speakers and publishers in order to control and discipline their speech.”).

  138. . Id.

  139. . Id.

  140. . Jack Nicas, Parler Pitched Itself as Twitter Without Rules. Not Anymore, Apple and Google Said, N.Y. Times (Jan. 8, 2021), []; Jack Nicas & Davey Alba, Amazon, Apple and Google Cut Off Parler, an App That Drew Trump Supporters, N.Y. Times (Jan. 13, 2021), [].

  141. . Balkin, Speech Regulation, supra note 4, at 2298–99.

  142. . Balkin, Algorithmic Society, supra note 18, at 1195. See also supra notes 101–103 and accompanying text.

  143. . See, e.g., De Gregorio, supra note 13, at 14 n.93 (“Some constitutions around the world (e.g. South Africa) horizontally extend[] the application of fundamental rights in the relationship between private actors. In other case[s], horizontal application is not the result of a direct constitutional provision but the result of judicial interpretation.”).

  144. . Bundesverfassungsgericht [BVerfG] [Federal Constitutional Court] Jan. 15, 1958, 7 Entscheidungen des Bundesverfassungsgerichts [BVerfGE] 198 (F.R.G.). See also Haupt, Regulating Hate Speech, supra note 15, at 323–24 (discussing the Lüth decision).

  145. . Kommers & Miller, supra note 48, at 60–61.

    For recent articulations of the principle by the Federal Constitutional Court, see BVerfG, 1 BvR 3080/09, Apr. 11, 2018 [hereinafter Stadionverbot]

    Fundamental rights do not generally create direct obligations between private actors. They do, however, permeate legal relationships under private law; it is thus incumbent upon the regular courts to give effect to fundamental rights in the interpretation of ordinary law, in particular by means of general clauses contained in private law provisions and legal concepts that are not precisely defined in statutory law.

    See also BVerfG, 1 BvR 699/06, Feb. 22, 2011 [hereinafter Fraport]

    Thus, the direct binding force of the fundamental rights on publicly controlled enterprises differs in principle from the generally indirect binding force of the fundamental rights, which also binds private and state enterprises – in particular according to the principles of the indirect effect of fundamental rights between private parties and on the basis of protective duties of the state. Whilst one is based on a fundamental duty of accountability to citizens, the other serves to balance the freedom of citizens inter se and is thus from the outset relative. This does not, however, mean that the effect of the fundamental rights and thus the burden on private persons – whether it be direct or indirect – is in any event less far-reaching. Depending on the content of the guarantee and the circumstances of the case, the indirect binding force of the fundamental rights on private persons may instead come closer to or even be the same as the binding force of the fundamental rights on the state. This is relevant to the protection of communications, in particular when private enterprises themselves take over the provision of public communications and thus assume functions which were previously allocated to the state as part of its services of general interest—such as the provision of postal and telecommunications services.

  146. . See, e.g., Fraport, supra note 145.

  147. . See generally Balkin, Speech Regulation, supra note 4.

  148. . Bazelon, supra note 2 (“When Twitter banned Trump, he found a seemingly unlikely defender: Chancellor Angela Merkel of Germany, who criticized the decision as a ‘problematic’ breach of the right to free speech. This wasn’t necessarily because Merkel considered the content of Trump’s speech defensible.”); Kim Lane Scheppele, Re-impeachment, Verfassungsblog (Jan. 28, 2021), [] (“Her response makes sense against the backdrop of German constitutional law in which major concentrations of power, public and private, are accountable to public law norms even if their ownership structures are not public. When Big Tech separates a political leader from his followers, as happened to Trump, it seems like a constitutional problem. And it would be in Germany.”).

  149. . Bloch-Wehba, supra note 5, at 39.

  150. . Mark A. Lemley, The Splinternet (Stan. L. & Econ. Olin Working Paper No. 555, 2020),

  151. . Balkin, Algorithmic Society, supra note 17, at 1205–06.

  152. . Bloch-Wehba, supra note 5, at 32–33.

  153. . Daskal, supra note 19, at 1650–65.

  154. . Evelyn Mary Aswad, The Future of Freedom of Expression Online, 17 Duke L. & Tech. Rev. 26, 35 (arguing that “it is both feasible and desirable to ground corporate speech codes in international human rights standards”). But see Evelyn Douek, The Limits of International Law in Content Moderation, U.C. Irvine J. Int’l, Transnat’l & Compar. L. (forthcoming)

  155. . See, e.g., Feldman, supra note 103 (“To Americans, the idea of the government forcing social media to censor posts may seem to resemble China’s internet censorship. Such legislation wouldn’t just be unconstitutional; it would be almost unthinkable.”).

  156. . For example, Sonja West and Genevieve Lakier helpfully identify several relevant questions resulting from the Trump deplatforming:

    Has contemporary free speech law overcorrected, and does it now impose too many constraints on speech regulators? Should the court’s rules regarding incitement or false speech be relaxed? Do we need new laws to govern the platforms—laws ensuring that the decisions they make regarding speech are more transparent, less ad hoc, and reflect values beyond their motivation to make money?

    West & Lakier, supra note 112.

  157. . But see Zurth, supra note 57, at 1138–52 (suggesting that Congress should model reforms to the Communications Decency Act (CDA) Section 230 on NetzDG).

  158. . Several prominent First Amendment scholars argue that the time for a broader rethinking may have come. See, e.g., West & Lakier, supra note 112 (“[W]hether the deplatforming of Trump violated the Constitution should be only the beginning of the First Amendment discussion, not the end of it.”); Thomas B. Edsall, Have Trump’s Lies Wrecked Free Speech?, N.Y. Times (Jan. 6, 2021), []; Toni M. Massaro & Helen L. Norton, Free Speech and Democracy: A Primer for Twenty-First Century Reformers, 54 U.C. Davis L. Rev. 1631 (2021).

  159. . See, e.g., Kevin Roose, In Pulling Trump’s Megaphone, Twitter Shows Where Power Now Lies, N.Y. Times (Jan. 11, 2021), [].

  160. . Cf. Claudia E. Haupt, The Scope of Democratic Public Discourse: Defending Democracy, Tolerating Intolerance, and the Problem of Neo-Nazi Demonstrations in Germany, 20 Fla. J. Int’l L. 169, 199–271 (2008) [hereinafter Haupt, Democratic Public Discourse]; Frank Pasquale, The Bounds of Political Discourse: Why the Trump Bans Make Sense, Balkinization, Jan. 10, 2021, (“To put it bluntly: democratic political discourse has bounds.”).

  161. . See generally Kent Greenawalt, Free Speech Justifications, 89 Colum. L. Rev. 119 (1989).

  162. . See generally Vincent Blasi, Ideas of the First Amendment (2d ed. 2011).

  163. . See, e.g., C. Edwin Baker, Scope of the First Amendment Freedom of Speech, 25 UCLA L. Rev. 964 (1978); Alexander Meiklejohn, Free Speech and Its Relation to Self-Government (1948); Robert Post, Participatory Democracy and Free Speech, 97 Va. L. Rev. 477 (2011).

  164. . See, e.g., Nathaniel Persily, Kofi Annan Foundation, The Internet’s Challenge to Democracy: Framing the Problem and Assessing Reforms (2019),

  165. . See, e.g., Alice Marwick & Rebecca Lewis, Media Manipulation and Disinformation Online, Data & Soc’y (May 15, 2017), []. See also Anthony Nadler, Matthew Crain & Joan Donovan, Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech, Data & Soc’y (Oct. 17, 2018), (focusing on the role of ad tech).

  166. . Pasquale, supra note 160.

  167. . Id.

  168. . Daskal, supra note 19, at 1635.

  169. . Netzwerkdurchsetzungsgesetz—NetzDG, supra note 1.

  170. . Id.

  171. . Haupt, Regulating Hate Speech, supra note 14, at 321–23; Haupt, Democratic Public Discourse, supra note 160, at 202–07.

  172. . Haupt, Democratic Public Discourse, supra note 160, at 218.

  173. . Id.

  174. . For the classic articulation, see Abrams v. United States, 250 U.S. 616, 630 (1919) (Holmes, J., dissenting) (“[T]he best test of truth is the power of the thought to get itself accepted in the competition of the market . . . .”).

  175. . See Robert Post, Reconciling Theory and Doctrine in First Amendment Jurisprudence, 88 Calif. L. Rev. 2353, 2367 (2000) (discussing Meiklejohnian theory and participatory self-government theory); see also Meiklejohn, supra note 163.

  176. . See, e.g., Pasquale, supra note 160 (“But what the recent bans reflect is a dawning realization among technology firms that this marketplace of ideas is dysfunctional. It is not self-correcting—or at least it is not self-correcting enough to prevent a significant group of persons (with the guns and votes to cause real havoc) from acting on false beliefs. . . .”).

  177. . See, e.g., West & Lakier, supra note 112 (“An unregulated marketplace of ideas can also make it extraordinarily difficult to distinguish truth from lies. In this respect, as the Trump years have demonstrated, it can threaten the same democratic values it is intended to foster.”).

  178. . See, e.g., The Militant Democracy Principle in Modern Democracies (Markus Thiel ed., 2009) (discussing restrictions on freedom of speech to protect democracy in different constitutional systems).

  179. . See Kommers & Miller, supra note 47, at 285–86. See also Ronald J. Krotoszynski, Jr., A Comparative Perspective on the First Amendment: Free Speech, Militant Democracy, and the Primacy of Dignity as a Preferred Constitutional Value in Germany, 78 Tul. L. Rev. 1549 (2004); Gregory H. Fox & Georg Nolte, Intolerant Democracy, 36 Harv. Int’l L.J. 1, 32–34 (1995); Haupt, Democratic Public Discourse, supra note 160, at 177–78; Haupt, Regulating Hate Speech, supra note 14, at 314–15 (discussing militant democracy).

  180. . Kommers & Miller, supra note 47, at 52 (explaining that “[t]he notion of a militant democracy differs radically from what has been called the ‘value neutrality’ of the Weimar Constitution.”).

  181. . Id. at 285.

  182. . See generally Haupt, Regulating Hate Speech, supra note 14; Haupt, Democratic Public Discourse, supra note 160.

  183. . See generally Haupt, Democratic Public Discourse, supra note 160.

  184. . Id. at 181–91 (discussing a series of decisions in the early 2000s in which the Federal Constitutional Court took a more speech-permissive approach than other courts).

  185. . Pasquale, supra note 160 (“In the U.S., long-term litigation and public relations projects of the right (and some civil libertarians) have gradually expanded interpretation of the First Amendment so as to make almost any self-protective shaping or limitation on political discourse seem illegitimate if done by the state.”); Bazelon, supra note 2 (“In the United States, laws like these surely wouldn’t survive Supreme Court review, given the current understanding of the First Amendment — an understanding that comes out of our country’s history and our own brushes with suppressing dissent.”).

  186. . See, e.g., Edsall, supra note 158 (“In that context, Levinson raised the possibility that the United States might emulate post-WWII Germany, which ‘adopted a strong doctrine of “militant democracy,”’ banning the neo-Nazi and Communist parties . . . ‘Most Americans rejected “militant democracy” in part, I believe, because we were viewed as much too strong to need that kind of doctrine. But I suspect there is more interest in the concept inasmuch as it is clear that we’re far less strong than we imagined.’”) (quoting Sanford Levinson); Kenneth Propp, Speech Moderation and Militant Democracy: Should the United States Regulate Like Europe Does?, Atlantic Council (Feb. 1, 2021), [].

  187. . Golumbia, supra note 112 (further noting “As a small group of scholars and activists are arguing with increasing force, . . . it is manifestly possible to protect free speech — and thus enhance the political and democratic values free speech is meant to promote — while suppressing, or at least not actively encouraging, the efforts of those who want to turn democracies against themselves.”).

  188. . Pasquale, supra note 160.

  189. . See, e.g., Balkin, Algorithmic Society, supra note 17, at 63.

  190. . See, e.g., Nunziato, supra note 12, at 1522 n.13 (noting that “those at the helm of [U.S. social media] companies report that they are inspired by and committed to the First Amendment values in general and the marketplace of ideas model in particular”) (citing Marvin Ammori, The “New” New York Times: Free Speech Lawyering in the Age of Google and Twitter, 127 Harv. L. Rev. 2259, 2262 (2014)).

  191. . Feldman, supra note 102.

  192. . See generally Meiklejohn, supra note 163.

  193. . Post, supra note 175, at 2368.

  194. . Id. at 2369.

  195. . Id. at 2370.

  196. . Id. This opens up interesting theoretical questions around the viability of this approach in relation to platforms that might be conceptualized as trustees or “information fiduciaries.” See Claudia E. Haupt, Platforms as Trustees: Information Fiduciaries and the Value of Analogy, 134 Harv. L. Rev. F. 34 (2020); Jack M. Balkin, Information Fiduciaries and the First Amendment, 49 U.C. Davis L. Rev. 1183 (2016).

  197. . See, e.g., Cass Sunstein, The First Amendment in Cyberspace, 104 Yale L.J. 1757, 1759–65 (1995) (discussing both models in the context of online speech).

  198. . Id. at 1763–64.

  199. . Id. at 1804 (noting that “the goals of the First Amendment are closely connected with the founding commitment to a particular kind of polity: a deliberative democracy among informed citizens who are political equals”).

  200. . See, e.g., David Goldberg, Responding to “Fake News”: Is There an Alternative to Law and Regulation?, 47 Sw. L. Rev. 417, 423 (2018) (quoting journalists’ concerns that “one of the most troubling consequences of the NetzDG is not for German journalists, however, but for journalists in other countries. Countries with less democratic political cultures are using the NetzDG and global discourse about the dangers of fraudulent news as a ruse to clamp down on the free press.”); Nunziato, supra note 12, at 1522 n.12 (“At least three countries—Russia, Singapore, and the Philippines—have directly cited the German law as a positive example as they contemplate or propose legislation to remove ‘illegal’ content online.”) (quoting Germany: Flawed Social Media Law: NetzDG is Wrong Response to Online Abuse, Hum. Rts. Watch (Feb. 14, 2018)); Zurth, supra note 57, at 1104 (“One newspaper, for instance, claimed that the Belarusian autocrat Lukashenko was invoking the German law for his oppressive measures.”); Federico Guerrini, The Problems with Germany’s New Social Media Hate Speech Bill, Forbes (Mar. 3, 2020, 2:24 PM) (“Not bound by all the checks and balances of Western democracies and by using broad and undefined terms such as ‘anti-government propaganda’ or ‘fake news’, authoritarian regimes have used the NetzDG as a blueprint for restricting free speech online.”).

    A related but separate problem, not addressed in this Article, might be “a legitimate concern that authoritarian and repressive regimes will employ global injunctions to impose, or at least seek to impose, their restrictive views on the kinds of speech that should be available.” Daskal, supra note 19, at 1634.

  201. . Frederick Schauer, Slippery Slopes, 99 Harv. L. Rev. 361, 361–62 (1985).

  202. . Id. at 363.

  203. . Id.

  204. . See id. at 374 (“[T]he slippery slope phenomenon necessarily involves the transfer of a principle from the situation in which it was formulated to a new situation in which it may be applied.”).

  205. . Id. at 376.

  206. . Cf. Martin J. Riedl, A Primer on Austria’s ‘Communication Platforms Act’ Draft Law that Aims to Rein in Social Media Platforms, London Sch. Econ. & Pol. Sci. (Sept. 14, 2020), [] (suggesting that criticisms largely mirror those of NetzDG).

  207. . Content neutrality ordinarily requires the regulation of speech to be neutral as to its “communicative content,” since content-based regulations of speech “are presumptively unconstitutional.” Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015).

  208. . See, e.g., Claudia E. Haupt, Professional Speech and the Content-Neutrality Trap, 127 Yale L.J.F. 150, 151 (2017) (arguing “that content neutrality should be rejected in the professional speech context”).

  209. . Pasquale, supra note 160.

Published by Claudia E. Haupt

Associate Professor of Law and Political Science, Northeastern University; Affiliate Fellow, Information Society Project, Yale Law School.
