So the idea of ROOST is, again, decentralization: even small content hosts can easily detect unlawful content automatically. It doesn’t require a community vote, and if you want to tune or update the tools, you can, without violating an NDA with Big Tech or breaking any law. I think free expression is essential, but underneath it we need robust online safety tools that detect unlawful content, not merely “awful” content. Above that level, anything lawful deserves both fair support for free expression and a practical way for communities to own the moderation tools, so that algorithmic manipulation cannot manufacture a false variety or fake polarization that wasn’t actually there. We don’t need to censor anything; we just need to give communities a way to shape their own social media reality. By making sure there’s a “freedom first” way to label something as shared ground, people can hold a conversation on that basis. And if something is truly divisive, we can present balanced views rather than defaulting to one extreme take as if that were all there is.
