Decentralized social network Bluesky, a competitor to X and Threads, announced on Wednesday that it’s making further changes to its moderation process. Specifically, the company said it’s introducing new updates around how it tracks violations of its Community Guidelines and enforces its policies. These include new reporting categories in the app, changes to its “strike” system for violations, and more guidance for those who violate the rules.
The moderation changes are rolling out with the latest version of the Bluesky app (v. 1.110), which also includes a dark-mode app icon and a redesigned feature for controlling who can reply to your post.
The company says the moderation updates are a result of Bluesky’s rapid growth and the need for “clear standards and expectations for how people treat each other” on the platform.
“On Bluesky, people are meeting and falling in love, being discovered as artists, and having debates on niche topics in cozy corners. At the same time, some of us have developed a habit of saying things behind screens that we’d never say in person,” the company shared in an announcement, explaining the changes.
However, the news also follows the most recent moderation dust-up on the platform, which saw a user suspended for making a comment that Bluesky interpreted as a threat of violence. Author and influencer Sarah Kendzior had written in a post on Bluesky that she wanted to “shoot the author of this article just to watch him die” — a reference to a Johnny Cash song lyric. That choice of words was apt because she was commenting on an article about Johnny Cash that she didn’t like.
Bluesky’s team said that Kendzior was suspended because she expressed “a desire to shoot the author of the article,” a very literal reading of her commentary.
With the updated rules, Bluesky seems focused on ensuring the platform maintains a sense of community and doesn’t devolve into the toxicity that now fuels X, where snarky comments, dunks, and hateful commentary are often the norm.

For starters, Bluesky is expanding the reporting options on posts from six to nine, allowing users more precision in flagging issues and helping moderators take action on critical reports more quickly. For instance, users can now report content under categories like “Youth Harassment or Bullying” or “Eating Disorders,” which would help Bluesky comply with the host of new laws designed to protect minors online. In addition, users will be able to flag possible Human Trafficking content, to meet the requirements of the U.K.’s Online Safety Act.
To aid in this, Bluesky has improved its internal tools to automatically track violations and enforcement actions in one place. The system will also make sure people get clear information about what happened and where they stand.
The company notes that it’s not changing what it enforces; rather, it has improved its tooling so that its enforcement can be more consistent and transparent.
As part of this, Bluesky’s strike system will now assign content a severity rating, which will help dictate the enforcement action taken. For example, content flagged as a “critical risk” would result in a permanent ban, while content rated at a lower, medium, or higher severity would carry a correspondingly lighter or heavier penalty. And if an account racks up violations, the user could also risk a permanent ban instead of a temporary suspension.
Plus, the company says users will be notified when they’re the subject of an enforcement action with information about which Community Guideline they violated, the severity level assigned, their total violation count, how close they are to the next account-level action threshold, and the duration and end date of any suspension. Enforcement actions can also be appealed, the company said.
The changes also follow Bluesky’s rollout of updated Community Guidelines in October, as part of its broader focus on becoming more aggressive about moderation and enforcement on the platform.
But even as the company emphasizes its stricter rules, some Bluesky users remain upset that the company still permits a user who is widely criticized for his writing on trans issues to maintain his account on the platform. This controversy erupted again in October, when Bluesky CEO Jay Graber appeared to dismiss users’ criticism in a handful of posts.
At the root of the issue is how Bluesky wants to be perceived versus what it actually is today.
The company doesn’t want to be known as just a leftist or liberal version of Twitter; it wants to be a home where many different communities can build out their networks and thrive, without the problems of a centralized social network. However, many of the users who adopted Bluesky did so because they no longer felt represented on Twitter/X, which became more right-leaning under new owner Elon Musk.
In addition to wanting to shape its image, Bluesky has to balance its goals with a growing number of laws and regulations that require social platforms to protect their users from harm or face potentially severe consequences, like massive fines. For instance, earlier this year Bluesky blocked its service in Mississippi, saying it didn’t have the resources to meet the state’s age assurance law, which would fine the network up to $10,000 per user for noncompliance.
