Bluesky on Friday published its moderation report for the past year, noting the sizable growth the social network experienced in 2024 and how that affected its Trust & Safety team's workload. It also noted that the largest number of reports came from users flagging accounts or posts for harassment, trolling, or intolerance — an issue that has plagued Bluesky as it's grown, and that has at times led to wide-scale protests over individual moderation decisions.
The company's report did not explain why it took, or declined to take, action against individual users, including those on the most-blocked list.
The company added over 23 million users in 2024 as Bluesky became a new destination for former Twitter/X users. Throughout the year, the social network benefited from several changes at X, including its decisions to change how blocking works and to train AI on user data. Other users left X after the U.S. presidential election, as owner Elon Musk's politics began to dominate the platform. The app also saw a surge in users while X was temporarily banned in Brazil in September.
To meet the demands of this growth, Bluesky increased its moderation team to roughly 100 moderators, it said, and is continuing to hire. The company also began offering team members psychological counseling to help them cope with the difficult job of constant exposure to graphic content. (An area we hope AI will one day address, as humans are not built to handle this type of work.)
In total, there were 6.48 million reports to Bluesky's moderation service in 2024, up 17x from 2023, when there were just 358,000 reports.
Starting this year, Bluesky will begin accepting moderation reports directly in its app, which, as on X, will let users more easily track actions and updates. Later, it will support in-app appeals as well.
When Brazilian users flooded into Bluesky in August, the company saw as many as 50,000 reports per day at the peak. This created a backlog of moderation reports and required Bluesky to hire more Portuguese-speaking staff, including through a contract vendor.
In addition, Bluesky began automating more categories of reports beyond spam to help it handle the influx, though this sometimes led to false positives. Still, automation dropped processing time to just "seconds" for "high-certainty" accounts; before automation, most reports were handled within 40 minutes. Human moderators now remain in the loop to address false positives and appeals, even when they don't make the initial decision.