The Facebook Insider Building Content Moderation for The AI Era
AI content moderation startup Moonbounce raises $12M to bring real-time, policy-driven safety to AI chatbots and image generators.
Matilda
AI Content Moderation Is Broken — This Startup Is Fixing It

A former Facebook insider has raised $12 million to solve one of the most persistent problems in tech: AI content moderation that actually works in real time. Moonbounce, a safety infrastructure startup, is now processing over 40 million daily content reviews across AI chatbots, image generators, and dating platforms — and its founders believe safety can be a product feature, not just an afterthought.

Why AI Content Moderation Has Always Been a Coin Flip

Content moderation at scale has never been solved cleanly. When Brett Levenson joined a major social media platform in 2019 to lead business integrity operations, he expected to find a technology problem. What he found was far messier.

Human reviewers were handed a 40-page policy document — machine-translated into their language — and given roughly 30 seconds to assess flagged content. They had to decide not just whether something violated the rules, but what action to take: remo…