Beyond the ban-hammer: prosocial moderation
At the 2025 All Things in Moderation (ATIM) conference, Serena Snoad closed out Day One with a deeply thoughtful and resonant session on the case for prosocial moderation — and how it can build healthier, more sustainable communities.
Serena, founder of UK-based consultancy Good Community, drew on her decades of experience leading community teams across nonprofit, therapeutic and peer-support contexts to outline an approach to moderation that prioritises context, care, and constructive behaviour over punishment and exclusion.
Prosocial moderation isn’t soft. It’s not about avoiding action; it’s about thoughtful, responsive action that balances safety and support.
Reframing moderation
Serena challenged the outdated stereotype of moderators as rule enforcers with ban-hammers in hand. Instead, she offered a more human (and more accurate) framing: moderators as guides, educators and stewards of community culture.
Our moderators aren’t just law enforcement. They welcome new members, offer support, model tone, and help people learn. Enforcement is just one hat we wear.
By shifting the purpose of moderation from punishment to cultural care, Serena argued, we open up space for learning, trust-building and resilience within our communities.
Why punitive approaches fail
Through case studies of members from support communities, including a participant with dementia and another reliving past trauma, Serena showed how traditional “three-strikes” approaches often fail the very people who need community most. Simplistic moderation models that ignore personal context, she argued, risk creating more harm than good.
“If the only tool you have is a hammer, everything looks like a nail. But we’re not managing nails, we’re engaging with people.”
Prosocial moderation requires a mindset shift and reframing for community managers. (Image: Serena Snoad/Good Community)
Gears of prosocial moderation
A key takeaway was Serena’s three gears of prosocial moderation, which offer a practical roadmap for shifting moderation practices:
Empathise – Understand the member’s context and perspective.
Educate – Offer clarity on norms, values, and reasons for decisions.
Enforce – Take action when safety (individual or community) is at risk, but do so fairly and transparently.
This layered approach not only protects communities but also fosters growth, trust and long-term health.
Small shifts, big impact
For teams wanting to adopt a prosocial moderation model, Serena recommended small, manageable changes:
Update internal policies to reflect a balance of safety and support.
Adapt community guidelines to explain the why behind rules and reinforce transparency.
Equip moderators with resources to support different roles — from empathic communication to de-escalation techniques.
Provide debriefs and care support for moderators to reduce burnout and vicarious trauma.
Very few situations in moderation are true emergencies. Give yourself permission to pause, reflect, and respond — not just react.
A case for change
As Serena pointed out, prosocial moderation aligns with both corporate social responsibility and risk management goals. For teams in commercial or regulated environments, it’s a model that not only supports healthier communities, but also protects brand reputation and employee wellbeing.
Moderators deserve to be proud of their work. They are trust-builders, caretakers, and culture stewards — and they make digital life safer for all of us.
Serena’s session was a powerful reminder of the ethical core of community management work — and a call to action for community managers, moderators and decision-makers to reimagine what good governance really looks like online.
Missed it? Be sure to join us for All Things in Moderation 2026 to learn from more experts like Serena and explore the future of moderation and community building. Check out Serena’s work directly at Good Community.