How do we get people to care about moderation?

Ben Whitelaw, Everything In Moderation

With public trust in platforms declining, moderation is often misunderstood. So how do we get ordinary people to understand and care about it?

Ben Whitelaw is a media strategist who runs Everything in Moderation, a weekly newsletter about content moderation and online safety. He joined moderation and policy experts Kat Lo, Andrew Losowsky, and Laure Cast at the inaugural All Things in Moderation conference to unpack how we can create public support for moderation.

Why aren't people engaging with moderation?

Lack of transparency:

  • Users may not feel able to advocate for themselves due to a lack of transparency from major platforms about moderation practices.

  • Without the tools to understand and make appropriate critiques of moderation, users or community members will find it hard to communicate with platforms in a meaningful way.

Lack of trust and safety:

  • Dominant platforms are viewed by the public as extractive, leading to a distrustful relationship with platforms and moderation.

  • Within this environment, people may feel that they lack agency: that they have little impact on platforms or moderation policy.

  • Our online spaces have also developed difficult, polarised dynamics in which users feel they must choose between expressing themselves freely and being safe.

How do we build trust in platforms?

  • Give users a sense of agency. We can ask users to care, but we must also create systems in which they can see their commitment to moderation having a tangible impact. Consider different pathways through which members can reliably have an impact on your community, such as donating or voting, and make these clear to users.

  • Think beyond your community. Online communities don’t exist in a vacuum. What is your brand or organisation’s overall engagement with the general public like? If the wider community does not trust your platform, it will be difficult to create trust on a smaller scale.

  • Don’t make unrealistic promises. We can’t always guarantee safety in our shared online spaces. Instead, consider: What kind of space are we trying to create? What will happen if standards are breached, and how can we support user wellbeing if this occurs? Who will be involved? Communicate these safety processes to the members of your shared space.

  • Set your space up for success. Just as you would when hosting an in-person conversation, put parameters and safeguards in place early. Invest in onboarding, advocate for guardrails and set norms early in your social space, whether it’s a community, an audience, or any other type of gathering. Consider: What are our responsibilities? Do we need to share these responsibilities among volunteers, or do we need professionals to create a framework for our space first?

  • Create accessible and specific transparency reports. Doing so will help users get a sense of what you’re really doing. For example, if you want to report how much hate speech has been removed from your space, it may be helpful to break down those instances of hate speech by the identity targeted. Collect, translate and make these reports accessible so that researchers, legislators and the general public can better understand and advocate for moderation.

Read Ben Whitelaw’s interview with ACM's Venessa Paech on Everything in Moderation

Does it matter if people understand moderation?

If users do not have a sense of trust and safety in platforms and moderation, they may not report issues with moderation practices, or worse, may quietly leave digital spaces that would benefit from their presence.

Anyone making policy or legislation about our shared digital social lives needs, at minimum, to understand that moderation is a critical and complex practice that can make or break their regulatory visions.

If ordinary people understand moderation, we can harness their feedback to build better digital spaces and more cohesive communities.

All Things in Moderation ran online 11-12 May 2023 and featured over 25 expert contributors from around the world and across practitioner, academic and policy disciplines.
