Moderating communities of practice

Gemma Jamison is a Community Consultant with Higher Logic in Australia. In this role she works with many and varied online communities of practice (often for professional organisations), and helps those community custodians build safe and effective shared digital spaces.

We asked Gemma to share some of her insights and experiences around moderation practice.

Can you share an overview of the moderation work you do working with Higher Logic?

At Higher Logic we have historically assisted our clients with advice before and after community launch, plus hands-on support related to moderation and governance. This includes:

  • Walking through a community’s basic Terms of Use and Community Guidelines, and how they will be used to govern member participation, then facilitating deeper discussion of what other conditions a partner may want to include based on their industry, code of ethics and organisational culture;

  • Guiding our clients around our community platform’s moderation architecture, including:

    o   Moderation queue and how to manage it effectively;

    o   Email templates for handling moderation notices and our Three Strikes Policy framework;

    o   Watch words to help automatically hide or flag posts containing inappropriate or problematic content;

    o   Settings for flagging first-time posters and suspected trolling;

  • Consulting on governance procedures and dealing with moderation issues in the ‘grey’ such as:

    o   Escalation, ‘House Rules’ wording and staff training templates for organisations to employ in community launches and moderation scenarios;

    o   When to use more ‘black and white’ moderation approaches, and when it makes sense to handle matters in an alternative way;

    o   Managing moderation communications with members.
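The watch-word and first-time-poster mechanisms described above can be sketched roughly as follows. This is an illustrative sketch only, not Higher Logic's actual API: the `WATCH_WORDS` list and the `moderate_post` function are hypothetical names, and a real platform would apply far richer rules.

```python
import re

# Illustrative watch-word list; a real community would curate its own.
WATCH_WORDS = {"spamlink", "buynow", "offensiveterm"}

def moderate_post(text, is_first_post=False):
    """Return 'approve', 'flag', or 'hide' for a draft post.

    A hypothetical sketch of the watch-word idea: posts containing a
    watch word are hidden pending review, and posts from first-time
    posters are flagged so a moderator can take a quick look.
    """
    words = set(re.findall(r"[a-z']+", text.lower()))
    if words & WATCH_WORDS:
        return "hide"   # held in the moderation queue for review
    if is_first_post:
        return "flag"   # visible, but surfaced to moderators
    return "approve"

# Example usage
print(moderate_post("Great discussion, thanks!"))           # approve
print(moderate_post("Check this out: buynow today"))        # hide
print(moderate_post("Hello everyone", is_first_post=True))  # flag
```

The design choice mirrored here is that automation only triages: nothing is deleted outright, and flagged or hidden posts still reach a human moderator, which suits the 'in the grey' decisions discussed later in the interview.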


What are some common moderation challenges you experience with professional communities of practice?

Professional communities of practice tend to experience fewer of the common social media moderation challenges, due to the way these platforms are accessed and populated.

Creating a community on a secure, company-owned platform, commonly linked to an organisation’s member portal and accessed separately from platforms like Facebook and LinkedIn, creates clear boundaries. This encourages members to think carefully about what they post.

This means we tend to avoid most spamming, trolling and misinformation. Members are also usually quite conscious of the fact that they are amongst a group of their professional peers, encouraging them to be more mindful of how they come across and reducing the likelihood of issues like defamation or bullying.

However, since professional organisations are also beholden to their members (their members are why they exist, after all) it’s much harder to make black and white decisions about removing posts or members from a community when moderation concerns or problematic behaviour arise. Part of the Higher Logic consultant’s role as an expert in this niche is guiding our partners towards the right path forward: working out how to negotiate the boundaries of what a professional community space should and shouldn’t permit, e.g. discussing personal challenges at work, politics, or advertising your practice.

How would you explain the relationship between moderation and community culture?

I think this is a perfect example of where the pre-internet definition of moderation is relevant: to create a healthy relationship between moderation and community culture you must focus on ‘avoiding extremes of behavior or expression : observing reasonable limits’ (Merriam-Webster, 2023). The relationship between moderation and community culture in a professional space is a balancing act; how do you use the moderation tools and procedures you have at your disposal to make sure that all members get the chance to participate, whilst making sure that the extremes of opinion, personality and identity don’t take centre stage?

How do you find the right balance between an organisation’s policies, ethics codes and ways of communicating, whilst also realising that a community culture may require more informal communication and a relaxation of those policies and codes to be healthy?

This quest for balance is the challenge at the crux of moderation and community culture.

What do people most often misunderstand about online moderation?

The most common misconception is that most moderation issues will be ‘black and white’ e.g. ‘your post is being taken down because it is in violation of our Terms of Use.’ I can count on one hand the number of communities where I see these types of actions commonly arising (usually not more than a few times a year).

The majority of the scenarios I support our customers with require much more ‘in the grey’, sensitive, creative and thoughtful moderation work. For example, reading the subtext of why a member has asked a certain question, and contacting the member to talk through how we can help them rewrite their post. Our ultimate aim is to make sure that the post complies with the Terms of Use, but we also want to provide them with the chance to get peer-to-peer support with their challenge, keeping in mind the broader effect of the member’s question on building the community’s value and knowledge for the membership overall.

Another example: when a tricky discussion is developing surrounding criticism of the organisation itself. Do you respond in the community ‘publicly’ and use the opportunity to build trust with your members, or take the post down and talk through the matter with the member directly?

Thankfully having an owned, private platform makes these types of issues much easier to contain and deal with in a measured way without the pressure of a public social media platform to contend with.

What are your top tips for someone moderating a community of practice?

My three top tips are:

  • Create internal alignment. Make sure everyone on your team who deals with moderation is aware of your moderation policies and procedures, so you can act swiftly when an issue arises.

  • Don’t be scared of ‘moderating out loud’ and be transparent with your community when you can e.g. ‘Thank you so much for your feedback; we hear you, we are grateful that you have shared your feedback with us, and we want to find a way to resolve this that works for everybody. Here are the challenges to consider, what are your thoughts on ways this might be tackled in a better way in future?’

  • Think creatively about how to tackle bigger challenges that threaten the health of your community. For example, if you are seeing a huge amount of discussions around some tricky topics in your community, try creating a dedicated forum within your community to separate these conversations. This can help to preserve the balance of an existing community space, whilst allowing members who want to have these conversations to feel heard and continue the discussion in a setting which can be monitored effectively.

To learn about leading moderation issues and challenges in Australian online communities, download the State of Community Management Report.
