Response to ACCC report on digital platforms
The Australian Competition and Consumer Commission has released detailed recommendations after an 18-month inquiry into major digital platforms including Facebook and Google.
Australian Community Managers broadly welcomes the report and its recommendations as a positive step in our relationship with the platforms that scaffold our living and working lives.
Community professionals have long argued for platform accountability, transparency and agency over the tools available to us in our work. We can play a tangible role in addressing issues such as harmful content and behaviour, but only if we have the right equipment for the job.
In the 2019 Australian Community Manager survey, more than a third of community professionals said they'd like to discuss community management and moderation tools with platforms.
Respondents nominated four key areas of improvement they'd like to see from platforms:
• Consistency in response and outcome
• Responsiveness to ideas and issues raised
• Relevance of resources, e.g. case studies and guidance
• Transparency around roadmap and issue status
Less algorithm ‘chasing’ welcomed
We convene and manage many of our online communities within social networking platforms, which means we invest a considerable amount of time ‘algorithm chasing’ – trying to guess at constant, undisclosed changes to the way content and users are categorised, sorted and prioritised, then modifying our community management tactics and programs to suit each iteration.
This is time that would preferably be spent on community management practice and strategy for our members, customers, clients and stakeholders. It can be disempowering, demoralising work.
Improved access to broad algorithmic mechanics, and pre-emptive updates around substantive changes, would allow community professionals greater strategic agency.
Platform goals are not community goals
Algorithms on social networks have unintended consequences beyond amplifying harmful content and threatening competition.
Part of the work of community managers is balancing voices and content within the community, with an eye on community objectives and member experience. We carefully nurture voices that may need support to be heard and may modulate voices that are dominating non-constructively. We prioritise content that serves community goals and de-prioritise that which does not.
Social media architectures are optimised for social media platform profitability. Algorithms can amplify and reward voices that community managers may prefer to strategically soften, and can depress or hide voices that offer value in the context of the community's purpose. They may reward content against platform criteria of 'value' that are not aligned with community criteria or norms.
This creates extra work for community professionals in identifying and managing those aberrations, where that is even possible. In reality, community managers frequently sacrifice these aspects of their practice 'to the algorithm', as there are limited mechanisms to override platform preferences.
Secrecy makes our work harder
We proactively moderate for healthy culture within our communities, atop the moderation that platform operators themselves commission. Even if the latter were being performed thoughtfully (which we know it is not), community managers work daily on cultural mediation via moderation and facilitation tactics, with the aim of creating healthy, safe spaces that can meet the objectives of members and the sponsoring business or organisation.
When moderation frameworks and approaches collide, this can create collateral damage for our communities, and social ripples we need to ‘massage out’ to move forward. For example, a public community of new parents is discussing breastfeeding tips. Some images in the discussion are removed by Facebook; others are not. A community manager needs to calm tensions and tempers, unpack what happened and decide on a course of action around the discussion in the best interests of the overall community.
When platform mechanics and positions are chaotic, invisible or deliberately obscured, our work is harder.
Time to revisit owned communities?
Owned communities – hosted on websites or dedicated community platforms – have remained popular since the early days of online communities in the 1990s. There have been impressive advances in community-focused platform offerings, with solutions to suit a range of budgets, capabilities and intentions.
This fork in the road is an ideal time for an organisation thinking about building an online community to explore options outside the platform monopolies and their algorithmic agendas.
Owned communities offer a number of strategic advantages over social media; among them context, control, safety, data and discovery. Additionally, the many challenges in the social media landscape – such as those called out in the ACCC report – can make it a volatile and high-risk investment.
The Australian market is ripe with opportunities for community building outside the troubled terrain of social media. Perhaps your organisation is ready to be a pioneer.
Communities need management
No matter where a community is built – whether it is nurtured strategically or forged organically – it needs community management to meet its goals and endure over time.
The ACCC recommendations are a welcome salvo in the direction of platform accountability and a step toward the kind of 'new constitutionalism' described by Professor Nic Suzor.
In reality, platforms are the macro-community manager for the vast array of communities on their infrastructure, though most would prefer not to be. Their choices of when and where to act, of what governance to embed or eschew, have a collective impact on communities and their management.
Australian Community Managers – who serve as cultural intermediaries and brokers across our platforms – are enthusiastic about contributing to more frequent and more robust conversations around the topics of the ACCC inquiry.