Building digital havens for young people

In an era of rage-bait algorithms, polarised feeds, and mounting youth online safety concerns, the question of where and how young people can safely gather online has never been more urgent or more complex.

At All Things in Moderation (ATIM 2025), veteran community strategist Todd Nielsen hosted three community experts shaping youth-safe digital spaces to explore what these look and feel like, beyond social media feeds.

  • Lori Fahey (Livewire): leads moderators and community builders in stewarding playful, affirming online spaces for young people living with chronic illness.

  • Dan de Sousa (Quiip): community manager with expertise in high-risk and vulnerable user groups across different platforms.

  • Megan Jacobs (The Truth Initiative): community and behaviour change strategist behind the US-based EX community for tobacco cessation.

Together, they discussed how to build, sustain, and evolve digital spaces that meet young people where they are, with care, clarity, and courage.

Read more: Reclaiming Online Communities for Young People

Safety for young people online is dynamic, context-aware, and co-constructed. [Image: Pexels]

Not just rules, but relationships

One of their first provocations? Redefining safety. The panel discussed the importance of psychological, emotional and cultural safety, as both preconditions for community to form, and characteristics of healthy community over time.

For Dan, safety is less about top-down enforcement and more about “emotional stewardship” - helping young people build social skills and navigate online culture with empathy. Lori noted how Livewire’s safety journey has shifted from a ‘cyber-stranger-danger’ approach to supporting young people facing mental health struggles and trauma in culturally and socially embedded ways. Megan, who works with younger and mature-age users in shared spaces, offered a reminder that overly rigid access restrictions can unintentionally signal someone’s age and invite different types of risk.

The panellists agreed that safety is dynamic, context-aware, and co-constructed. When done well, it’s often invisible (not because it isn’t there, but because it’s embedded into the culture).

Culture > Policy

Technical safeguards are necessary scaffolding for online spaces for young people, but they aren’t sufficient on their own. The real magic is in cultural cues.

  • At Livewire, inclusive design starts from the homepage, with symbols of welcome and pride, and symbols that connect to youth culture.

  • At Truth Initiative’s EX community, older users are rewarded for modelling respectful behaviour and helping younger users feel seen, not surveilled.

  • Dan championed community managers as anchors, at the centre, not the top, helping shape norms and encouraging self-moderation over time.

In short, "designing for kindness" works. As Lori put it, “Behaviour is communication, and when we listen well, we can guide that communication toward shared care.”

Inclusion doesn’t mean isolation

Should young people always have separate online spaces?

Not necessarily. When the Truth Initiative piloted a youth-only Discord server, they found something surprising: the absence of older, experienced quitters left many young people unsupported. “It was like the blind leading the blind,” Megan noted. By contrast, the multi-generational EX community offered richer stories, encouragement, and stability - especially for younger users without support at home.

Lori offered a contrasting example: youth-exclusive spaces for those facing specific vulnerabilities, such as chronically ill or geographically isolated young people. In those cases, the boundaries offered localised context and respected young people’s need to disclose and gather with discretion and agency. Even in this space, Livewire’s model includes designed pathways for older peers to age out, offering mentorship and other transitional opportunities.

Segment where it serves, blend where it benefits.

Moderation as supportive structure

Whether it’s managing grief in a close-knit community or gently guiding someone away from learned trolling behaviours, the labour of moderation was characterised by our panellists as both skilled and sacred.

Megan shared a few practical tools used at The Truth Initiative:

  • Pre-moderation settings for vulnerable users

  • Private blogging options for emotional expression

  • A human-centred approach to any technical or automated moderation

Lori noted how Livewire hosts regular “community chat” events where members co-create rules, promoting ownership and accountability. Dan reminded us that young people are more emotionally literate than ever. What they need isn’t surveillance, but supportive structure, clarity, and space to grow.

Asked to distill their wisdom, the panellists offered some practical mantras:

  • Listen deeply.

  • Let young people help define the rules.

  • Model the behaviour you want to see.

  • Be authentic. Don’t pretend to be one of them.

  • Moderate with empathy, not just enforcement.

As the ATIM chat lit up with gratitude and bookmarks, one thing was clear: safe online spaces for young people aren’t just possible, they already exist. What they need now is recognition, resourcing, and replication.

Missed it? Be sure to join us for All Things in Moderation 2026 - the annual gathering for humans who moderate and build online community.

Venessa Paech

Venessa is Director and Co-Founder of ACM.

http://www.venessapaech.com