The Future of Your Community: Chatting with Sharon Fisher About AI and Trust & Safety

Podcasts

March 3, 2026

Greg Posner

If you’ve been following the show for a while, you know that we don’t just talk about game mechanics or the latest hardware. We talk about the soul of gaming: the players. But with great communities come great responsibilities—specifically, how do we keep these spaces safe without stripping away the fun? I recently sat down with a long-time friend of the show, Sharon Fisher. Sharon is a powerhouse in the Trust and Safety (T&S) world. She’s been in the trenches since the early days of Club Penguin, moved through the ranks at Two Hat and Keywords Studios, and is now consulting with cutting-edge AI platforms like Checkstep.

We took a deep dive into where the industry is heading in 2026, and honestly? It’s a bit of the "Wild Wild West" out there. If you’re a studio head, a moderator, or even just a concerned parent, you’ll want to hear what she has to say about the collision of AI and human empathy.

From Club Penguin to the "Wild West" of 2026

It’s wild to think that 17 years ago, "moderation" mostly meant manually blocking the f-word. Sharon reminded me that when Club Penguin started, the goal was simple: give kids a place to express themselves. But as the internet grew, so did its shadow side, and the risks that came with it.

Back then, we relied on pre-made phrases and a ton of manual labor. Fast forward to today, and the market is exploding with new tools every single day. But here’s the kicker: as legacy companies phase out or get acquired, a new generation of AI-first tools is stepping in to handle a level of complexity we couldn’t have imagined a decade ago.

Is the Human Moderator Going Extinct?

This is the million-dollar question, or as Sharon says, the billion-dollar question. With AI getting so good, do we still need people?

Sharon’s take is a relief for those of us who value the human touch. She believes the role isn't disappearing; it’s evolving. Moderators are becoming data analysts. AI can handle the "yes/no" volume, but it can’t always understand the nuance of pop culture or the specific "vibe" of a community.

Think about it:

  • Cultural Context: AI might miss a joke that’s trending on TikTok today but wasn't in the training data yesterday.
  • Bias Detection: Humans are still the ultimate check against the biases that can creep into a machine-learning model.
  • Community Soul: A moderator who lives and breathes the game's culture provides value that automation just can’t replicate.

The Silo Problem: Breaking Down the Walls

One of the things that really got me excited during our chat was Sharon’s point about how siloed big gaming companies can be. You’ve got Customer Support (CS) doing great work in one corner and Moderation in another, but they rarely talk to each other.

New tools like Checkstep are changing that by bringing image, voice, text, and CS data into a single platform. Imagine being able to understand your user’s journey across every touchpoint. It’s not just for safety; it’s a goldmine for marketing and quality control too.

A Message to Parents: Digital "Stranger Danger"

Sharon isn’t just an industry expert; she’s a mom. And she’s worried. We talked a lot about the gap in education for parents and caregivers.

Her advice for the "playground" of the internet is something every parent needs to hear: "Know where your kid is playing." Just like you wouldn't let your child wander into a sketchy neighborhood at night, you shouldn't let them roam unmonitored in digital spaces without understanding who else is there.

She warns that "stranger danger" has evolved. Predators often use links to pull kids out of a safe, moderated game environment and into unmoderated apps like WhatsApp. We need to teach kids that unless they know someone in real life, that "friend" online is still a stranger.

Looking Ahead: Predictions for 2026 and 2027

As we look toward the rest of 2026, Sharon predicts a period of high stress. Companies are slashing costs and rushing to adopt AI, sometimes dropping tools as fast as they pick them up when things don't work perfectly.

But she’s hopeful for 2027. She thinks that after the "AI gold rush," we’ll find a balance. A world where we use the right tools to handle the data, but humans stay in the loop to make sure the internet remains a place where our kids (and we!) actually want to spend time.
