Breaking Systems, Building Safety: How One Gamer Hacker Is Reshaping Trust in Games

Podcasts

August 12, 2025

From Chaos Gremlin to Trust Architect

Christina Camilleri didn’t just grow up playing games. She lived in them. Raised by a single mom working three jobs, Christina found companionship, identity, and her earliest online communities in titles like MapleStory, WoW, and RuneScape. She wasn’t just grinding levels — she was figuring out how systems worked, where they broke, and how people behaved inside them.

It’s no coincidence she grew into an ethical hacker, a social engineer, and now the head of Trust and Safety product strategy at Netflix Games. Christina’s story isn’t just about protecting players. It’s about the deeply human relationship between play, design, and the systems that shape us.

🎮 Games Shape Us, But We Shape Them Too

“Games were my second home,” Christina recalls. “They helped me find people like me. But they also exposed me to risk.”

Her early experiences were a contradiction — communities that offered belonging but also vulnerability. That duality now fuels her approach to safety by design. It means intentionally shaping systems to prevent harm before it happens.

She’s not out to sanitize games. In fact, Christina believes tension, rivalry, and chaos are essential to good gameplay. The goal is nuance — preserving emotional intensity without enabling cruelty. It’s a balancing act studios often ignore until it’s too late.

🧠 “Not all intense or competitive interactions are toxic. But systems should support rivalry without enabling real-world harm.”

🛠️ Safety Isn’t a Checklist. It’s a Design Pillar.

At BAE, Christina learned how to break into highly secure systems. At Riot, she shifted from finding flaws to fixing them. But too often, even when she flagged critical vulnerabilities, companies let the findings pile up like laundry they’d never wash.

That’s when it clicked. She didn’t want to report problems. She wanted to solve them.

At Riot, she investigated high-sensitivity threats, often mapping player behavior patterns that escalated into real-life consequences. But more importantly, she began exploring how safety tooling could become proactive, not just reactive.

She emphasizes that developers must go deeper than blocking and reporting. It’s about anticipating harmful dynamics — especially when you throw low-trust users into high-trust systems like competitive team games.
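To make that concrete, here is a minimal sketch of what a trust-gated entry check for a competitive mode might look like. Everything in it, the signals, the weights, the threshold, is invented for illustration; it is not drawn from Riot’s or anyone else’s actual system.

```typescript
// Hypothetical sketch: gating low-trust accounts out of high-trust modes.
// All names (PlayerTrust, canEnterRankedTeams) and numbers are illustrative.

interface PlayerTrust {
  accountAgeDays: number;      // newer accounts start with less earned trust
  recentValidReports: number;  // reports against the player that moderators upheld
  commendations: number;       // positive signals from teammates
}

// A simple trust score: positive signals add, upheld reports subtract heavily.
function trustScore(p: PlayerTrust): number {
  const tenure = Math.min(p.accountAgeDays / 30, 12); // cap tenure credit at a year
  return tenure + p.commendations * 0.5 - p.recentValidReports * 3;
}

// High-trust systems (competitive team modes) require a minimum score;
// players below it are routed to modes with tighter communication defaults.
function canEnterRankedTeams(p: PlayerTrust): boolean {
  return trustScore(p) >= 5;
}
```

The exact formula matters far less than the design choice it encodes: trust is earned over time and lost quickly, and the most socially demanding modes are not the default landing spot for unknown accounts.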

💸 Safety by Design Equals Retention by Design

For studios that still see safety as overhead, Christina has a counterpunch: retention and revenue depend on it.

In one study she cited:

  • 65% of players would quit a game after harmful interactions

  • 61% would reduce their in-game spending following negative behavior

And the inverse? Generosity is contagious. Games like Sky: Children of the Light saw increased engagement and player loyalty when pro-social actions were visible and rewarded.

💥 “Safe games are sticky games.”

Safety and monetization aren’t at odds. They’re partners in long-term growth.

🔧 Don’t Just Add Controls. Make Them Meaningful.

Christina shared case studies where safety tooling failed because it felt hollow. Players would mute, block, or report — but nothing happened. No response, no feedback, no change.

She highlights Overwatch’s approach as a model. When action is taken based on a player’s report, the reporter gets notified. That small feedback loop builds trust. It makes players feel heard and signals that behavior standards are actually enforced.
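A minimal sketch of that feedback loop, using made-up names (Report, notifyPlayer, onEnforcementAction) rather than anything from Blizzard’s actual pipeline:

```typescript
// Hypothetical sketch of the loop described above: when moderation acts on a
// report, everyone who filed that report gets told. Names are illustrative.

interface Report {
  reportId: string;
  reporterId: string;
  targetId: string;
}

// Stand-in for whatever messaging system a game actually uses.
function notifyPlayer(playerId: string, message: string): void {
  console.log(`[to ${playerId}] ${message}`);
}

// Close the loop: once an enforcement action lands, thank each reporter.
// The message confirms action was taken without exposing penalty details.
function onEnforcementAction(targetId: string, openReports: Report[]): void {
  for (const report of openReports.filter(r => r.targetId === targetId)) {
    notifyPlayer(
      report.reporterId,
      "Thanks for your report. Action was taken against a player you reported."
    );
  }
}
```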

She also warns of systems being turned into weapons. Without safeguards, features like mass reporting or open voice chat can be exploited by bad actors. It’s not just about designing tools. It’s about testing how people will abuse them.
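One way to blunt mass-report brigading is to weight reports by the reporter’s track record instead of counting raw volume. The sketch below is a hypothetical illustration of that idea, not a description of any shipped system:

```typescript
// Hypothetical safeguard against weaponized mass reporting: weight each
// report by reporter credibility. Names and numbers are illustrative.

interface ReporterHistory {
  reportsFiled: number;
  reportsUpheld: number; // how many of their past reports moderators confirmed
}

// A reporter whose past reports are usually upheld counts for more;
// a brand-new or serially-wrong reporter counts for very little.
function reporterWeight(h: ReporterHistory): number {
  if (h.reportsFiled === 0) return 0.2; // unknown reporters get a small default weight
  return Math.max(0.1, h.reportsUpheld / h.reportsFiled);
}

// A coordinated brigade of low-credibility accounts now needs far more
// reports to trip the review threshold than a handful of trusted players.
function weightedReportScore(reporters: ReporterHistory[]): number {
  return reporters.reduce((sum, h) => sum + reporterWeight(h), 0);
}
```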

🧩 Accessibility Through Safety

One of Christina’s final takeaways was deceptively simple. Safety is accessibility.

Players from marginalized communities — or anyone new to a game — need more than controls. They need control.

Roblox, Fortnite, and Destiny 2 offer powerful examples of opt-in systems, from ID-verified voice chat to granular privacy settings. These aren’t just good features. They’re signals. They tell the player: you matter here. Your experience is yours to shape.
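As a hypothetical illustration of what “granular and opt-in” can mean in practice, here is a small settings shape with safe defaults. None of the field names or defaults mirror Roblox, Fortnite, or Destiny 2 specifically:

```typescript
// Hypothetical sketch of granular, opt-in privacy controls with safe defaults.
// The shape and field names are invented for illustration.

type Audience = "everyone" | "friends" | "nobody";

interface PrivacySettings {
  voiceChat: Audience;
  textChat: Audience;
  gameInvites: Audience;
  showOnlineStatus: boolean;
}

// Safe by default: players opt *in* to wider exposure rather than
// having to discover settings to opt out of it.
const defaultPrivacy: PrivacySettings = {
  voiceChat: "friends",
  textChat: "friends",
  gameInvites: "friends",
  showOnlineStatus: false,
};

// Widening any channel is an explicit, per-channel player choice.
function optIn(
  settings: PrivacySettings,
  channel: "voiceChat" | "textChat" | "gameInvites"
): PrivacySettings {
  const next = { ...settings };
  next[channel] = "everyone";
  return next;
}
```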

And that’s what safety really is. Not an afterthought, but a commitment baked into every interaction.

🔎 Callout Box: Spicy POV

🗣️ “When players understand what is and isn’t acceptable, and you tell them proactively, they’re more likely to change. Most of the time, they just didn’t know.”

🎯 Takeaway for Studio Leaders

Safety is no longer optional. In a hyper-connected, live-service world, how you manage risk, identity, and player dynamics is a core part of your product. Christina’s work reminds us that thoughtful systems design isn’t about playing defense. It’s about building worlds people want to stay in.

✅ Ask yourself:

  • Where are you still patching instead of designing?

  • Do your controls feel real to players, or merely performative?

  • What part of your player experience assumes trust without earning it?

Start there.

