Spring 2025

05: AI's Hard Truth

Supercharged: Insights from Keywords Studios

Welcome back to Supercharged: Insights from Keywords Studios, our quarterly newsletter.

This edition of Supercharged confronts some uncomfortable truths about the use of AI in the industry and explores why integration so often fails to deliver meaningful results, shifting the blame from the models themselves to the production infrastructure behind the scenes, or the lack thereof. From the four critical gaps that kill AI pilots before they reach scale, to the shift from reactive support tickets to proactive player engagement, the studios finding real success with AI share one thing in common: they stopped chasing the "wow" moment and started engineering reliable workflows. The technology is only as good as the plumbing behind it.

The Uncomfortable Truth about AI: Why 90% of Game Developers Use AI, but 52% Hate It

From Jon Gibson, Global Head of Transformation at Keywords Studios and Tara Phillips, Art Service Line Director at Keywords Studios

I have a confession to make: I am not an "AI person." I don’t build LLMs, I don’t write prompts for a living, and I’d have to Google the difference between a transformer and a diffusion model.

What I am is a game maker. I’ve spent 30 years in the trenches - running studios, shipping titles, missing milestones, and living inside production pipelines. Today, my job is corporate transformation at Keywords Studios. My focus isn't on whether AI is "impressive" - it clearly is - but whether it can actually survive contact with a real-world production pipeline.

Right now, the industry is living through a bizarre paradox. Research from Google Cloud and The Harris Poll shows that 90% of game developers are using AI in their workflows. Yet, according to GDC’s State of the Industry report, 52% of professionals believe generative AI is having a negative impact.

How can 90% of us use a tool that more than half of us distrust?

The answer isn't about the technology. It’s about the "plumbing" - the boring, unsexy production infrastructure that determines whether a tool is a breakthrough or just "gameslop."

The "Box of Chocolates" Problem

Generative AI is non-deterministic, brittle, and notoriously difficult to control. Every time you hit "generate," it’s like opening a box of chocolates. You might get a caramel delight; you might get coconut.

In the concepting or brainstorming phase, surprise is a feature. But in production, surprise is a defect. If a client orders a hundred specific coffee nougats by Thursday, "prompting and praying" isn't a strategy - it's a liability.

Most AI tools today are optimised for the "wow" moment of a single-user demo. But real production requires:

  • Repeatability: Getting the same quality every time.
  • Quality Gates: Knowing what "good" looks like before you start.
  • Audit Trails: Knowing exactly where data came from and who approved it.

Without these, we don't get games; we get "gameslop" - the low-effort, low-quality noise that has already forced platforms like Steam to implement stricter curation.
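The three requirements above can be sketched in code. The following is a minimal illustration, not a real pipeline: `generate_with_audit`, the quality gate, and the model call are all hypothetical stand-ins, with a hashed string standing in for a generated asset.

```python
"""Sketch of a production wrapper around a generative step: repeatability,
quality gates, and audit trails expressed as code. All names here are
illustrative, not a real tool's API."""
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AuditRecord:
    prompt: str
    seed: int          # a fixed seed makes the same prompt repeatable
    model_version: str
    passed_gate: bool
    approved_by: str
    timestamp: str

def passes_quality_gate(asset: str, min_length: int = 10) -> bool:
    """Quality gate: 'good' is defined BEFORE generation runs.
    A real gate would check style guides, palettes, poly counts, etc."""
    return len(asset) >= min_length

def generate_with_audit(prompt: str, seed: int, reviewer: str) -> AuditRecord:
    # Stand-in for a model call; hashing prompt+seed is deterministic,
    # so the same inputs always yield the same "asset".
    asset = hashlib.sha256(f"{prompt}:{seed}".encode()).hexdigest()
    return AuditRecord(
        prompt=prompt,
        seed=seed,
        model_version="model-v1",   # hypothetical version tag
        passed_gate=passes_quality_gate(asset),
        approved_by=reviewer,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

record = generate_with_audit("coffee nougat, hero shot", seed=42, reviewer="art_lead")
print(json.dumps(asdict(record), indent=2))
```

The point of the sketch is that every generation leaves a record: what was asked, with which seed and model version, whether it cleared the gate, and who signed off.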

Where AI Pilots Go to Die

When we look at AI adoption across dozens of global studios, we see a recurring pattern. Pilots don't fail because the models are weak; they fail because of the gaps around them:

  1. The Handoff Gap: A brilliant proof-of-concept is built, the lead gets promoted, and the tool dies because no one was assigned to maintain or scale it.
  2. The Integration Gap: The AI doesn't talk to Perforce, Shotgrid, or GitHub. Artists end up manually moving files back and forth, eating up the time the AI was supposed to save.
  3. The Governance Gap: Legal steps in at the eleventh hour because no one vetted the IP status of the training data.
  4. The Culture Gap: If an Art Director doesn't trust the output, or a team feels threatened by the "black box," adoption stalls regardless of how good the tech is.

 

From "Prompt and Pray" to "Steer and Iterate"

The studios winning with AI right now aren't the ones with the "best" models - everyone has access to the same LLMs. The winners are the ones building a Production Framework.

This means shifting from a "magic wand" mindset to an engineering mindset. In a production-ready workflow, the AI generates, but a human steers, edits, and validates. We aren't replacing artists; we are creating new roles for people who can orchestrate these complex, multi-step pipelines.

As Larian Studios’ Swen Vincke recently noted, using AI hasn't stopped them from hiring more artists. It has simply changed the shape of the work.
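The "steer and iterate" workflow can be sketched as a simple loop, assuming hypothetical `generate` and `review` callables: the model proposes, the reviewer's notes steer the next attempt, and nothing ships without validation.

```python
"""Minimal 'steer and iterate' loop. The generate/review callables are
hypothetical stand-ins for a model call and a human review step."""
from typing import Callable, Optional

def steer_and_iterate(
    generate: Callable[[str], str],          # model call: brief -> draft
    review: Callable[[str], Optional[str]],  # review: draft -> notes, or None if approved
    brief: str,
    max_rounds: int = 3,
) -> Optional[str]:
    prompt = brief
    for _ in range(max_rounds):
        draft = generate(prompt)
        notes = review(draft)
        if notes is None:            # human validated: safe to ship
            return draft
        # Steer, don't pray: feed the reviewer's notes into the next attempt.
        prompt = f"{brief}\nRevision notes: {notes}"
    return None                      # escalate: never auto-ship a failed draft

# Toy example: the reviewer approves only drafts that mention "nougat".
result = steer_and_iterate(
    generate=lambda p: "coffee nougat" if "nougat" in p else "caramel",
    review=lambda d: None if "nougat" in d else "needs nougat",
    brief="chocolate concept",
)
print(result)  # → coffee nougat
```

The design choice is the `None` return: when the loop exhausts its rounds, the output escalates to a person rather than shipping unreviewed.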

The Path Forward: Five Steps for Monday Morning

If you want to move beyond the "chaos phase" of AI, stop looking at the models and start looking at your workflow.

  1. Start with the bottleneck, not the tool. Don't buy an AI tool and look for a problem; map your pipeline and find the tool that fixes a specific delay.
  2. Define quality gates first. If you can’t describe the standard, the AI certainly can’t hit it.
  3. Treat AI selection like vendor management. Vet tools for IP safety and pipeline compatibility before they touch your build.
  4. Keep humans at every decision point. This isn't about slowing down; it's about catching the "coconut chocolates" before they ship to players.
  5. Build governance from day one. Retrofitting policies onto a live workflow is ten times harder than building them in at the start.

Conclusion: Chaos into Capability

We’ve been here before. We saw it with real-time 3D in the 90s, with motion capture, and with live services. Each time, a revolutionary technology arrived, followed by a period of chaos, which ended only when the industry built the production infrastructure to manage it.

AI is extraordinary, but production is hard. The most exciting unsolved problem in game development isn't building a better model - it’s building the production layer that turns AI from a box of random chocolates into a reliable engine for creativity.

Helpshift Reimagined: Our Evolution into AI-Native Player Engagement

Proving AI Can Produce Results: The Evolution from Support Tickets to Player Engagement

By Rakesh Mistry, Head of Product at Helpshift and Erik Ashby, Senior Director - R&D at Helpshift

In the modern gaming landscape, the traditional model of player support is failing developers and players alike. As we discussed this year at GDC 2026, the industry is at a critical juncture where reactive support must evolve into proactive player engagement. By leveraging generative AI, we can finally bridge the gap between efficiency and high-quality player experiences.

The Cost of Disengagement

Player disengagement is driven by several key factors that directly impact a game's bottom line:

  • Excessive Advertising: One-third of players quit new games due to ads, and nearly half (47%) abandon their regular games for the same reason.
  • Poor Support Frustration: Long wait times lead to churn for 60% of players, while 49% will quit after receiving an unhelpful response.
  • Toxic Communities: 67% of players are more willing to continue spending if they see swift action taken against toxic behaviour.

Conversely, excellent support is a primary driver of growth. 78% of players are likely to recommend a game - and spend more - if it offers speedy, high-quality support.

The Evolution: From Reactive to Proactive

Helpshift’s mission is to power millions of conversations across billions of devices, moving beyond simple ticket resolution. Our framework for this evolution involves three distinct stages:

Stage | Focus | Experience
Player Support | Handling support tickets for customer care channels. | Reactive: experiences are built for a reactive customer journey.
Bots & Automation | Deterministic workflow automations for specific nodes in the journey. | Efficiency-driven: focused on cost reduction through basic automation.
Player Engagement | Creating dynamic conversations across the entire player experience. | Proactive: behaviour insights drive decisions to minimize disengagement before it happens.

Introducing the Care AI Agent

The centerpiece of this evolution is the Care AI Agent, an agentic, AI-powered assistant designed to resolve inquiries directly in-game. Unlike traditional bots, Care AI is trained on specific game knowledge and has access to player context.

The results from our beta partners, such as Huuuge Games, demonstrate the strategic value of this approach:

  • High Satisfaction: An average 4.2 CSAT due to rich, natural, and personalized responses.
  • Out-of-the-Box Efficiency: A +17% improvement over traditional automation and a 2x reduction in contact reopen rates, meaning twice as many players got back into the game.
  • Scale and Resolution: Care AI can instantly resolve over 70% of player queries, significantly reducing operational costs while boosting player sentiment.
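As a rough illustration of how figures like these are derived, the sketch below computes a resolution rate, reopen rate, and average CSAT from a list of ticket records. The field names and data are invented for the example, not Helpshift's schema.

```python
"""Back-of-envelope support metrics from a list of ticket dicts.
The schema and numbers here are illustrative only."""
tickets = [
    {"resolved_by_ai": True,  "reopened": False, "csat": 5},
    {"resolved_by_ai": True,  "reopened": False, "csat": 4},
    {"resolved_by_ai": True,  "reopened": True,  "csat": 3},
    {"resolved_by_ai": False, "reopened": False, "csat": 4},
]

def resolution_rate(ts):
    # Share of tickets the AI agent resolved without a human handoff.
    return sum(t["resolved_by_ai"] for t in ts) / len(ts)

def reopen_rate(ts):
    # Share of tickets the player had to reopen after a first answer.
    return sum(t["reopened"] for t in ts) / len(ts)

def avg_csat(ts):
    # Mean customer-satisfaction score across all tickets.
    return sum(t["csat"] for t in ts) / len(ts)

print(f"resolution: {resolution_rate(tickets):.0%}")  # → resolution: 75%
print(f"reopen:     {reopen_rate(tickets):.0%}")      # → reopen:     25%
print(f"avg CSAT:   {avg_csat(tickets):.1f}")         # → avg CSAT:   4.0
```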

The Player Engagement Playbook

To drive meaningful results with generative AI, studios should follow this strategic playbook:

  1. Turn Tickets into Knowledge: Unleash insights from your support history to create a robust knowledge base.
  2. Focus on Conversations: Treat every interaction as a conversation to define clear procedures and goals. Know where the player came from, and where they are heading.
  3. Integrate for Personalization: Connect AI agents to your tech stack to provide deep player context and immersion.
  4. Deliver Strategic Value: Use AI to not only resolve issues but to proactively retain and grow your most valuable players.
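Step 1 above can be sketched as a toy retrieval loop: past answers become a searchable knowledge base that an agent can query. This uses naive word-overlap scoring so it runs with no dependencies; a real system would use embeddings and a vector store, and every string here is invented for illustration.

```python
"""Sketch of 'turn tickets into knowledge': score past answers against a
player's question by shared-word count. Toy approach, illustrative data."""
from collections import Counter

knowledge_base = [
    "To restore purchases, go to Settings > Account > Restore.",
    "Crashes on launch are usually fixed by clearing the game cache.",
    "Progress syncs automatically when you link a platform account.",
]

def tokenize(text: str) -> Counter:
    # Lowercased bag-of-words; a real pipeline would normalise punctuation too.
    return Counter(text.lower().split())

def best_answer(query: str, kb: list) -> str:
    q = tokenize(query)
    # Score each article by how many words it shares with the question.
    scores = [sum((q & tokenize(doc)).values()) for doc in kb]
    return kb[scores.index(max(scores))]

print(best_answer("my game crashes when I launch it", knowledge_base))
# → Crashes on launch are usually fixed by clearing the game cache.
```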

Whether through technology, AI, or human services, the singular goal remains the same: meaningful conversations that keep players in the game.

The Ethical Evolution of AI Voice in Gaming

By Sudhanshu Kumar, Principal Data Scientist at Keywords Studios

As we navigate through 2026, the gaming landscape has reached a definitive turning point. The conversation has shifted away from the raw technical question of "Can AI talk?" toward something far more consequential: "How do we scale human creativity with AI?"

 

The 2025 Ethical Reset: Consent as a Non-Negotiable, Not a Consensus

 

The rapid acceleration of voice technology in recent years forced the industry into conversations around consent, ownership, and artistic control. The July 2025 Interactive Media Agreement, ratified with an overwhelming 95.04% approval, marked a critical milestone. It established an "ethics first" lens through which voice technology can scale responsibly.

 

  • Non-Negotiable Consent: AI cannot clone a voice without clear and conspicuous written consent. This protects the personality rights of performers, treating their voice as a unique digital fingerprint.
  • Economics with Guardrails: By requiring that synthetic performances be paid on-scale with in-person work, the industry removes incentives to use AI purely as a cost-cutting mechanism. While AI can be more cost-efficient for clearly defined use cases, human actors remain central for lead roles and emotionally complex performances where creative collaboration, direction, and nuance define quality.
  • Transparency & Accountability: Studios must provide usage reports for digital replicas, and new laws like California’s AB 2602 ensure these protections are backed by state power.

But let’s be clear: no single, unified ethical model exists right now. While explicit consent for any form of voice replication has become a shared baseline, the reality on the ground remains complex and fragmented. What we see across projects and regions is that, depending on the country and applicable agreements, AI training may be permitted under tightly defined conditions, restricted to specific use cases, or rejected outright.

Many artists also continue to emphasise the importance of studio presence, artistic direction, and the embodied nature of performance as inseparable from their craft. Rather than signaling a finished framework, 2025 marked the turning point at which ethical boundaries became impossible to ignore.


The "Human Plus" Model: Support, Not Substitution

Against this backdrop, the industry is grappling with a "Human Plus" philosophy. Rather than replacing talent, AI is being positioned as a support layer that allows the human element to shine in more places at once.

  • Preserving the Emotional Core: Professional voice actors provide the soul of a game. AI is increasingly used to handle the long tail of production: thousands of minor NPC lines, localized greeting variants, or player-name pronunciations. This allows human performers to focus their energy on the high-impact, emotionally complex scenes that define a game’s identity.
  • Localization at Scale: Indie developers can now localize games into dozens of languages in weeks instead of months. However, the most successful projects use AI to handle the technical fit (matching syllable timing), while human linguists and actors remain the final authority on cultural nuance, humour and accessibility.

Breakthroughs with Boundaries

The technical milestones of 2026 are best viewed as tools for artistic scale, designed to bridge the gap between human intent and massive digital worlds.

Technology | Capability | Human Context
Zero-Shot Cloning | Replicates timbre in 3 seconds. | Used for personalised player messages with strict actor consent.
GLM-TTS | Achieves a 0.89 Character Error Rate. | Reduces technical "noise" so audio engineers spend less time on manual cleanup.
Precision Control | Matches localized speech to the millisecond. | Ensures localized dubs preserve the original actor's performance timing.

The Future: Expanding the Stage, Not Replacing the Performer

As voice technology scales across games, its true impact is becoming clearer. AI is not redefining what performance means; it is redefining how far performance can travel. In worlds that demand thousands of characters, constant updates, and global reach, AI is increasingly absorbing the technical burden of scale so that human creativity can remain focused where it matters most.

As we look beyond 2026, authentic human performance becomes a mark of distinction rather than a default. In a world where AI can generate infinite variations of standard speech, the nuance, intention, and emotional depth of a human actor stand out more clearly, not less.

The future of voice in games isn't about AI taking over. It is an expansion of the stage, where technology extends reach, and human artistry provides the reason players listen, connect, and care.