A group of former Google DeepMind researchers has developed an AI behavior engine to transform traditional video games into more dynamic experiences by enhancing the behavior and interactions of non-playable characters (NPCs).
Many companies use AI to generate more realistic NPCs, but Canada-based Artificial Agency, fresh out of stealth with $16 million in funding, believes its behavior engine will help it stand out.
Traditionally, NPCs are guided by decision trees and pre-written scripts, limiting the variety of player experiences. Most NPCs respond to player actions with repetitive dialogue that often feels unrealistic and monotonous.
Artificial Agency’s behavior engine upends this framework, positioning the game developer more as a stage manager. Developers provide each NPC with a set of motivations, rules, and goals, which dictate how the NPC responds to the player. The technology can be integrated into existing video games or serve as the foundation for new ones.
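Conceptually, that "stage manager" setup amounts to configuring each NPC with goals and constraints rather than scripting its responses. A minimal sketch of the idea follows; Artificial Agency has not published an API, so every name here (NPCConfig and its fields) is a purely illustrative assumption, not the company's actual interface:

```python
# Hypothetical sketch of goal-driven NPC configuration.
# All names are illustrative assumptions; Artificial Agency's real API is not public.
from dataclasses import dataclass, field


@dataclass
class NPCConfig:
    """A developer-authored 'character sheet' the behavior engine would interpret."""
    name: str
    motivations: list[str]          # high-level personality traits
    rules: list[str]                # hard constraints on behavior
    goals: list[str]                # what the NPC tries to accomplish
    actions: list[str] = field(default_factory=list)  # game functions it may invoke


# Modeled on the Minecraft demo described later in this article.
aaron = NPCConfig(
    name="Aaron",
    motivations=["be friendly and helpful"],
    rules=["follow the player's reasonable requests"],
    goals=["assist the player with their current task"],
    actions=["move", "open_chest", "dig", "place_block"],
)
```

The point of such a scheme is that the developer specifies intent once, and the engine improvises concrete behavior at runtime instead of walking a hand-written decision tree.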
Based in Edmonton, Alberta, Artificial Agency enters a competitive market. Competitors include Inworld, which also offers AI-generated NPC behaviors, and Nvidia, which has been developing AI-powered NPCs for some time.
Artificial Agency believes integrating AI-generated NPCs into a game’s design is the future.
“The conversations we often have with these studios are not about if, it’s about when,” co-founder and CEO Brian Tanner told TechCrunch. “This sort of dynamic interaction and dynamic response that our system allows is going to be table stakes in the games industry just a few years from now.”
The startup recently raised $12 million in a seed round co-led by Radical Ventures and Toyota Ventures, adding to a previous $4 million pre-seed round from Radical Ventures, bringing the total to $16 million. Other participants in the latest seed round include Flying Fish, Kaya, BDC Deep Tech, and TIRTA Ventures.
Who Wants AI NPCs?
A significant question for these startups is whether gaming studios will adopt their AI technology. Some worry that large studios might develop the technology themselves or hesitate to incorporate generative AI into their flagship games, given the risks of hallucinations and the technology’s untested nature.
Although Artificial Agency didn’t name them, it claims to be working with “several notable AAA studios” to develop its behavior engine and expects the technology to be widely available by 2025.
“When we reached out to gaming studios, some were starting to build some of these behaviors themselves, when in reality, they’re just trying to build games,” said Radical Ventures investor Daniel Mulet. “Once you see like 20, 30 groups that are trying to build this themselves, there is an opportunity to build a platform and make it available to everyone.”
Game developers seem open to using generative AI in game development, but there is still some hesitation. Nearly half of the 3,000 game developers surveyed by GDC and Game Developer for the 2024 State of the Game Industry report use generative AI in some aspect of development, particularly for repetitive tasks. However, only about 21% expect generative AI to positively impact the industry, and 42% are “very concerned” about the ethics of using generative AI.
Mulet expressed confidence in Artificial Agency’s founding team, given their extensive experience with Google DeepMind. DeepMind, after all, has a long history of developing cutting-edge AI for games, such as AlphaGo, the first computer program to beat a world champion at Go.
Around the time Google shifted focus to the Gemini model, Tanner and his team branched out to develop video game agents to replace NPCs.
From NPC to Co-op Companion
In a demo of the technology shared with TechCrunch, co-founder Alex Kearney created an NPC powered by the behavior engine in Minecraft (the startup would not disclose the games it’s currently working on). The NPC, named Aaron, was designed to be friendly and helpful, with access to basic functions such as movement, opening chests, digging, and placing blocks.
At one point, Kearney’s in-game character asked Aaron to gather supplies for a mining adventure. Although it wasn’t explicitly programmed to do so, the NPC visited multiple chests to collect armor, helmets, tools, and food, then delivered the supplies back to Kearney’s character. After Aaron brought back some bread, Kearney told it she was gluten-free; the NPC apologized and offered a gluten-free option instead: cooked chicken.
This simple demo illustrated how Artificial Agency’s AI NPCs could not only communicate but also perform complex actions without explicit instructions. Aaron showed a level of awareness and created a unique experience without script-writing or programming. At the very least, the technology could save game developers time.
Will Gamers Pay the Price for AI?
Tanner estimated that the roughly five-minute demo cost $1 in AI inferencing costs, but noted that a year ago, it would have cost $100. Artificial Agency expects costs to continue decreasing, thanks to GPU efficiencies and AI model optimizations. Currently, the startup uses open-source models, including Meta’s Llama 3. Tanner expects the five-minute demo to cost one cent or less in a year.
But however low they fall, who will cover those inference costs? Artificial Agency believes AI NPCs likely won’t increase game prices for end users, but Radical Ventures’ Mulet is less certain. He said his venture firm is confident game studios will pay to license Artificial Agency’s technology, but that it could translate into a monthly fee for gamers once deployed.
“The fact that there’s inference costs associated with running these systems means that it has to be a bit of a premium feature,” said Mulet. “Will you, as a gamer, pay $2.99 a month or $12.99 a month? That’s a little bit early to tell.”