AI Usage at a Glance
Apr 3, 2018
Moderation · Practice documented: Blizzard began experimenting with machine learning to automatically detect toxic language and behavior in Overwatch, without waiting for players to submit reports. Game Director Jeff Kaplan publicly confirmed these experiments in April 2018. The current operational status of a dedicated ML moderation system for Overwatch specifically — separate from the ToxMod system used in Call of Duty — is not fully documented.

Apr 12, 2018
Moderation · New evidence: Blizzard is experimenting with using an AI to moderate its games

Feb 1, 2019
Productivity · Practice documented: King built an internal AI-powered testing tool called BAIT (Bot for AI Testing) that automatically checks Candy Crush Saga levels for visual bugs — like missing textures, broken text, and overlapping graphics — before updates are released to players. It works through all 4,500+ levels automatically, handling testing tasks that would otherwise fall to human quality assurance staff.
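The entry describes BAIT's checks only at a high level. One class of visual bug it mentions — overlapping graphics — can be illustrated with a simple bounding-box collision check. Everything below (the `Element` record, the layout data, the check itself) is an invented sketch of the general technique, not King's actual implementation.

```python
# Illustrative sketch: detect overlapping UI elements from bounding boxes.
# The data model and checks are assumptions, not King's BAIT internals.
from itertools import combinations
from typing import NamedTuple

class Element(NamedTuple):
    name: str
    x: int   # left edge, pixels
    y: int   # top edge, pixels
    w: int   # width
    h: int   # height

def overlaps(a: Element, b: Element) -> bool:
    """True if the two axis-aligned bounding boxes intersect."""
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

def find_overlap_bugs(elements: list[Element]) -> list[tuple[str, str]]:
    """Report every pair of on-screen elements whose boxes collide."""
    return [(a.name, b.name)
            for a, b in combinations(elements, 2)
            if overlaps(a, b)]

layout = [
    Element("score_label", 10, 10, 100, 20),
    Element("moves_label", 90, 15, 100, 20),   # collides with score_label
    Element("board",       0, 60, 480, 480),
]
print(find_overlap_bugs(layout))  # [('score_label', 'moves_label')]
```

A real tool would extract these boxes from rendered screenshots of each of the 4,500+ levels rather than from hand-written layout data.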
Jul 10, 2019
Other · Practice documented: Blizzard Entertainment partnered with Google DeepMind to develop an AI called AlphaStar that learned to play StarCraft II by studying millions of professional games and then playing against itself. By October 2019, it reached the top rank of Grandmaster — placing it above more than 99% of all active human players — making it the first AI to reach the top tier of a major real-time strategy game.

Oct 30, 2019
Other · New evidence: Grandmaster level in StarCraft II using multi-agent reinforcement learning

Oct 30, 2019
Other · New evidence: AlphaStar: Grandmaster level in StarCraft II using multi-agent reinforcement learning

Dec 30, 2019
Productivity · New evidence: How King uses AI to test Candy Crush Saga

Apr 25, 2022
Creative Gen · Practice documented: Activision Blizzard filed a patent in 2022 for a system that would use AI to generate a unique musical score for each player in real time, adjusting the soundtrack based on how they play, how skilled they are, and what is happening in the game. No game has been publicly confirmed to use this system as of March 2026.

Jun 13, 2022
Productivity · Practice documented: King, the maker of Candy Crush Saga, now uses AI to help design and test virtually every level in its games before they reach players. AI-controlled bots play each level thousands of times in about five minutes — replacing a process that once took human testers a week — to check difficulty, find bugs, and suggest adjustments. This practice began around 2019 and accelerated significantly through 2024. In July 2025, approximately 200 King employees — including many level designers — were laid off, with sources saying AI tools had replaced their roles.
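The core of the bot-testing loop described here — play a level many times and derive a difficulty signal from the results — is a Monte Carlo estimate. The sketch below illustrates only that statistical idea: the random "bot" and its win probability are stand-ins, since King's actual bots play the real game board.

```python
# Minimal sketch of bot-based difficulty estimation: run many simulated
# playthroughs and use the failure rate as a difficulty signal. The random
# "bot" below is a placeholder, not King's level-playing agents.
import random

def simulate_playthrough(win_prob: float, rng: random.Random) -> bool:
    """Stand-in for one bot run of a level; a real bot would play the board."""
    return rng.random() < win_prob

def estimate_difficulty(win_prob: float, runs: int = 5000, seed: int = 0) -> float:
    """Estimate a level's failure rate from many simulated runs (higher = harder)."""
    rng = random.Random(seed)
    wins = sum(simulate_playthrough(win_prob, rng) for _ in range(runs))
    return 1.0 - wins / runs

fail_rate = estimate_difficulty(win_prob=0.3)
print(f"estimated failure rate: {fail_rate:.2f}")  # close to 0.70
```

Thousands of runs make the estimate stable enough that designers could compare it against a target difficulty band and adjust the level before release.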
Jun 13, 2022
Recommendation · Practice documented: King runs a large-scale machine learning platform that makes approximately 180 million automated decisions per day about what individual Candy Crush players see and experience — including which offers they receive, how difficult their levels feel, and which in-game events are shown to them. The platform spans around 300 different use cases across King's game portfolio.

Jul 4, 2022
Moderation · New evidence: When Gamers Get Nasty: Researchers grapple with subjectivity as they develop algorithms to detect toxicity in online gaming

Oct 16, 2022
Creative Gen · New evidence: Activision Patents To Generate Unique In-Game Music For Each Player

May 22, 2023
Creative Gen · Practice documented: Blizzard Entertainment built its own internal generative AI tool, called Blizzard Diffusion, to help designers and artists generate concept art during game development. Unlike publicly available AI tools, it was trained only on Blizzard's own art assets. The tool's existence was revealed in May 2023 following an internal company email that described it as "a major evolution" in how Blizzard builds games.

May 23, 2023
Creative Gen · New evidence: Blizzard has trained a new A.I. concept art tool on its own assets

May 23, 2023
Creative Gen · New evidence: Blizzard says new AI concept art tool represents "major evolution"

Jun 29, 2023
Other · Practice documented: Activision uses a machine learning system called RICOCHET to automatically detect cheaters in Call of Duty multiplayer. It watches how players aim, move, and behave during a match, and flags accounts that appear to be using software to cheat — like aimbots or wallhacks. It launched in October 2021 and has expanded significantly since 2023.
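Activision has not published RICOCHET's model, but the flagging pattern described — watch behavioral statistics and flag accounts that look inhuman — can be illustrated with simple outlier detection. The feature (headshot rate), threshold, and data below are all invented for illustration.

```python
# Hedged sketch of behavior-based cheat flagging: score each player's
# per-match statistics against the population and flag extreme outliers.
# Feature names and thresholds are illustrative, not RICOCHET's internals.
from statistics import mean, stdev

def z_scores(values: list[float]) -> list[float]:
    """Standardize each value against the sample mean and deviation."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

def flag_outliers(headshot_rates: list[float], threshold: float = 2.0) -> list[int]:
    """Return indices of players whose headshot rate is an extreme outlier."""
    return [i for i, z in enumerate(z_scores(headshot_rates)) if z > threshold]

rates = [0.12, 0.15, 0.10, 0.14, 0.11, 0.13, 0.95]  # last player is suspicious
print(flag_outliers(rates))  # [6]
```

A production system would combine many such signals (aim snap speed, tracking through walls, reaction times) in a trained model rather than a single z-score, and route flags to review and enforcement rather than banning automatically.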
Jul 12, 2023
Recommendation · New evidence: Gaming Gets Smarter: Insights from King's Head of AI

Aug 30, 2023
Moderation · Practice documented: Activision partnered with a company called Modulate to add an AI system called ToxMod to Call of Duty's voice chat. ToxMod listens to in-game voice conversations and automatically flags or acts on harassment, hate speech, and other toxic behavior — without waiting for players to report each other. It launched in beta in August 2023 and rolled out globally with Modern Warfare III in November 2023.
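The flag-then-act flow this entry describes can be sketched in miniature. The real ToxMod uses ML models over audio, tone, and conversational context; the keyword blocklist, weights, and thresholds below are deliberately crude stand-ins meant only to show the triage structure.

```python
# Deliberately simple sketch of automated transcript triage: score each
# utterance and escalate above a threshold. The blocklist and weights are
# invented; ToxMod's actual models analyze audio and context, not keywords.
BLOCKLIST = {"idiot": 1, "trash": 1, "hate you": 3}

def toxicity_score(utterance: str) -> int:
    """Sum the weights of every blocklisted phrase found in the utterance."""
    text = utterance.lower()
    return sum(weight for phrase, weight in BLOCKLIST.items() if phrase in text)

def triage(utterance: str, escalate_at: int = 3) -> str:
    """Route an utterance: ok, log for patterns, or escalate to enforcement."""
    score = toxicity_score(utterance)
    if score >= escalate_at:
        return "escalate"   # route to enforcement / human review
    return "log" if score > 0 else "ok"

print(triage("gg well played"))     # ok
print(triage("you're trash"))       # log
print(triage("I hate you, trash"))  # escalate
```

The key design point the entry highlights is the first step: the system scores every conversation proactively instead of waiting for a player report to trigger review.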
Nov 9, 2023
Other · New evidence: RICOCHET: Anti-Cheat Progress Report – Launch Readiness, Machine Learning and New Features

Jan 1, 2024
Moderation · New evidence: How ToxMod's AI impacted toxicity in Call of Duty voice chat | case study

Jan 29, 2024
Other · Practice documented: Call of Duty uses an automated system to decide which players are placed together in online matches. The system considers factors like connection speed, how quickly a match can be found, and player skill — though Activision says skill is not the most important factor. In January 2024, Activision published its first-ever public explanation of how this system works.
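A multi-factor matchmaker like the one described is often modeled as a weighted cost function over candidate placements. The sketch below mirrors the stated factors (connection, time-to-match, skill) and the claim that skill is weighted below connection quality; the weights, field names, and numbers are assumptions, not Activision's published values.

```python
# Illustrative sketch of multi-factor matchmaking as a weighted cost
# function. Weights are invented; only the factor list comes from
# Activision's public explanation.
from dataclasses import dataclass

@dataclass
class Candidate:
    ping_ms: float     # connection quality proxy
    wait_s: float      # how long this player has been searching
    skill_gap: float   # |player skill - lobby average|

def match_cost(c: Candidate, w_ping=1.0, w_wait=0.5, w_skill=0.25) -> float:
    """Lower is better; connection quality is weighted above skill."""
    return w_ping * c.ping_ms + w_wait * c.wait_s + w_skill * c.skill_gap

lobbies = [
    Candidate(ping_ms=20, wait_s=5, skill_gap=100),
    Candidate(ping_ms=80, wait_s=5, skill_gap=10),
]
best = min(lobbies, key=match_cost)
print(best.ping_ms)  # 20 — the low-ping lobby wins despite the larger skill gap
```

Framing placement as cost minimization makes the trade-off explicit: with these weights, a 60 ms connection advantage outweighs a 90-point skill mismatch, which matches the spirit of "skill is not the most important factor."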
Mar 1, 2024
Recommendation · New evidence: King researchers talk about the results of using AI at GDC 2024

Apr 1, 2024
Creative Gen · Practice documented: Activision uses generative AI tools to generate some in-game visual assets for Call of Duty titles including Black Ops 6, Warzone, and Black Ops 7, such as calling cards, loading screens, prestige icons, and at least one paid cosmetic bundle. Activision confirmed this via mandatory Steam disclosures beginning in February 2025, after players flagged suspicious AI artifacts in multiple assets. A Wired investigation published in July 2024 reported that remaining 2D artists were required to use AI tools following layoffs of 2D artist positions.

Apr 1, 2024
Other · New evidence: Matchmaking Series: The Role of Skill in Matchmaking

Apr 4, 2024
Other · New evidence: [[UPDATED]] Call of Duty: An Inside Look at Matchmaking

Apr 9, 2024
Productivity · New evidence: How King balances human and AI-powered design in Candy Crush Saga

Jul 25, 2024
Creative Gen · Practice documented: Blizzard's parent company Microsoft began including contract clauses related to AI voice use when contracting voice actors for World of Warcraft and other Blizzard games. In September 2025, a French voice actress publicly refused to sign a new contract over these clauses and gave up a role she had held for seven years. A separate SAG-AFTRA video game performers' strike — which listed Activision Productions as a named party and was explicitly described as being "over AI" — ran from July 2024 to June 2025, resulting in a landmark agreement with new protections for performers' digital likenesses.

Jan 1, 2025
Creative Gen · New evidence: Call of Duty: Black Ops 6 Steam Store Page

Feb 25, 2025
Creative Gen · New evidence: Activision admits to using AI assets in Call of Duty following Steam policy change

Mar 3, 2025
Data Analysis · Practice documented: In March 2025, Activision Blizzard used generative AI to create advertisements for three games that do not exist — Guitar Hero Mobile, Crash Bandicoot Brawl, and Call of Duty: Zombie Defender — and ran them on social media. Clicking the ads led to a survey asking whether players would want to play such a game. The ads were removed after they attracted attention for their low quality and visible AI artifacts.

Mar 5, 2025
Data Analysis · New evidence: Finding a new and inventive way to annoy everybody, Activision has AI generate fake advertisements for games that don't exist

Mar 13, 2025
Customer Svc · Practice documented: Microsoft is rolling out an AI assistant called Gaming Copilot that Xbox and PC players can talk to while playing games. It can answer questions about where to find items, suggest strategies, recommend new games based on play history, and explain game mechanics — acting like a knowledgeable friend who knows your gaming habits. It launched in beta on the Xbox mobile app in May 2025 and is scheduled to reach Xbox consoles in 2026.

Jun 1, 2025
Creative Gen · New evidence: Videogame voice actors strike 'suspended' following agreement with game companies: 'All SAG-AFTRA members are instructed to return to work'

Jul 1, 2025
Creative Gen · Practice documented: Blizzard has faced repeated accusations from players that promotional images for Diablo Immortal and Overwatch 2 were made using generative AI, based on visible distortions — such as a melting necklace, merged fingers, and impossible geometry — that are common artifacts of AI image generation. Blizzard has not confirmed the use of generative AI for most of these images and in some cases has denied it.

Jul 15, 2025
Productivity · New evidence: Laid off King staff set to be replaced by the AI tools they helped build, say sources

Jul 30, 2025
Creative Gen · New evidence: https://www.windowscentral.com/gaming/activision/abk-shares-ai-images-for-diablo-immortal-event

Sep 11, 2025
Creative Gen · New evidence: Will Blizzard Use AI for WoW Voice Acting?

Sep 19, 2025
Creative Gen · New evidence: Blizzard Responds to Overwatch 2 AI Art Accusations

Nov 1, 2025
Other · New evidence: Ready for Launch: How RICOCHET Anti-Cheat Is Preparing for Day One

Feb 9, 2026
Other · Practice documented: Blizzard Entertainment maintains a centralized team focused specifically on overseeing how AI can and should be used in its game development process. Individual game teams — such as the Overwatch and World of Warcraft teams — retain the right to opt out of using AI-generated content in player-facing parts of their games. This governance structure was confirmed publicly by Blizzard President Johanna Faries in 2025.

Feb 24, 2026
Other · New evidence: Exclusive: Talking to new Xbox CEO Asha Sharma and CCO Matt Booty — "This team has brought it back before, and I'm here to help us do it again."

Mar 1, 2026
Other · New evidence: World of Warcraft: Midnight's lead composer feels 'very lucky and happy that we're not using generative AI'

Mar 16, 2026
Customer Svc · New evidence: Microsoft's Xbox Gaming Copilot AI Coming to Consoles in 2026
Blizzard President Johanna Faries confirmed the existence of a cross-functional AI governance team at Blizzard, describing it as looking not just at current uses but at implications for five to ten years ahead. Despite this central function, individual teams operate with significant autonomy. Overwatch Director Aaron Keller stated that the team is not comfortable putting AI-generated content in front of players, describing their games as a handcrafted universe — though he noted this is not a permanent policy. World of Warcraft: Midnight's lead composer Leo Kaliski confirmed that no one on the WoW team is using generative AI for the game's music. At the Microsoft parent level, then-Chief Content Officer Matt Booty stated that no company-wide directives on AI are being issued and that teams are free to use any technology that benefits their work. No formal external-facing responsible AI policy document has been published by Activision Blizzard or Microsoft Gaming.
Gaming Copilot is a Microsoft parent-level initiative that uses large language model technology to provide context-aware in-game assistance. It launched first integrated with Minecraft in 2024, moved to Xbox mobile app beta in May 2025, and expanded to Windows 11 and the ROG Xbox Ally handheld gaming device before a planned Xbox Series X/S console launch in 2026. Usage data shared by Microsoft shows that roughly 30% of player interactions involve game strategy questions, 25% involve game discovery and recommendations, and 19% are conversational. Xbox CVP Fatima Kardar described the tool as a "personal gaming sidekick" that can provide information about specific game mechanics — such as explaining where to find a particular item drop in Diablo 4. New Microsoft Gaming CEO Asha Sharma, appointed in February 2026 from Microsoft's Core AI division, has pledged to avoid producing what she called "soulless AI slop" while pursuing deeper AI integration across the gaming portfolio.
Activision Blizzard's dynamic-music patent, filed with WIPO (the World Intellectual Property Organization) in April 2022, describes a machine learning system that would generate in-game music tailored to individual players in real time. Variables the system would monitor include the player's skill level, current in-game actions, pacing, interactions with NPCs, perceived mood, and quest or objective completion. Each player could theoretically hear a different musical score even in the same session. The patent was identified by games press in 2023 and attracted interest as a signal of Activision Blizzard's AI research directions. However, WIPO patent filings do not confirm that a technology is deployed or even in active development — they indicate that a company has protected a concept. No Activision Blizzard title has announced or confirmed the use of this system.
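The patent's core idea — map live player telemetry to music — can be sketched as a small decision function. The states, rules, and cue names below are invented for illustration; the patent describes a machine learning system, not these hand-written rules.

```python
# Hypothetical sketch of telemetry-driven music selection, loosely modeled
# on the variables the patent names (skill, actions, objectives). Cue names
# and rules are invented; no shipped game is known to use this system.
def pick_cue(skill: float, in_combat: bool, objective_done: bool) -> str:
    """Choose a music cue from a few telemetry signals (illustrative rules)."""
    if objective_done:
        return "victory_theme"
    if in_combat:
        # Higher-skill players get a more intense arrangement.
        return "combat_intense" if skill > 0.7 else "combat_light"
    return "explore_ambient"

print(pick_cue(skill=0.9, in_combat=True, objective_done=False))   # combat_intense
print(pick_cue(skill=0.2, in_combat=False, objective_done=False))  # explore_ambient
```

Because the inputs differ per player, two people in the same session would hear different cues — the per-player personalization the patent emphasizes. The patented system would presumably generate or arrange the music itself rather than select from fixed cues.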