Journal article
Combinatorial Multi-armed Bandits for Real-Time Strategy Games
Journal of Artificial Intelligence Research, Vol. 58, pp. 665-702
01 Jan 2017
Abstract
Games with large branching factors pose a significant challenge for game tree search algorithms. In this paper, we address this problem with a sampling strategy for Monte Carlo Tree Search (MCTS) algorithms called naive sampling, based on a variant of the Multi-armed Bandit problem called Combinatorial Multi-armed Bandits (CMAB). We analyze the theoretical properties of several variants of naive sampling, and empirically compare it against other existing CMAB sampling strategies in the literature. We then evaluate these strategies in the context of real-time strategy (RTS) games, a genre of computer games characterized by their very large branching factors. Our results show that as the branching factor grows, naive sampling outperforms the other sampling strategies.
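The abstract's central idea can be sketched in code. In a CMAB, one "macro-arm" is a combination of values, one per variable (e.g., one action per unit in an RTS game), so the number of macro-arms grows combinatorially. Naive sampling exploits a "naive assumption" that the reward decomposes additively over variables: each variable keeps a local bandit estimate, and a global table over the macro-arms actually sampled handles exploitation. The following is a minimal illustrative sketch, not the paper's implementation; the class name, the epsilon parameters, and the epsilon-greedy choices inside the local and global bandits are assumptions for illustration.

```python
import random
from collections import defaultdict

class NaiveSampler:
    """Illustrative sketch of naive sampling for a CMAB.

    A macro-arm is a tuple of values, one per variable. Under the
    naive assumption the reward is treated as (approximately) a sum
    of per-variable rewards, so each variable keeps its own local
    reward estimates; a global table over sampled macro-arms is used
    for exploitation. Parameter values here are hypothetical.
    """

    def __init__(self, domains, eps0=0.25, eps_local=0.25):
        self.domains = domains        # one list of legal values per variable
        self.eps0 = eps0              # P(explore via the local bandits)
        self.eps_local = eps_local    # epsilon-greedy rate inside each local bandit
        self.local = [defaultdict(lambda: [0.0, 0]) for _ in domains]  # value -> [mean, n]
        self.global_arms = {}         # macro-arm tuple -> [mean, n]

    def _pick_local(self, i):
        # Epsilon-greedy over the values of variable i.
        stats = self.local[i]
        if not stats or random.random() < self.eps_local:
            return random.choice(self.domains[i])
        return max(stats, key=lambda v: stats[v][0])

    def select(self):
        if not self.global_arms or random.random() < self.eps0:
            # Explore: assemble a macro-arm one variable at a time,
            # using only the per-variable (naive) estimates.
            return tuple(self._pick_local(i) for i in range(len(self.domains)))
        # Exploit: best macro-arm sampled so far, by global mean reward.
        return max(self.global_arms, key=lambda a: self.global_arms[a][0])

    def update(self, arm, reward):
        # Incremental mean update for the global macro-arm ...
        mean, n = self.global_arms.get(arm, [0.0, 0])
        self.global_arms[arm] = [mean + (reward - mean) / (n + 1), n + 1]
        # ... and for each per-variable local estimate.
        for i, v in enumerate(arm):
            m, k = self.local[i][v]
            self.local[i][v] = [m + (reward - m) / (k + 1), k + 1]
```

The key property this sketch illustrates is that exploration cost scales with the sum of the domain sizes (one local bandit per variable) rather than with their product, which is what makes the approach attractive when the combinatorial branching factor is very large.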
Details
- Title
- Combinatorial Multi-armed Bandits for Real-Time Strategy Games
- Creators
- Santiago Ontañón - Drexel Univ, Dept Comp Sci, Philadelphia, PA 19104 USA
- Publication Details
- Journal of Artificial Intelligence Research, Vol. 58, pp. 665-702
- Publisher
- AI Access Foundation
- Number of pages
- 38
- Resource Type
- Journal article
- Language
- English
- Academic Unit
- Computer Science (Computing)
- Identifiers
- 991019167776904721
- Web of Science research areas
- Computer Science, Artificial Intelligence