The following competitions are being held at CIG 2018:
- Short Video Competition
- Hearthstone AI
- The Ms. Pac-Man Vs Ghost Team Competition
- Fighting Game AI Competition
- microRTS Competition
- Hanabi Competition
- StarCraft AI Competition
- The General Video Game AI Competition – Learning Track
- 3rd Angry Birds Level Generation Competition
- The Text-Based Adventure AI Competition
- Visual Doom AI Competition 2018
Computational Intelligence in Games – Short Video Competition
Organizers:
- Simon Lucas (Queen Mary University of London, UK; email@example.com)
- Alexander Dockhorn (University of Magdeburg, Germany; firstname.lastname@example.org)
- and Jialin Liu (Queen Mary University of London, UK; email@example.com)
The video can be about any topic relevant to the conference; videos demonstrating papers presented at the conference are encouraged, but submissions are not limited to these. The videos should be informative and watchable. Submissions might include, but are not limited to:
- A demo of a new game with interesting problem definitions for humans and/or AI (designed by a human or an AI). “New” can refer to an entirely new game or a variant of an existing one. Games designed and created by students are highly encouraged.
- A demo of a human/AI playing one or more games to demonstrate a technique or behavior, e.g. strategies, examples of human-like behavior, simulation of social interactions, AI in games in general, etc.
Participants must submit a video no longer than 5 minutes; there is no lower bound on length.
- The video should be subtitled if it contains speech.
- The video should include a title page at the beginning, with a title and a very short description/highlights (no standard format of the title page is given, any format is fine).
- Each video must mention that it is an entry for the IEEE CIG 2018 Short Video Competition.
Everybody can enter the competition. However, the IEEE prizes will only be awarded to IEEE student/young professional members. More about the awarding policy can be found at:
Entries can be submitted by registering the video and uploading the required document at the competition page. The final rating of the submissions will be done by a vote from the conference delegates. The organisers reserve the right to exclude any video they deem to be offensive or inappropriate.
Competition Webpage: TBA
Competition Entry Deadline: July 15th 2018 23:59 UTC-12
Hearthstone AI Competition
Organizers:
- Alexander Dockhorn (University of Magdeburg, Germany; firstname.lastname@example.org)
- and Sanaz Mostaghim (University of Magdeburg, Germany; email@example.com)
Description: The online collectible card game Hearthstone offers a rich testbed and poses unique demands on artificial intelligence agents. It is a turn-based card game between two opponents, each using a constructed deck of thirty cards along with a selected hero with a unique power. Players use their limited mana crystals to cast spells or summon minions to attack their opponent, with the goal of reducing the opponent’s health to zero. The competition aims to promote the stepwise development of fully autonomous AI agents for Hearthstone. During the game, both players need to play the best combination of hand cards while facing a large amount of uncertainty: the upcoming card draws, the opponent’s hand cards, and hidden effects played by the opponent can all influence the player’s next move and its succeeding rounds. Predicting the opponent’s deck from previously seen cards, and estimating the chances of drawing cards from one’s own deck, can help in finding the best cards to play. Card playing order, card effects, and attack targets all have a large influence on the player’s chances of winning the game. Besides using premade decks, players have the opportunity of creating a deck of 30 cards from the more than 1,000 available in the current game, most of which provide unique effects and card synergies that can help in developing combos. Generating a strong deck is a key step towards consistently winning against a diverse set of opponents.
Tracks: The competition invites submissions to two separate tracks, both available in the first year of this competition:
- “Premade Deck Playing” track: participants receive a list of decks and play out all combinations against each other. Determining and exploiting the characteristics of the player’s and the opponent’s deck will help in winning the game.
- “User Created Deck Playing” track: participants create their own decks or choose from the vast number of decks available online. Finding a deck that can consistently beat a wide range of other decks will play a key role in this track; additionally, it gives participants the chance to optimize their agent’s strategy to the characteristics of the chosen deck.
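As a small illustration of the draw-probability reasoning mentioned above, the snippet below estimates the chance of drawing at least one copy of a wanted card within the next few draws, given how many copies remain in the deck. It is only a sketch; the class and method names are hypothetical and not part of any competition API.

// Sketch (Java): probability of drawing at least one copy of a wanted card
// within the next `draws` draws from a deck of `deckSize` remaining cards,
// using the hypergeometric complement 1 - P(no copy drawn).
public final class DrawOdds {

    static double atLeastOne(int copies, int deckSize, int draws) {
        if (copies <= 0 || deckSize <= 0) return 0.0;
        double missAll = 1.0;
        for (int i = 0; i < draws && i < deckSize; i++) {
            int remainingMisses = deckSize - copies - i;
            if (remainingMisses <= 0) return 1.0;   // only copies left: next draw must hit
            missAll *= (double) remainingMisses / (deckSize - i);
        }
        return 1.0 - missAll;
    }

    public static void main(String[] args) {
        // Example: 2 copies left somewhere in 20 undrawn cards, next 3 draws.
        System.out.printf("P(draw a copy within 3 draws) = %.3f%n", atLeastOne(2, 20, 3));
    }
}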
The Ms. Pac-Man Vs Ghost Team Competition
Organizers:
- Piers Williams (University of Essex, UK; firstname.lastname@example.org),
- Simon M Lucas (Queen Mary University of London, UK; email@example.com)
- and Diego Perez-Liebana (Queen Mary University of London, UK; firstname.lastname@example.org)
Description: The aim of this competition is to investigate co-operation in a fairly complex environment. This competition is a revival of the previous Ms Pac-Man versus Ghost Team competition, which ran successfully for many years. The two tracks of the previous competition have been reworked into two new tracks. The first track asks competitors to submit a controller for Ms Pac-Man operating under a partial observability constraint. The second track asks competitors to submit 4 controllers, one for each ghost, also under partial observability constraints. The game enables controlled communication to allow co-operation without centralised control of the ghost team. Adding partial observability to the game forces the ghosts to co-operate and communicate in order to reach their full potential against Ms Pac-Man.
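To illustrate the kind of co-operation that partial observability encourages, the sketch below shows a ghost broadcasting Ms Pac-Man's last observed position to its team-mates. The GameView and CooperativeGhost types and the message key are simplified placeholders, not classes from the actual competition framework.

// Sketch of ghost co-operation under partial observability (Java).
// All types below are simplified placeholders for the real framework API.
interface GameView {
    int getPacmanNodeIfVisible();                            // -1 when Ms Pac-Man is out of sight
    void broadcast(int senderGhost, String key, int value);  // share information with team-mates
    Integer lastBroadcast(String key);                       // null if nothing has been shared yet
}

class CooperativeGhost {
    private final int ghostId;

    CooperativeGhost(int ghostId) { this.ghostId = ghostId; }

    // Decide which maze node to move towards this tick.
    int chooseTarget(GameView game, int ownNode) {
        int seen = game.getPacmanNodeIfVisible();
        if (seen >= 0) {
            // Ms Pac-Man is visible: tell the rest of the team where she is.
            game.broadcast(ghostId, "PACMAN_SEEN", seen);
            return seen;
        }
        // Not visible: fall back to the most recent team sighting, if any.
        Integer reported = game.lastBroadcast("PACMAN_SEEN");
        return reported != null ? reported : ownNode;  // otherwise hold position
    }
}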
Fighting Game AI Competition
Organizers:
- Ruck Thawonmas (Ritsumeikan University, Japan; email@example.com)
Description: What are promising techniques for developing fighting-game AIs whose performance is robust across a variety of settings? The competition uses the Java-based FightingICE platform, which also supports Python programming and the development of visual-based AIs. Two leagues (Standard and Speedrunning) are associated with each of the three character types: Zen, Garnet, and Lud (whose data are unknown in advance). In the Standard League, the winner of a round is the AI whose HP is above zero at the time its opponent's (another entry AI's) HP reaches zero. In the Speedrunning League, the winner for a given character type is the AI with the shortest average time to beat our sample MCTS AI. The competition winner is decided based on the 2015 Formula-1 scoring system.
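For reference, the 2015 Formula-1 system awards 25, 18, 15, 12, 10, 8, 6, 4, 2 and 1 points to the first ten finishers. The sketch below only illustrates how per-league rankings could be turned into points and summed; how the official competition aggregates results across leagues and character types is decided by the organizers.

// Illustration only (Java): converting per-league rankings into 2015 Formula-1 points.
// How the official competition aggregates leagues and character types may differ.
public final class F1Scoring {
    // Points for positions 1..10; entries ranked below 10th score nothing.
    private static final int[] POINTS = {25, 18, 15, 12, 10, 8, 6, 4, 2, 1};

    static int pointsFor(int position) {            // position is 1-based
        return (position >= 1 && position <= POINTS.length) ? POINTS[position - 1] : 0;
    }

    public static void main(String[] args) {
        // Hypothetical rankings of one entry across the six leagues
        // (Standard and Speedrunning for each of Zen, Garnet, and Lud).
        int[] positions = {1, 3, 2, 5, 1, 4};
        int total = 0;
        for (int p : positions) total += pointsFor(p);
        System.out.println("Total points: " + total); // 25 + 15 + 18 + 10 + 25 + 12 = 105
    }
}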
microRTS Competition
Organizers:
- Santiago Ontañon (Drexel University, US; firstname.lastname@example.org)
Description: The microRTS competition has been created to motivate research on the basic questions underlying the development of AI for RTS games, while minimizing the amount of engineering required to participate. A key difference with respect to the StarCraft competition is that the AIs have access to a “forward model” (i.e., a simulator) with which they can simulate the effect of actions or plans, thus allowing planning and game-tree search techniques to be developed easily.
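The value of such a forward model is that candidate actions can be rolled out and evaluated before committing to one. The sketch below shows the general idea of a one-ply search; the GameState and Action interfaces are simplified stand-ins, and the actual microRTS API differs in names and detail.

import java.util.List;

// Sketch of one-ply search with a forward model (Java). The GameState/Action
// types are simplified stand-ins; the actual microRTS classes differ.
interface Action {}
interface GameState {
    GameState copy();                        // forward model: clone the current state...
    void apply(Action a, int frames);        // ...and simulate an action for some frames
    List<Action> legalActions(int player);
    double evaluate(int player);             // heuristic value of the state for `player`
}

final class OnePlySearch {
    // Pick the legal action whose simulated outcome scores best for `player`.
    static Action best(GameState state, int player, int lookaheadFrames) {
        Action bestAction = null;
        double bestValue = Double.NEGATIVE_INFINITY;
        for (Action a : state.legalActions(player)) {
            GameState next = state.copy();   // simulate without touching the real game
            next.apply(a, lookaheadFrames);
            double v = next.evaluate(player);
            if (v > bestValue) { bestValue = v; bestAction = a; }
        }
        return bestAction;
    }
}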
Competition Webpage: https://sites.google.com/site/micrortsaicompetition/home
Hanabi Competition
Organizers:
- Joseph Walton-Rivers (University of Essex, UK; email@example.com)
Description: Write an agent capable of playing the cooperative, partially observable card game Hanabi. Agents are written in Java and submitted via our online submission system. In Hanabi, agents cannot see their own cards but can see the other players' cards. On their turn, agents can either play a card from their hand, discard a card from their hand, or spend an information token to tell another player about a feature (rank or suit) of the cards in that player's hand. The players must try to play cards for each suit in rank order. If the group makes 3 errors when executing play actions, the game is over. Agents will be paired with either copies of their own agent or a set of unknown agents. The winner is the agent that achieves the highest score over a set of unknown deck orderings.
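As a rough illustration of the decision logic such an agent needs, the sketch below plays a card believed playable, otherwise gives a hint if an information token is available, and otherwise discards. The TableView, Move, and related types are simplified placeholders, not the competition framework's actual classes.

import java.util.List;

// Sketch of a simple rule-based Hanabi agent (Java). The types below are
// simplified placeholders, not the competition framework's actual classes.
interface TableView {
    List<Boolean> ownCardsHintedPlayable();  // what received hints tell us about our own hand
    int infoTokens();                        // information tokens left for giving hints
}
enum MoveType { PLAY, TELL, DISCARD }
record Move(MoveType type, int cardIndex) {}

final class RuleBasedAgent {
    // Play a card we believe is playable; otherwise hint if tokens remain; otherwise discard.
    Move chooseMove(TableView table) {
        List<Boolean> beliefs = table.ownCardsHintedPlayable();
        for (int i = 0; i < beliefs.size(); i++) {
            if (beliefs.get(i)) return new Move(MoveType.PLAY, i);
        }
        if (table.infoTokens() > 0) return new Move(MoveType.TELL, 0); // hint a partner's playable card
        return new Move(MoveType.DISCARD, 0);                          // free up an information token
    }
}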
Tracks:
- Mixed track: agents will play with a set of unknown agents.
- Mirror track: agents will play with copies of the submitted agent.
Competition Webpage: hanabi.aiclash.com
Submission Server: https://comp.fossgalaxy.com/competitions/t/11
StarCraft AI Competition
Organizers:
- Kyung-Joong Kim (Sejong University, Republic of Korea; firstname.lastname@example.org),
- Seonghun Yoon (Sejong University, Republic of Korea; email@example.com)
Description: The IEEE CIG StarCraft competitions have seen considerable progress in the development and evolution of new StarCraft bots. Participants have used a variety of approaches to build their bots, which has enriched game AI with methods such as HMMs, Bayesian models, CBR, potential fields, and reinforcement learning. However, developing AI for the game remains challenging, because a bot must control a large number of units and buildings while handling resource management and high-level tactics. The purpose of this competition is to develop RTS game AI and to address challenging issues in RTS game AI such as uncertainty, real-time decision making, and unit management. Participants submit bots using BWAPI to play 1v1 StarCraft matches.
3rd Angry Birds Level Generation Competition
Organizers:
- Matthew Stephenson (Australian National University; firstname.lastname@example.org),
- Jochen Renz (Australian National University; email@example.com),
- Lucas Ferreira (UC Santa Cruz; firstname.lastname@example.org)
- and Julian Togelius (New York University, US; email@example.com)
Description: This year we will run our third Angry Birds Level Generation Competition. The goal of this competition is to build computer programs that can automatically create fun and challenging Angry Birds levels. The difficulty of this competition compared to similar competitions is that the generated levels must be stable under gravity, robust in the sense that a single action should not destroy large parts of the generated structure, and, most importantly, fun to play and challenging, that is, difficult but solvable. Participants can ensure the solvability and difficulty of their levels by using the open source Angry Birds AI agents that were developed for the Angry Birds AI competition. Each level generator will be evaluated based on the overall fun or enjoyment factor of the levels it creates. Aside from the main prize for “most enjoyable levels”, two additional prizes for “most creative levels” and “most challenging levels” will also be awarded. This evaluation will be done by an impartial panel of judges. Restrictions will be placed on which objects can be used in the generated levels (in order to prevent pre-generation of levels). We will generate 100 levels with each submitted generator and randomly select a fraction of those for the competition. There will be a penalty if levels are too similar. Each entrant will be evaluated for all prizes. More details on the competition rules can be found on the competition website aibirds.org. The competition is based on the physics game implementation “Science Birds” by Lucas Ferreira, using Unity3D.
Competition Webpage: https://aibirds.org (last year: https://aibirds.org/other-events/level-generation-competition.html)