Full Report for Iris by Craig Duncan



Black plays first.

On the first turn, black plays a SINGLE stone to any gray cell. Thereafter, each player plays TWO stones per turn, subject to the following restrictions:

  1. If the first stone of a turn is played on a coloured (perimeter) cell, then the second stone MUST be played on the corresponding same-coloured cell on the opposite side of the hexhex board. (In the case of a corner cell, the corresponding same-coloured cell is the opposite corner cell.)
  2. If the first stone of a turn is played on a gray (interior) cell, then the second stone MUST be played on an empty gray cell that is not adjacent to the first stone. (If all remaining empty gray cells are adjacent to the first stone, then the second stone is forfeited.)
Goal

When the board is full or when both players pass consecutively, the game ends and players score their groups of stones. A group’s score equals the number of coloured cells it occupies. The player with the highest-scoring group wins. If scores are tied, players ignore their highest-scoring groups and compare the highest-scoring of their remaining groups; if still tied, they compare their next-highest-scoring groups, and so on, until the tie is broken. (Despite there being an even number of coloured cells, for subtle reasons having to do with board geometry, it turns out that draws are impossible.)
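Two of the rules above are directly algorithmic: the non-adjacency restriction on interior play and the tie-break comparison between group scores. A minimal sketch in Python, using a hypothetical cell-name/adjacency representation (Ai Ai's internals are not shown in this report):

```python
def legal_second_stones(first, empty_gray, adjacent):
    """Interior rule: the second stone may go on any empty gray cell
    not adjacent to the first; if none exist, it is forfeited."""
    return empty_gray - adjacent[first] - {first}

def winner(black_scores, white_scores):
    """Compare each player's sorted group scores high-to-low until the
    tie breaks. A missing group counts as zero, so the longer list wins
    a full tie; per the rules, a complete draw cannot actually occur."""
    b = sorted(black_scores, reverse=True)
    w = sorted(white_scores, reverse=True)
    for bs, ws in zip(b, w):
        if bs != ws:
            return "Black" if bs > ws else "White"
    return "Black" if len(b) > len(w) else "White"

# Hypothetical three-cell neighbourhood: c1 touches c2 and c3.
adjacent = {"c1": {"c2", "c3"}, "c2": {"c1"}, "c3": {"c1"}}
print(legal_second_stones("c1", {"c2", "c3"}, adjacent))  # set() -> forfeit
print(winner([5, 3, 2], [5, 3, 1]))  # Black (tie broken by third group)
```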


General comments:

Play: Combinatorial

Family: Connection, Scoring, Strict Placement, Combinatorial 2019

Mechanism(s): Scoring

Components: Board

Level: Standard

BGG Stats

BGG Entry: Iris
BGG Rating: 0
BGG Weight: 0

BGG Ratings and Comments

mrraow (rating 7): Interesting - and visually appealing - game with a fine balance between claiming outer cells and connecting them. There is an inspired rule that prevents playing two adjacent central stones, which keeps bridges valid most of the time despite the 12* protocol, allowing your hex intuition to keep functioning. There are, however, cases where you can attack two bridges in a way that only allows the opponent to defend one of them.

Levels of Play

AI | Strong Wins | Draws | Strong Losses | #Games | Strong Win% | p1 Win% | Game Length
Grand Unified UCT(U1-T,rSel=s, secs=0.01) | 36 | 0 | 0 | 36 | 100.00 | 44.44 | 94.14
Grand Unified UCT(U1-T,rSel=s, secs=0.03) | 36 | 0 | 4 | 40 | 90.00 | 40.00 | 106.12
Grand Unified UCT(U1-T,rSel=s, secs=0.07) | 36 | 0 | 6 | 42 | 85.71 | 50.00 | 109.33
Grand Unified UCT(U1-T,rSel=s, secs=0.20) | 36 | 0 | 6 | 42 | 85.71 | 59.52 | 109.74
Grand Unified UCT(U1-T,rSel=s, secs=0.55) | 36 | 0 | 9 | 45 | 80.00 | 48.89 | 109.71

Level of Play: Strong beats Weak 60% of the time (lower bound with 90% confidence).
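As a cross-check, the "lower bound with 90% confidence" can be reproduced with a one-sided normal-approximation interval on the win rate. The exact method Ai Ai uses is not stated in this report, so the formula below is an assumption:

```python
import math

def win_rate_lower_bound(wins, games, z=1.2816):
    """One-sided 90% lower confidence bound on a win rate (normal
    approximation); z = 1.2816 is the 90th-percentile standard-normal
    quantile."""
    p = wins / games
    return p - z * math.sqrt(p * (1 - p) / games)

# Worst row of the table above (secs=0.55): 36 strong wins out of 45 games.
print(round(win_rate_lower_bound(36, 45), 3))  # 0.724 -> comfortably above 0.60
```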

Draw%, p1 win% and game length may give some indication of trends as AI strength increases; but be aware that the AI can introduce bias due to horizon effects, poor heuristics, etc.

Kolmogorov Complexity Estimate

Size (bytes): 29828
Reference Size: 10293

Ai Ai calculates the size of the implementation and compares it to the Ai Ai implementation of the simplest possible game (one that just fills the board). Note that this estimate may include some graphics and heuristics code as well as the game logic. See the Wikipedia entry on Kolmogorov complexity for more details.

Playout Complexity Estimate

Playouts per second: 19673.50 (50.83µs/playout)
Reference Size: 350324.05 (2.85µs/playout)
Ratio (low is good): 17.81

Tavener complexity: the heat generated by playing every possible instance of a game with a perfectly efficient programme. Since this is not possible to calculate, Ai Ai calculates the number of random playouts per second and compares it to the fastest non-trivial Ai Ai game (Connect 4). This ratio gives a practical indication of how complex the game is. Combine this with the computational state space, and you can get an idea of how strong the default (MCTS-based) AI will be.
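The ratio is simply the reference playout rate divided by this game's playout rate, using the figures from the table above:

```python
iris_rate = 19673.50       # Iris random playouts/second (from this report)
connect4_rate = 350324.05  # reference (Connect 4) playouts/second
print(round(connect4_rate / iris_rate, 2))  # 17.81, matching the table
```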

Win % By Player (Bias)

1: Black win % | 49.10 ± 3.09 | Includes draws = 50%
2: White win % | 50.90 ± 3.10 | Includes draws = 50%
Draw % | 0.00 | Percentage of games where all players draw
Decisive % | 100.00 | Percentage of games with a single winner
Samples | 1000 | Quantity of logged games played
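The ± figures are consistent with a 95% normal-approximation confidence interval over the 1000 sampled games; the interval formula below is an assumption, since the report does not state which one Ai Ai uses:

```python
import math

def ci95_width(win_pct, n):
    """Half-width of a 95% normal-approximation confidence interval
    on a win percentage, in percentage points."""
    p = win_pct / 100
    return 100 * 1.96 * math.sqrt(p * (1 - p) / n)

print(round(ci95_width(49.10, 1000), 2))  # ~3.10, close to the table's ±3.09
```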

Note that win/loss statistics may vary depending on thinking time (horizon effect, etc.), bad heuristics, bugs, and other factors, so they should be taken with a pinch of salt. (Given perfect play, any game of pure skill will always end in the same result.)

Note: Ai Ai differentiates between states where all players draw or win or lose; this is mostly to support cooperative games.

Playout/Search Speed

Label | Its/s | SD | Nodes/s | SD | Game length | SD
Random playout | 36,915 | 119 | 4,050,635 | 12,850 | 110 | 6

Random: 10 second warmup for the hotspot compiler. 100 trials of 1000ms each.

Other: 100 playouts, means calculated over the first 5 moves only to avoid distortion due to speedup at end of game.

Mirroring Strategies

Rotation (Half turn) lost each game as expected.
Reflection (X axis) lost each game as expected.
Reflection (Y axis) lost each game as expected.
Copy last move lost each game as expected.

Mirroring strategies attempt to copy the previous move. On the first move, they will attempt to play in the centre. If neither of these is possible, they will pick a random move. Each entry represents a different form of copying: direct copy, reflection in either the X or Y axis, and half-turn rotation.
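On a hexhex board these symmetries are easy to express in cube coordinates (x + y + z = 0). The coordinate convention below is an illustrative assumption, not Ai Ai's internal representation:

```python
def half_turn(cell):
    """Half-turn rotation about the board centre: negate all coordinates."""
    x, y, z = cell
    return (-x, -y, -z)

def reflect(cell):
    """One of the board's reflections: swap two cube coordinates.
    Which physical axis this corresponds to depends on orientation."""
    x, y, z = cell
    return (x, z, y)

c = (2, -1, -1)
print(half_turn(c))  # (-2, 1, 1)
print(reflect(c))    # (2, -1, -1) -- a cell on the mirror axis maps to itself
```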


Game length | 108.80
Branching factor | 56.14
Complexity | 10^176.23 | Based on game length and branching factor
Samples | 1000 | Quantity of logged games played

Computational complexity (where present) is an estimate of the game tree reachable through actual play. For each game in turn, Ai Ai marks the positions reached in a hash table, then counts the number of new moves added to the table. Once all moves are applied, it treats this sequence as a geometric progression and calculates the sum as n → infinity.
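As a sanity check on the 10^176.23 figure, the naive bound b^L (which ignores transpositions) can be computed from the mean branching factor and game length above; it should, and does, overshoot the hashed estimate:

```python
import math

b, L = 56.14, 108.80            # mean branching factor and game length (above)
naive_exponent = L * math.log10(b)  # exponent of the naive b**L bound
print(round(naive_exponent, 2))  # 190.32 -- above 176.23, as expected
```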

Move Classification

Distinct actions | 128 | Number of distinct moves (e.g. "e4") regardless of position in game tree
Good moves | 34 | A good move is selected by the AI more often than average
Bad moves | 94 | A bad move is selected by the AI less often than average
Samples | 1000 | Quantity of logged games played

Change in Material Per Turn

This chart is based on a single playout, and gives a feel for the change in material over the course of a game.


This chart shows the best move value with respect to the active player; the orange line represents the value of doing nothing (null move).

The lead changed on 11% of the game turns. Ai Ai found 0 critical turns (turns with only one good option).

Overall, this playout was 88.07% hot.

Position Heatmap

This chart shows the relative temperature of all moves each turn. Colour range: black (worst), red, orange (even), yellow, white (best).

Good/Effective moves

Measure | All players | Player 1 | Player 2
Mean % of effective moves | 71.81 | 68.44 | 74.67
Mean no. of effective moves | 31.31 | 31.28 | 31.34
Effective game space | 10^146.35 | 10^67.45 | 10^78.90
Mean % of good moves | 40.07 | 79.59 | 6.58
Mean no. of good moves | 23.37 | 41.80 | 7.75
Good move game space | 10^80.66 | 10^68.85 | 10^11.81

These figures were calculated over a single game.

An effective move is one with a score within 0.1 of the best move (including the best move itself), where -1 (loss) <= score <= 1 (win).

A good move has a score > 0. Note that when there are no good moves, a multiplier of 1 is used for the game space calculation.
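The two definitions above reduce to a few lines of code; a minimal sketch, assuming each candidate move comes with an evaluation score in [-1, 1] from the active player's perspective:

```python
def classify_moves(scores):
    """Count effective moves (within 0.1 of the best score, including
    the best move) and good moves (score > 0), per the definitions above."""
    best = max(scores)
    effective = sum(1 for s in scores if s >= best - 0.1)
    good = sum(1 for s in scores if s > 0)
    return effective, good

print(classify_moves([0.9, 0.85, 0.2, -0.3]))  # (2, 3)
```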


Table: branching factor per turn.

Action Types per Turn

This chart is based on a single playout, and gives a feel for the types of moves available over the course of a game.

Red: removal, Black: move, Blue: Add, Grey: pass, Purple: swap sides, Brown: other.

Unique Positions Reachable at Depth


Note: most games do not take board rotation and reflection into consideration.
Multi-part turns could be treated as the same or different depth depending on the implementation.
Counts to depth N include all moves reachable at lower depths.
Inaccuracies may also exist due to hash collisions, but Ai Ai uses 64-bit hashes so these will be a very small fraction of a percentage point.
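The claim that 64-bit hash collisions are negligible follows from the birthday bound; a rough sketch of the arithmetic:

```python
def expected_collisions(n, bits=64):
    """Birthday bound: expected number of colliding pairs among n
    uniformly random `bits`-bit hashes is roughly n*(n-1) / 2**(bits+1)."""
    return n * (n - 1) / 2 ** (bits + 1)

# Even a billion distinct positions yield well under one expected collision.
print(expected_collisions(10**9) < 1)  # True
```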

Shortest Game(s)


91 solutions found at depth 3.

Opening Heatmap

Colour shows the success ratio of this play over the first 10 moves; black < red < yellow < white.

Size shows the frequency this move is played.