Full Report for Low vs High by Stephen Tavener

A simple stacking game where one player makes stacks in increasing order, the other in decreasing order.



At the end of the game, the player with the highest stack wins; if there is a tie, look at the second highest stack, etc.


Each turn, move a stack like a chess rook, according to the following constraints:



In Sudden Death mode, a player wins as soon as they make a stack of maximum height.

Design notes

This started off as a game where the players were trying to assemble rainbows, either forwards or backwards.

This was very confusing, so I added numbers.

Then I realised that the colours weren't necessary, and the game wasn't limited to a 7x7 board.

That's the way game design goes sometimes; I left the colours in the Ai Ai implementation for sentimental reasons, but you could play with number or letter tiles if so inclined.


General comments:

Play: Combinatorial, Random Setup

Mechanism(s): Scoring, Stacking

Components: Board

BGG Stats

BGG Entry: Low vs High
BGG Rating: null
BGG Weight: null

Levels of Play

AI | Wins vs prev level | Draws | Losses | #Games | Win%
Grand Unified UCT(U1-T,rSel=u, secs=0.01) | 36 | 0 | 0 | 36 | 100.00
Grand Unified UCT(U1-T,rSel=u, secs=0.07) | 36 | 0 | 5 | 41 | 87.80
Grand Unified UCT(U1-T,rSel=u, secs=0.20) | 35 | 2 | 13 | 50 | 72.00
Grand Unified UCT(U1-T,rSel=u, secs=1.48) | 36 | 0 | 10 | 46 | 78.26

Level of Play: strong beats weak 60% of the time (lower bound with 90% confidence).
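A lower confidence bound on a win rate like this can be computed with a one-sided Wilson score interval. The report does not say which interval Ai Ai actually uses, so the sketch below is an assumption, checked against the secs=0.07 level above (36 wins in 41 games):

```python
from math import sqrt

def wilson_lower(wins, games, z=1.2816):
    """One-sided lower confidence bound on a win rate.

    z = 1.2816 corresponds to 90% one-sided confidence.
    (Assumption: the report does not name its exact method.)
    """
    p = wins / games
    denom = 1 + z * z / games
    centre = p + z * z / (2 * games)
    margin = z * sqrt(p * (1 - p) / games + z * z / (4 * games * games))
    return (centre - margin) / denom

# 36 wins in 41 games gives a lower bound of roughly 0.80,
# comfortably above the 0.60 threshold quoted above.
bound = wilson_lower(36, 41)
```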

Kolmogorov Complexity Estimate

Size (bytes): 28734
Reference size (bytes): 10328

Ai Ai calculates the size of the implementation and compares it to the Ai Ai implementation of the simplest possible game (one which just fills the board). Note that this estimate may include some graphics and heuristics code as well as the game logic. See the Wikipedia entry for more details.
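The comparison in the table reduces to a simple ratio of implementation sizes; a quick check of the figures above:

```python
# Figures from the table above
size = 28734        # bytes: Low vs High implementation
reference = 10328   # bytes: simplest possible Ai Ai game
ratio = size / reference  # roughly 2.78x the reference size
```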

Playout Complexity Estimate

Playouts per second: 20590.07 (48.57µs/playout)
Reference (Connect 4) playouts per second: 545970.74 (1.83µs/playout)
Ratio (low is good): 26.52

Tavener complexity: the heat generated by playing every possible instance of a game with a perfectly efficient programme. Since this is not possible to calculate, Ai Ai calculates the number of random playouts per second and compares it to the fastest non-trivial Ai Ai game (Connect 4). This ratio gives a practical indication of how complex the game is. Combine this with the computational state space, and you can get an idea of how strong the default (MCTS-based) AI will be.
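The ratio in the table is just the reference playout rate divided by this game's playout rate (equivalently, this game's per-playout time divided by Connect 4's):

```python
game_pps = 20590.07        # playouts/second: Low vs High
reference_pps = 545970.74  # playouts/second: Connect 4 (fastest non-trivial Ai Ai game)
ratio = reference_pps / game_pps  # roughly 26.52: each playout costs ~26x a Connect 4 playout
```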

Win % By Player (Bias)

Player 1 (Low) win %: 56.22 ± 0.34 (includes draws = 50%)
Player 2 (High) win %: 43.78 ± 0.34 (includes draws = 50%)
Draw %: 0.49 (percentage of games where all players draw)
Decisive %: 99.51 (percentage of games with a single winner)
Samples: 80658 (quantity of logged games played)

Note that win/loss statistics may vary depending on thinking time (horizon effect, etc.), bad heuristics, bugs, and other factors, so they should be taken with a pinch of salt. (Given perfect play, any game of pure skill will always end in the same result.)

Note: Ai Ai differentiates between states where all players draw or win or lose; this is mostly to support cooperative games.
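The ±0.34 margin quoted above is consistent with a 95% normal-approximation confidence interval over the 80658 samples. That the interval is the standard two-sided 95% one is an assumption; the report does not state the method:

```python
from math import sqrt

p = 0.5622   # Player 1 win rate (draws counted as 50%)
n = 80658    # logged games
se = sqrt(p * (1 - p) / n)   # standard error of the proportion
margin = 1.96 * se * 100     # 95% two-sided margin, in percentage points
# margin comes out at roughly 0.34, matching the table
```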

Mirroring Strategies

Rotation (Half turn) lost each game as expected.
Reflection (X axis) lost each game as expected.
Reflection (Y axis) lost each game as expected.
Copy last move lost each game as expected.

Mirroring strategies attempt to copy the previous move. On first move, they will attempt to play in the centre. If neither of these are possible, they will pick a random move. Each entry represents a different form of copying; direct copy, reflection in either the X or Y axis, half-turn rotation.


Game length: 36.04
Branching factor: 27.89
Complexity: 10^41.03 (based on game length and branching factor)
Computational complexity: 10^8.48 (sample quality (100 best): 15.15)
Samples: 80658 (quantity of logged games played)

Move Classification

Distinct actions: 590 (number of distinct moves, e.g. "e4", regardless of position in game tree)
Good moves: 420 (a good move is selected by the AI more often than average)
Bad moves: 169 (a bad move is selected by the AI less often than average)
Samples: 80658 (quantity of logged games played)

Change in Material Per Turn

This chart is based on a single playout, and gives a feel for the change in material over the course of a game.


This chart shows the best move value with respect to the active player; the orange line represents the value of doing nothing (null move).

The lead changed on 8% of the game turns. Ai Ai found 2 critical turns (turns with only one good option).

Overall, this playout was 70.59% hot.

Position Heatmap

This chart shows the relative temperature of all moves each turn. Colour range: black (worst), red, orange (even), yellow, white (best).


Table: branching factor per turn.

Action Types per Turn

This chart is based on a single playout, and gives a feel for the types of moves available over the course of a game.

Red: removal, Black: move, Blue: add, Grey: pass, Purple: swap sides, Brown: other.

Unique Positions Reachable at Depth


Note: most games do not take board rotation and reflection into consideration.
Multi-part turns could be treated as the same or different depth depending on the implementation.
Counts to depth N include all moves reachable at lower depths.
Inaccuracies may also exist due to hash collisions, but Ai Ai uses 64-bit hashes so these will be a very small fraction of a percentage point.
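The claim that 64-bit hash collisions are negligible follows from the birthday bound. A hypothetical sketch (the position counts used here are illustrative, not taken from this report):

```python
def birthday_collision_estimate(n, bits=64):
    """Approximate probability of at least one collision among
    n random hashes of the given width (birthday bound n^2 / 2^(bits+1))."""
    return n * (n - 1) / 2 / 2 ** bits

# Even a billion distinct positions yields only a few percent chance
# of a single collision anywhere, so the effect on counts is tiny.
p_million = birthday_collision_estimate(10**6)
p_billion = birthday_collision_estimate(10**9)
```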

Shortest Game(s)

No solutions found to depth 4.

Puzzles

Player 2 (High) to win in 13 moves

Player 1 (High) to win in 13 moves

Player 2 (Low) to win in 11 moves

Player 2 (High) to win in 9 moves

Player 2 (Low) to win in 7 moves

Player 1 (High) to win in 5 moves

Player 1 (High) to win in 7 moves

Player 2 (High) to win in 8 moves

Player 1 (High) to win in 7 moves

Player 2 (Low) to win in 5 moves

Player 2 (High) to win in 5 moves

Player 1 (High) to win in 3 moves

Selection criteria: the first move must be unique, and not forced to avoid losing. Beyond that, puzzles are rated by the product of [total moves]/[best moves] at each step, and the best puzzles selected.