Full Report for Web of Flies (2p, rocks) by Stephen Tavener

The fight for the web



The board is filled randomly.

After the initial setup, each player in turn may swap any two pieces; this modified pie rule allows the players to balance unlucky starting positions. Note: in Ai Ai, the swaps have been deduplicated, so if you can't make the swap you want with one piece, try the other one!


Spiders move any distance in a straight line, passing over friendly pieces and empty spaces, but must end their move on an enemy spider or a fly. The number associated with a piece is its strength; a spider may only eat a piece of equal or lower strength (captured piece's number <= capturing piece's number).

Movement is compulsory where possible. If you cannot move, you must pass, but you may be able to move again later in the game.
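The movement and capture rules above can be sketched in code. This is a hypothetical model, not Ai Ai's actual implementation: board cells use axial hex coordinates (q, r), and `board` maps occupied cells to (owner, strength) pairs. Flies can be modelled as ownerless pieces of strength 0 (capturable by any spider), and rocks as pieces of infinite strength (never capturable, always blocking).

```python
HEX_DIRS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def spider_moves(board, cells, pos, player):
    """Yield legal landing cells for the spider at `pos` belonging to `player`."""
    _owner, strength = board[pos]
    for dq, dr in HEX_DIRS:
        q, r = pos
        while True:
            q, r = q + dq, r + dr
            if (q, r) not in cells:        # slid off the web
                break
            piece = board.get((q, r))
            if piece is None:              # empty cell: keep sliding
                continue
            tgt_owner, tgt_strength = piece
            if tgt_owner == player:        # friendly piece: pass over it
                continue
            # the first enemy piece (or fly/rock) ends the line; it is a
            # legal landing cell only if it is weak enough to capture
            if tgt_strength <= strength:
                yield (q, r)
            break
```

Modelling rocks as infinitely strong pieces means the same loop handles both variants: a rock blocks the line but never satisfies the capture test.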

Game End

The game ends when all players pass in succession.

The player with the largest spider(s) at the end of the game wins.


There are several spaces on the web that are not occupied by spiders at the start of the game. By default, these are occupied by tasty flies, which can be captured by any spider. Instead, you can play with rocks, which cannot be captured.

Design Notes

Web of Flies is an evolution of an earlier design of mine, Box of Spiders. The original version was inspired by a magazine article on farming spider silk, which noted that if you start with a box of spiders, pretty soon you'll be left with just one rather mean-looking spider.


General comments:

Play: Random Setup, Combinatorial, Themed

Mechanism(s): Capture, Scoring

BGG Stats

BGG Entry   Web of Flies (2p, rocks)
BGG Rating  6.96429
BGG Weight  3

BGG Ratings and Comments

Oliphant (4.5): Cute name. Cool spider graphics. Not my favorite PnP abstract game but definitely worth a play through!
mrraow (10): Disclaimer; I'm the inventor. Mini design story... The first iteration of this game was a card game called Box of Spiders, which I designed for a card game competition. (BoF was, itself, inspired by a magazine article on commercial uses of spider silk, which pointed out that if you start with a box of spiders, it isn't long before you end up with a box containing one big, mean looking spider. But I digress.) The game was fun, with a feel similar to DVONN, but the endgame was somewhat lacking. After the competition, I played the game a few more times, and realised that it would work better on a hexagonal grid, something that has since become an axiom - if it works on a square grid, try it on a hex grid; it will probably be better :) ... and it was. Anyhow, I'm really, REALLY pleased with Web of Flies. The asymmetry (move over friendly pieces, not over enemy pieces) gives lots of scope for revealed threats, sacrifices, and the like. Look at the end of the rules here on BGG for some puzzles, which will show you some of the potential of the game. AI now available - send me a geekmail with your email address if you want a copy (java 8 required)
Kaffedrake (4): This is reminiscent of a much simplified Tzaar: a number of stacks of constant size bashing each other on a hexagonal grid. Play revolves around capturing your opponent's biggest pieces, your weapons including move starvation and sniping from behind your own pieces. The result feels a little like Stratego without any way to recover from a setback. Note that there are starting positions that cannot be balanced with a piece swap, making the game seem unfinished.
slaqr (6): [url=http://www.youtube.com/watch?v=yPrsjtw4dsk]Dice Tower Reviews - Web of Flies[/url]
zefquaavius (8): As a 2-player game, this is a tricky game, well-themed with the spiders: Sneaking around and keeping yourself poised to trap and devour the important pieces is crucial. Sacrifices can't be made lightly, due to the run-off tie-breaker: You may win by having more 5-leggers (or weaker ones still!) than your opponent.
The Player of Games (7): Nice little abstract game with a cute spider and fly theme. Interesting by itself and easy to get to the table (when playing kids and casual gamers) due to the funny theme. I have the expansion for 3-4 players.

Levels of Play

AI                                         Strong Wins  Draws  Strong Losses  #Games  Strong Win%  p1 Win%  Game Length
Grand Unified UCT(U1-T,rSel=s, secs=0.01)  36           0      0              36      100.00       61.11    28.08
Grand Unified UCT(U1-T,rSel=s, secs=0.07)  36           0      2              38      94.74        63.16    29.66
Grand Unified UCT(U1-T,rSel=s, secs=0.55)  36           0      11             47      76.60        55.32    29.77

Level of Play: Strong beats Weak 60% of the time (lower bound with 90% confidence).

Draw%, p1 win% and game length may give some indication of trends as AI strength increases; but be aware that the AI can introduce bias due to horizon effects, poor heuristics, etc.
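The "lower bound with 90% confidence" can be approximated with a one-sided normal-approximation bound on the observed win rate. This is a sketch of the general idea, not necessarily the exact formula Ai Ai uses:

```python
import math

def winrate_lower_bound(wins, games, z=1.2816):
    """One-sided lower confidence bound on a win rate (normal
    approximation); z = 1.2816 corresponds to 90% confidence."""
    p = wins / games
    return p - z * math.sqrt(p * (1 - p) / games)
```

For example, the last row above (36 strong wins in 47 games, 76.60% observed) yields a lower bound around 69%, comfortably above the reported 60% threshold.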

Kolmogorov Complexity Estimate

Size (bytes)    26324
Reference Size  10293

Ai Ai calculates the size of the implementation, and compares it to the Ai Ai implementation of the simplest possible game (which just fills the board). Note that this estimate may include some graphics and heuristics code as well as the game logic. See the Wikipedia entry for more details.

Playout Complexity Estimate

Playouts per second  25830.18 (38.71µs/playout)
Reference rate       1379310.34 (0.72µs/playout)
Ratio (low is good)  53.40

Tavener complexity: the heat generated by playing every possible instance of a game with a perfectly efficient programme. Since this is not possible to calculate, Ai Ai calculates the number of random playouts per second and compares it to the fastest non-trivial Ai Ai game (Connect 4). This ratio gives a practical indication of how complex the game is. Combine this with the computational state space, and you can get an idea of how strong the default (MCTS-based) AI will be.
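The ratio itself is just the reference playout rate divided by this game's playout rate. A trivial sketch using the figures from the table above:

```python
def playout_ratio(reference_pps, game_pps):
    """Tavener complexity ratio: reference (Connect 4) playouts/sec
    divided by this game's playouts/sec; low is good."""
    return reference_pps / game_pps

# Figures from the table above:
ratio = playout_ratio(1_379_310.34, 25_830.18)   # ~53.40
```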

Win % By Player (Bias)

1: White win %  60.15 ± 3.07  Includes draws = 50%
2: Black win %  39.85 ± 2.99  Includes draws = 50%
Draw %          0.50          Percentage of games where all players draw
Decisive %      99.50         Percentage of games with a single winner
Samples         1000          Quantity of logged games played

Note that win/loss statistics may vary depending on thinking time (horizon effect, etc.), bad heuristics, bugs, and other factors, so should be taken with a pinch of salt. (Given perfect play, any game of pure skill will always end in the same result.)

Note: Ai Ai differentiates between states where all players draw or win or lose; this is mostly to support cooperative games.

Playout/Search Speed

Label           Its/s   SD   Nodes/s  SD     Game length  SD
Random playout  28,851  119  889,969  3,691  31           2

Random: 10 second warmup for the hotspot compiler. 100 trials of 1000ms each.

Other: 100 playouts, means calculated over the first 5 moves only to avoid distortion due to speedup at end of game.

Mirroring Strategies

Rotation (Half turn) lost each game as expected.
Reflection (X axis) lost each game as expected.
Reflection (Y axis) lost each game as expected.
Copy last move lost each game as expected.

Mirroring strategies attempt to copy the previous move. On the first move, they will attempt to play in the centre. If neither of these is possible, they pick a random move. Each entry represents a different form of copying: direct copy, reflection in the X or Y axis, or half-turn rotation.
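As a sketch, the copying transforms can be expressed in axial hex coordinates (q, r). Which reflection corresponds to the board's X or Y axis depends on the layout Ai Ai uses, so the names here are assumptions:

```python
def half_turn(cell):
    """Half-turn rotation about the board centre."""
    q, r = cell
    return (-q, -r)

def reflect_q_axis(cell):
    """Reflection across one hex axis (cube coords: swap y and z)."""
    q, r = cell
    return (q, -q - r)

def reflect_r_axis(cell):
    """Reflection across another hex axis (cube coords: swap x and z)."""
    q, r = cell
    return (-q - r, r)

def mirror_move(move, transform):
    """Copy the opponent's last (src, dst) move under a board symmetry."""
    src, dst = move
    return (transform(src), transform(dst))
```

Each transform is an involution (applying it twice returns the original cell), which is what makes the copying strategy well defined.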


Game length               30.11
Branching factor          54.76
Complexity                10^28.56  Based on game length and branching factor
Computational Complexity  10^6.01   Sample quality (100 best): 31.95
Samples                   1000      Quantity of logged games played

Computational complexity (where present) is an estimate of the game tree reachable through actual play. For each game in turn, Ai Ai marks the positions reached in a hashtable, then counts the number of new moves added to the table. Once all moves are applied, it treats this sequence as a geometric progression and calculates the sum as n → ∞.
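A hypothetical sketch of that estimate: given the count of previously unseen positions contributed by each successive game, fit a geometric ratio and add the implied infinite tail to the total already seen.

```python
def reachable_positions_estimate(new_per_game):
    """Estimate total reachable positions from per-game counts of newly
    seen positions, summing the fitted geometric tail to infinity.
    (Sketch of the method described above, not Ai Ai's exact code.)"""
    # geometric ratio fitted from the first and last counts
    r = (new_per_game[-1] / new_per_game[0]) ** (1 / (len(new_per_game) - 1))
    seen = sum(new_per_game)
    tail = new_per_game[-1] * r / (1 - r)   # terms beyond the last game
    return seen + tail
```

For a perfectly geometric sequence like [1000, 500, 250, 125] this recovers the exact series sum, 1000 / (1 - 0.5) = 2000.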

Move Classification

Distinct actions  908   Number of distinct moves (e.g. "e4") regardless of position in game tree
Good moves        311   A good move is selected by the AI more often than the average
Bad moves         596   A bad move is selected by the AI less often than the average
Terrible moves    38    A terrible move is never selected by the AI
Terrible moves: g4-c4, g2-e6, a5-e1, b7-b3, b5-f1, c4-e2, c4-f1, c4-g4, c4-a4, c4-b4, b4-a7, b4-a6, d4-a4, d4-b4, f4-f2, a6-f1, e3-a4, f3-e3, d1-c4, a6-c7, e1-c7, f3-b5, e1-d4, f1-c7, f1-b6, a4-b5, g1-f5, f1-d6, b4-e4, g1-d7, g1-e4, f1-a5, b3-d7, d3-e6, c3-b6, d3-a7, e3-d7, d3-c5
Samples           1000  Quantity of logged games played

Change in Material Per Turn

This chart is based on a single playout, and gives a feel for the change in material over the course of a game.


This chart shows the best move value with respect to the active player; the orange line represents the value of doing nothing (null move).

The lead changed on 3% of the game turns. Ai Ai found 1 critical turn (turns with only one good option).

Overall, this playout was 45.45% hot.

Position Heatmap

This chart shows the relative temperature of all moves each turn. Colour range: black (worst), red, orange (even), yellow, white (best).

Good/Effective moves

Measure                      All players  Player 1  Player 2
Mean % of effective moves    61.98        58.23     65.96
Mean no. of effective moves  14.33        21.76     6.44
Effective game space         10^17.28     10^11.54  10^5.74
Mean % of good moves         36.72        71.29     0.00
Mean no. of good moves       3.94         7.65      0.00
Good move game space         10^12.08     10^12.08  10^0.00

These figures were calculated over a single game.

An effective move is one with a score within 0.1 of the best move (including the best move itself); scores range from -1 (loss) to 1 (win).

A good move has a score > 0. Note that when there are no good moves, a multiplier of 1 is used for the game space calculation.
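The "game space" rows above are products of per-turn move counts, reported as powers of ten. A sketch of that calculation, including the multiplier-of-1 rule for turns with no qualifying moves:

```python
import math

def game_space_log10(counts_per_turn):
    """log10 of the product of per-turn effective (or good) move counts.
    A turn with zero qualifying moves contributes a multiplier of 1,
    i.e. log10(1) = 0, per the note above."""
    return sum(math.log10(max(n, 1)) for n in counts_per_turn)
```

So a game with per-turn counts [10, 10, 0, 100] has a game space of 10^4.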


Table: branching factor per turn.

Action Types per Turn

This chart is based on a single playout, and gives a feel for the types of moves available over the course of a game.

Red: removal, Black: move, Blue: Add, Grey: pass, Purple: swap sides, Brown: other.

Unique Positions Reachable at Depth


Note: most games do not take board rotation and reflection into consideration.
Multi-part turns could be treated as the same or different depth depending on the implementation.
Counts to depth N include all moves reachable at lower depths.
Inaccuracies may also exist due to hash collisions, but Ai Ai uses 64-bit hashes so these will be a very small fraction of a percentage point.

Shortest Game(s)

No solutions found to depth 3.

Opening Heatmap

Colour shows the success ratio of this play over the first 10 moves; black < red < yellow < white.

Size shows the frequency this move is played.



Puzzles (one diagram per entry; diagrams not included in this text export):

 1. Black to win in 19 moves
 2. Black to win in 19 moves
 3. Black to win in 19 moves
 4. Black to win in 16 moves
 5. Black to win in 19 moves
 6. Black to win in 19 moves
 7. Black to win in 15 moves
 8. Black to win in 17 moves
 9. White to win in 17 moves
10. Black to win in 17 moves
11. White to win in 11 moves
12. Black to win in 18 moves
13. White to win in 13 moves
14. Black to win in 17 moves
15. Black to win in 13 moves
16. White to win in 15 moves
17. Black to win in 15 moves
18. Black to win in 13 moves
19. White to win in 13 moves
20. Black to win in 15 moves
21. Black to win in 12 moves
22. Black to win in 11 moves
23. White to win in 11 moves
24. Black to win in 21 moves
25. White to win in 11 moves
26. White to win in 13 moves
27. Black to win in 13 moves

Selection criteria: the first move must be unique, and not forced in order to avoid losing. Beyond that, puzzles are rated by the product of [total moves]/[best moves] at each step, and the best puzzles selected.