A game of planting and harvesting magical flowers.
Generated at 01/06/2020, 19:42 from 1000 logged games.
Representative game (in the sense of being of mean length). Wherever you see the 'representative game' referred to in later sections, this is it!
Players take turns to place two pieces each in the playing area, so that no two pieces touch orthogonally.
Each turn, a player performs the following steps:
Points are scored using the flowers themselves, so the pool gradually depletes as the game continues.
Black flowers are deemed more valuable than white flowers, so the last player to complete a black row will win in the case of a tie.
The game ends when no further moves can be made. The three possible reasons a player cannot make a move are that there are no seeds left to play, there are no seeds left on the board to plant next to (the board is empty), or there are no spaces left on the board to plant on (the board is full).
The player who has harvested the most flowers wins.
Family: Combinatorial 2019, Minimal
Mechanism(s): Scoring, Shared Pieces
|BGG Entry||Wizard's Garden|
|tjgames||10||Hey, I made it, so I have to give it a 10, right? Well, the truth is, even if I hadn't, I'd probably give it at least an 8, more likely a 9. I love playing this game and I still play it via SuperDuperGames.org regularly. Drop by and tell them BlueIstari sent you.|
|mrraow||7||Quick, clever game on a small board with shared pieces.|
|metallaio||10||always in my backpack|
|PBrennan||5||You'd probably call this a micro-abstract, kind of a mini-GIPF meets Connect-4, aiming to get 4 in a row in the same colour in a 4x4 grid. Each time you do you keep one of those tokens as your score and the rest of the row gets wiped. Eventually the pieces run out and whoever's scored the most rows wins. The cool thing was that you could play a token in either colour, and then each orthogonally adjacent token flipped to the same colour. That certainly made things a bit thinky. It was probably just inexperience and look-ahead failure, but we kept getting into situations where it didn't matter what you played, the opponent was going to score, and that lessened the game for us. I'm not a huge fan of abstracts to begin with so it didn't win me over, but I didn't mind playing it either.|
|cerulean||7||A standout among four-in-a-row games. Look before you leap!|
|The Duke of Kestrel||7.5|
|nycavri||4||Hurts my head.|
|fivecats||N/A||Copy for review in GNM|
|i7dealer||7||A nice abstract game. I think it's hard to see what's happening in your first game or two. It seems to be somewhat of a game of errors, in that you maneuver not to give your opponent anything, until a row is forced to be given up.|
|Wentu||6.3||Extremely simple and fast abstract with nice components. This is the first time I've seen the little bag for the pieces doubling as a board. That's smart! The game is very simple and interesting, but I am convinced its decision tree is too small and it could easily be solved.|
|codinh||7.5||I'm amazed how such a simple game with basic components can bring something new to the table. WG is very clever in its mechanic, and requires a few plays to really get it. After that, it's a neat abstract game with a low footprint and very portable (the bag is the playing field!), easy to learn but hard to master. Good for a quick game and as a smart puzzle with your spouse, a friend or whoever you want.|
|mnemonicuz||5||A pretty good abstract 4-in-a-row game where you have to focus on not setting your opponent up for a score rather than scoring yourself. It works well, but I think the games in the GIPF series do most aspects of this game better. It also has a nice starving mechanism, where the game ends when there are no more pieces to add, but again, the GIPF games do it better, both in starving and in self-penalizing for catch-up when you score.|
|Striton||8||I really love this game. I've been trying to pin down why, exactly, because the mechanics are quite simple. Maybe it's just "where I am" right now in my gaming, but I find this game a breath of fresh air. I highly recommend it.|
|Niart||N/A||Very elegant little game - components put to good use, in my opinion.|
|Abdul||7||Connect Four meets Othello. Satisfying amount of brain burn for such a compact package. Recommended if you like abstracts.|
|_Randi_||7||[RESERVED] 9€ The game and its components are good, but... green + red?! It's not a colour-blind-friendly choice at all! :( Next time maybe use blue + red, or choose different flower designs. (My copy came with 17 seeds, so I use the extra one as the tie breaker, instead of the Wizard's Tome, a.k.a. the rulebook.)|
|Tritrow||5||A more complex version of connect four. Interesting idea and plays fast but at the end of the day it's nothing special.|
|KubaP||8.2||A VERY smart abstract. Really good.|
|Kytty||6||PnP from about.com. Nice little twist using Reversi pieces.|
|herace||N/A||Wishlist(3) = Print & play game -or- Never was a boxed set|
|Nischtewird||6||16E in Libraria|
|glaurent||N/A||Playable using 4x4 grid, othello pieces, pawn.|
|daveo1234||7||Another one of the "bag is the board" games. Fun little abstract. After two plays, we definitely got better at it as we went. It's fun, especially when first learning it. I have a nagging feeling that it's solvable though. We'll see! (maybe:) 2 2p - 7|
Ai Ai calculates the size of the implementation and compares it to the Ai Ai implementation of the simplest possible game (one which just fills the board). Note that this estimate may include some graphics and heuristics code as well as the game logic. See the Wikipedia entry for more details.
|Playouts per second||45448.14 (22.00µs/playout)|
|Reference Size||537230.04 (1.86µs/playout)|
|Ratio (low is good)||11.82|
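The ratio above is just the per-playout cost of the game relative to the reference implementation. A minimal sketch, reconstructed from the figures in the table (not Ai Ai's actual code):

```python
# Reconstruction of the ratio reported above; the figures come from the
# table, the arithmetic is the obvious reading of "Ratio (low is good)".
game_pps = 45448.14         # Wizard's Garden playouts per second
reference_pps = 537230.04   # reference (fill-the-board) playouts per second

game_us = 1e6 / game_pps            # ~22.00 microseconds per playout
reference_us = 1e6 / reference_pps  # ~1.86 microseconds per playout

ratio = game_us / reference_us      # equivalently reference_pps / game_pps
print(f"ratio = {ratio:.2f}")       # ~11.82
```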
Tavener complexity: the heat generated by playing every possible instance of a game with a perfectly efficient programme. Since this is not possible to calculate, Ai Ai calculates the number of random playouts per second and compares it to the fastest non-trivial Ai Ai game (Connect 4). This ratio gives a practical indication of how complex the game is. Combine this with the computational state space, and you can get an idea of how strong the default (MCTS-based) AI will be.
Random: 10 second warmup for the hotspot compiler. 100 trials of 1000ms each.
Other: 100 playouts, means calculated over the first 5 moves only to avoid distortion due to speedup at end of game.
Rotation (Half turn) lost each game as expected.
Reflection (X axis) lost each game as expected.
Reflection (Y axis) lost each game as expected.
Copy last move lost each game as expected.
Mirroring strategies attempt to copy the previous move. On the first move, they will attempt to play in the centre. If neither of these is possible, they will pick a random move. Each entry represents a different form of copying: direct copy, reflection in the X or Y axis, or half-turn rotation.
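These copying strategies are simple enough to sketch. A hypothetical implementation on the 4x4 board, assuming (col, row) coordinates in 0..3 (Ai Ai's internal representation may differ):

```python
import random

# Hypothetical sketch of the mirroring strategies described above, on the
# 4x4 board. Moves are (col, row) pairs with coordinates in 0..3.
N = 4

def copy_move(m):                 # direct copy of the previous move
    return m

def reflect_x(m):                 # reflection in the X axis
    c, r = m
    return (c, N - 1 - r)

def reflect_y(m):                 # reflection in the Y axis
    c, r = m
    return (N - 1 - c, r)

def rotate_half_turn(m):          # half-turn (180 degree) rotation
    c, r = m
    return (N - 1 - c, N - 1 - r)

def mirror_policy(transform, last_opponent_move, legal_moves, rng):
    """Copy the opponent's transformed move if legal, else fall back to a
    random legal move (first-move centre handling omitted here: an
    even-sized board has no single centre cell)."""
    if last_opponent_move is not None:
        m = transform(last_opponent_move)
        if m in legal_moves:
            return m
    return rng.choice(sorted(legal_moves))
```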
|1: Player 1 win %||37.75±2.95||Includes draws = 50%|
|2: Player 2 win %||62.25±3.05||Includes draws = 50%|
|Draw %||4.50||Percentage of games where all players draw.|
|Decisive %||95.50||Percentage of games with a single winner.|
|Samples||1000||Quantity of logged games played|
Note that win/loss statistics may vary depending on thinking time (horizon effect, etc.), bad heuristics, bugs, and other factors, so they should be taken with a pinch of salt. (Given perfect play, any game of pure skill will always end in the same result.)
Note: Ai Ai differentiates between states where all players draw or win or lose; this is mostly to support cooperative games.
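The ± margins in the table above look like 95% normal-approximation confidence intervals on the mean score, with a draw counted as half a win. That reading is an assumption on my part: it lands close to, but not exactly on, the reported ±2.95/±3.05.

```python
import math

# Assumed reconstruction of the ± margin: 95% normal-approximation interval
# on the mean score, with a draw counted as half a win. From the table:
# 1000 games, 4.5% draws, player 1 at 37.75% including draws, so 355
# outright player-1 wins and 45 draws.
def win_pct_with_margin(wins, draws, games, z=1.96):
    scores = [1.0] * wins + [0.5] * draws + [0.0] * (games - wins - draws)
    mean = sum(scores) / games
    var = sum((s - mean) ** 2 for s in scores) / games
    return 100 * mean, 100 * z * math.sqrt(var / games)

pct, margin = win_pct_with_margin(wins=355, draws=45, games=1000)
print(f"{pct:.2f} ± {margin:.2f}")  # 37.75 ± ~2.93
```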
|AI||Strong Wins||Draws||Strong Losses||#Games||Strong Win%||p1 Win%||Game Length|
|Grand Unified UCT(U1-T,rSel=s, secs=0.01)||36||0||0||36||100.00||41.67||39.61|
|Grand Unified UCT(U1-T,rSel=s, secs=0.07)||35||2||8||45||80.00||46.67||47.62|
|Grand Unified UCT(U1-T,rSel=s, secs=1.48)||35||3||10||48||76.04||40.62||48.04|
Level of Play: Strong beats Weak 60% of the time (lower bound with 90% confidence).
Draw%, p1 win% and game length may give some indication of trends as AI strength increases; but be aware that the AI can introduce bias due to horizon effects, poor heuristics, etc.
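The "60% lower bound with 90% confidence" can be sketched as a one-sided lower confidence bound on the strong side's win rate. Ai Ai's exact statistic is not stated, so this normal-approximation version is an assumption:

```python
import math

# Assumed sketch of a one-sided 90% lower confidence bound on the strong
# AI's win rate over the weak one (normal approximation; the exact
# statistic Ai Ai uses is not stated in the report).
def strong_win_lower_bound(wins, games, z=1.2816):  # z for one-sided 90%
    p = wins / games
    return p - z * math.sqrt(p * (1 - p) / games)

# e.g. 80 wins in 100 games gives a lower bound of about 0.75,
# comfortably above the 0.60 threshold quoted above:
lb = strong_win_lower_bound(80, 100)
print(f"{lb:.3f}")
```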
|Branching factor||13.66|| |
|Complexity||10^54.84||Based on game length and branching factor|
|Samples||1000||Quantity of logged games played|
Computational complexity (where present) is an estimate of the game tree reachable through actual play. For each game in turn, Ai Ai marks the positions reached in a hashtable, then counts the number of new moves added to the table. Once all moves are applied, it treats this sequence as a geometric progression and calculates the sum as n → infinity.
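Both figures can be sketched in a few lines. Part (a) checks the "game length and branching factor" complexity against the table; part (b) is my reading of the geometric-progression extrapolation, not Ai Ai's actual code:

```python
import math

# (a) Complexity from branching factor and game length, as the table notes:
# complexity ~ branching_factor ** game_length. With the values reported
# above, this lands near the quoted 10^54.84.
branching, length = 13.66, 48.04
log10_complexity = length * math.log10(branching)   # ~54.55

# (b) Assumed sketch of the reachable-game-tree estimate: treat the number
# of new positions contributed by each successive logged game as a
# geometric progression and sum it as n -> infinity.
def reachable_positions(new_per_game):
    a = new_per_game[0]
    ratios = [cur / prev
              for prev, cur in zip(new_per_game, new_per_game[1:]) if prev > 0]
    r = sum(ratios) / len(ratios)                   # crude common-ratio fit
    return math.inf if r >= 1.0 else a / (1.0 - r)  # a + a*r + a*r^2 + ...
```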
|Distinct actions||33||Number of distinct moves (e.g. "e4") regardless of position in game tree|
|Good moves||15||A good move is selected by the AI more than the average|
|Bad moves||17||A bad move is selected by the AI less than the average|
|Response distance||2.47||Mean distance between move and response; a low value relative to the board size may mean a game is tactical rather than strategic.|
|Samples||1000||Quantity of logged games played|
A mean of 99.73% of board locations were used per game.
Colour and size show the frequency of visits.
Game length frequencies.
This chart is based on a single representative* playout, and gives a feel for the change in material over the course of a game. (* Representative in the sense that it is close to the mean length.)
Table: branching factor per turn, based on a single representative* game. (* Representative in the sense that it is close to the mean game length.)
This chart is based on a single representative* game, and gives a feel for the types of moves available throughout that game. (* Representative in the sense that it is close to the mean game length.)
Red: removal, Black: move, Blue: add, Grey: pass, Purple: swap sides, Brown: other.
This chart shows the best move value with respect to the active player; the orange line represents the value of doing nothing (null move).
The lead changed on 10% of the game turns. Ai Ai found 5 critical turns (turns with only one good option).
This chart shows the relative temperature of all moves each turn. Colour range: black (worst), red, orange(even), yellow, white(best).
|Measure||All players||Player 1||Player 2|
|Mean % of effective moves||59.24||53.14||65.58|
|Mean no. of effective moves||8.37||7.92||8.83|
|Effective game space||10^37.67||10^17.92||10^19.75|
|Mean % of good moves||26.94||41.83||11.42|
|Mean no. of good moves||3.55||5.12||1.92|
|Good move game space||10^17.07||10^11.27||10^5.79|
These figures were calculated over a single game.
An effective move is one whose score is within 0.1 of the best move's (including the best move itself). -1 (loss) <= score <= 1 (win)
A good move has a score > 0. Note that when there are no good moves, a multiplier of 1 is used for the game space calculation.
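The two definitions above can be expressed directly; a minimal sketch of my reading, with scores in [-1, 1]:

```python
# Sketch of the classifications described above: an effective move scores
# within 0.1 of the best move, a good move scores above 0, and a turn with
# no good moves contributes a multiplier of 1 to the game-space product.
def classify_moves(scores):
    best = max(scores)
    effective = sum(1 for s in scores if s >= best - 0.1)
    good = sum(1 for s in scores if s > 0)
    return effective, good, max(good, 1)
```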
|Hot turns||71.43%||A hot turn is one where making a move is better than doing nothing.|
|Momentum||20.41%||% of turns where a player improved their score.|
|Correction||36.73%||% of turns where the score headed back towards equality.|
|Depth||2.01%||Difference in evaluation between a short and long search.|
|Drama||0.72%||How much the winner was behind before their final victory.|
|Foulup Factor||38.78%||Moves that looked better than the best move after a short search.|
|Surprising turns||4.08%||Turns that looked bad after a short search, but good after a long one.|
|Last lead change||44.90%||Distance through game when the lead changed for the last time.|
|Decisiveness||14.29%||Distance from the result being known to the end of the game.|
These figures were calculated over a single representative* game, and based on the measures of quality described in "Automatic Generation and Evaluation of Recombination Games" (Cameron Browne, 2007). (* Representative, in the sense that it is close to the mean game length.)
Colour shows the success ratio of this play over the first 10 moves; black < red < yellow < white.
Size shows the frequency this move is played.
Note: most games do not take board rotation and reflection into consideration.
Multi-part turns could be treated as the same or different depth depending on the implementation.
Counts to depth N include all moves reachable at lower depths.
Inaccuracies may also exist due to hash collisions, but Ai Ai uses 64-bit hashes so these will be a very small fraction of a percentage point.
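A back-of-envelope check of that claim, using a birthday-style approximation (my own arithmetic, not Ai Ai's):

```python
# Birthday-style estimate of 64-bit hash collisions: with n distinct
# positions there are ~n^2/2 pairs, each colliding with probability 2^-64,
# so the expected fraction of positions involved in a collision is tiny.
def expected_collision_fraction(n, bits=64):
    pairs = n * (n - 1) / 2.0
    expected_collisions = pairs / 2.0 ** bits
    return expected_collisions / n   # fraction of positions affected

# Even a billion distinct positions gives a fraction around 2.7e-11,
# i.e. billionths of a percentage point:
f = expected_collision_fraction(10 ** 9)
print(f"{f:.1e}")
```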
4 solutions found at depth 5.
Player 1 to win in 2 moves
Selection criteria: the first move must be unique, and not forced to avoid losing. Beyond that, puzzles are rated by the product of [total moves]/[best moves] at each step, and the best puzzles selected.