The Applied Games Group Blog

New stuff directly from Microsoft Research.

Silicon Minds Challenge (+ submission sample)

On 8 December 2007, the Silicon Minds challenge was launched at the Machine Learning and Games workshop of the NIPS conference in the picturesque Canadian ski resort of Whistler.

The contest will be judged on three criteria:

  • Game A.I. (60%) – Game A.I. will be evaluated based on how it serves the game design and how it contributes to the game experience. Novelty and originality of the AI are appreciated but not required: a novel use of an existing state-of-the-art AI method is in itself considered to be innovative. General applicability of the AI concepts/techniques is also valued.
  • Fun Factor (20%) – The fun factor will be evaluated based on how the game design creates a positive user experience. This may include how intellectually challenging, relaxing, stimulating or satisfying the game is. A key indicator for the fun factor will be the desire to keep playing.
  • Production Quality (20%) – Make your game world as polished as possible and hook your judges with exciting, entertaining action. Production quality will be evaluated based on how seamless the overall game play is, the quality of the assets used, and the structure, readability and level of documentation of the code.

The submission must include the following:

  • Your finished game in .ccgame XNA Creators Club Game Package format.
  • The source code and content of your game.
  • A design summary.
  • Three screenshots of your finished game.

To this end, over the last week we have had fun creating the attached mini sample submission, based on the board game Go, demonstrating:

  • Using Monte Carlo methods to score the board (see the sketch after this list) - thanks to David Stern.
  • Some well documented source - thanks to Ralf Herbrich.
  • A sample design summary - thanks to Thore Graepel.
  • Reused code and graphics - thanks to the XNA Minjie board game sample.
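
For readers curious what the Monte Carlo scoring amounts to, below is a minimal, hypothetical sketch (it is not the code from the attached sample, and it deliberately ignores captures, ko and legality): the current position is completed many times by randomly colouring the empty vertices, and the outcomes are averaged into a per-vertex probability of black ownership. In the sample, this kind of probability is what drives the alpha value used when rendering territory during scoring.

    using System;

    // Hypothetical, heavily simplified Monte Carlo territory estimator.
    // Each playout fills the empty vertices with random stones (no capture
    // or legality logic) and the results are averaged into a probability
    // that each vertex ends up black.
    static class MonteCarloScorer
    {
        // board: '.' = empty, 'B' = black stone, 'W' = white stone.
        public static double[,] EstimateBlackOwnership(char[,] board, int playouts, Random rng)
        {
            int size = board.GetLength(0);
            var blackCounts = new int[size, size];

            for (int p = 0; p < playouts; p++)
            {
                // Complete the position by randomly colouring every empty vertex.
                var filled = (char[,])board.Clone();
                for (int x = 0; x < size; x++)
                    for (int y = 0; y < size; y++)
                        if (filled[x, y] == '.')
                            filled[x, y] = rng.Next(2) == 0 ? 'B' : 'W';

                // Score this playout by simple occupancy.
                for (int x = 0; x < size; x++)
                    for (int y = 0; y < size; y++)
                        if (filled[x, y] == 'B')
                            blackCounts[x, y]++;
            }

            // Average over playouts to get P(vertex ends up black).
            var ownership = new double[size, size];
            for (int x = 0; x < size; x++)
                for (int y = 0; y < size; y++)
                    ownership[x, y] = (double)blackCounts[x, y] / playouts;
            return ownership;
        }
    }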

Disclaimer: the mini-game sample itself actually uses a rule-based AI and should not be seen as an example of AI or game quality; it is more a sample of documentation quality and of the submission files required.

What we do hope is that you will find the sample a useful reference on how to document your code (specifically the XGoLibrary project), how to comment code you have taken from other sources and how to write the design summary.

On coding standards, if you are starting afresh for the competition we suggest you take a look at the MSDN section ".NET Framework General Reference - Design Guidelines for Class Library Developers" for guidance on what we would like to see in the code.
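
To make the documentation point concrete, here is a small hypothetical fragment in the spirit of those guidelines (the class and its members are invented for illustration and are not taken from the XGoLibrary project): XML documentation comments on all public members, and an explicit attribution note wherever code has been adapted from another source.

    /// <summary>
    /// Represents a single move in a game of Go.
    /// </summary>
    public sealed class Move
    {
        private readonly int x;
        private readonly int y;

        /// <summary>
        /// Initialises a new move at the given board coordinates.
        /// </summary>
        /// <param name="x">Zero-based column of the vertex.</param>
        /// <param name="y">Zero-based row of the vertex.</param>
        public Move(int x, int y)
        {
            this.x = x;
            this.y = y;
        }

        /// <summary>Gets the zero-based column of the vertex.</summary>
        public int X
        {
            get { return x; }
        }

        /// <summary>Gets the zero-based row of the vertex.</summary>
        public int Y
        {
            get { return y; }
        }
    }

    // Code adapted from elsewhere should say so explicitly, for example:
    // Adapted from the XNA Minjie board game sample (board rendering and assets).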

Don't forget, though, that 60% of the score is based on the game AI... we hope you enjoy the contest and are really looking forward to seeing your submitted games.

Last but not least, a quick special thanks to Joaquin Quiñonero Candela for working so hard to organise the challenge.

Update 1: We have noticed that the PDF still had some review comments; they have now been removed. Also, rendering during scoring was accidentally reset to constant black (rather than an alpha value proportional to the expected probability of the territory outcome). Finally, we slightly refined the passing algorithm of the AI: if the human player passes, the AI uses the Monte Carlo scorer to estimate whether each piece's colour is already determined with a probability of at least 80%. If so, the AI passes as well.
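
As an illustration of that passing rule, here is a hypothetical sketch (the names and the exact check are ours, not the sample's, and it reuses the MonteCarloScorer sketch from earlier in this post): after a human pass, the AI asks the Monte Carlo scorer for the per-vertex ownership probabilities and passes only if every vertex is already determined in colour with probability at least 80%.

    using System;

    // Hypothetical sketch of the refined passing rule described above.
    static class PassingRule
    {
        private const double DeterminedThreshold = 0.8;

        // After the human passes, the AI passes too if every vertex is already
        // determined in colour, i.e. its estimated probability of ending up
        // black is either at least 80% or at most 20%.
        public static bool ShouldPassAfterHumanPass(char[,] board, int playouts, Random rng)
        {
            double[,] ownership = MonteCarloScorer.EstimateBlackOwnership(board, playouts, rng);
            int size = board.GetLength(0);

            for (int x = 0; x < size; x++)
            {
                for (int y = 0; y < size; y++)
                {
                    double pBlack = ownership[x, y];
                    bool determined = pBlack >= DeterminedThreshold
                                      || pBlack <= 1.0 - DeterminedThreshold;
                    if (!determined)
                    {
                        return false; // at least one vertex is still contested
                    }
                }
            }

            return true; // everything is settled, so the AI passes as well
        }
    }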

Update 2: We have noticed a couple of further bugs and easy improvements that can be made: the rule-based AI would start to fill in its own eyes if the eye was either on the side of the board or in one of the four corners. This was not intended and has now been fixed. Also, the Monte Carlo sampling can be sped up significantly by only checking for move validity and "eye-ness" after a random move has been chosen from the list of empty vertices. Checking a vertex for being empty is a lot faster than checking whether a move on that vertex is valid.
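
A hypothetical sketch of that sampling change follows (the IGoPosition interface and Vertex struct are invented for illustration and are not the sample's actual board representation): instead of pre-filtering the whole move list, a random vertex is drawn from the list of empty vertices first, and only then are the more expensive legality and eye checks run on that single candidate.

    using System;
    using System.Collections.Generic;

    struct Vertex
    {
        public int X;
        public int Y;
    }

    interface IGoPosition
    {
        IList<Vertex> EmptyVertices { get; }   // cheap to maintain incrementally
        bool IsLegalMove(Vertex vertex);       // comparatively expensive check
        bool FillsOwnEye(Vertex vertex);       // comparatively expensive check
    }

    static class PlayoutSampler
    {
        // Pick a random empty vertex first and only then run the expensive
        // legality and "eye-ness" checks on that single candidate; testing a
        // vertex for emptiness is much cheaper than testing a move for validity.
        public static Vertex? SampleRandomMove(IGoPosition position, Random rng, int maxTries)
        {
            IList<Vertex> empty = position.EmptyVertices;
            if (empty.Count == 0)
            {
                return null; // no empty vertices left, so the playout passes
            }

            for (int attempt = 0; attempt < maxTries; attempt++)
            {
                Vertex candidate = empty[rng.Next(empty.Count)];
                if (position.IsLegalMove(candidate) && !position.FillsOwnEye(candidate))
                {
                    return candidate;
                }
            }

            return null; // repeated failures are treated as a pass
        }
    }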

Attachment: XGo Silicon Minds Sample Submission 1.2.zip