
ICFP Programming Contest 2002

This is an outline of the bot that Leon Brocard and I came up with for the London.pm entry to the ICFP Programming Contest 2002.

So, how did we do it?

Well, poorly. We didn't start work until halfway through Saturday, almost a day after the competition started. We didn't get to work on it as much as we'd have liked on Monday, because we were both at work. And we actually went to sleep during the competition, which I'm told is incredibly poor form.

Language

We coded Marvin in Perl, perhaps a controversial choice. We're both Perl coders (and members of Perl Mongers) so it was the obvious choice for us. It was also a good fit for the networking and data munging. We did worry about the speed issue, but it turned out not to be a problem at all: we didn't even worry about optimising or caching our path finding.

Basic Client

We read up around the problem and built a client library which encapsulated all the client-server communication and returned changes back to us as data structures. We also built a Curses frontend so that we could check what was going on. Once we had the basic framework, we had the tools to build and test the rest.
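
For flavour, the client ended up looking something like the sketch below. This isn't the real class - the method names and the way the reply is tokenised are illustrative only - but the shape is right: a thin wrapper around a TCP socket that sends a command line and hands the reply back as data.

    # A minimal sketch of a client wrapper; the method names and the
    # handling of the reply are hypothetical, not the real protocol code.
    package Marvin::Client;

    use strict;
    use warnings;
    use IO::Socket::INET;

    sub new {
        my ($class, %args) = @_;
        my $sock = IO::Socket::INET->new(
            PeerAddr => $args{host},
            PeerPort => $args{port},
            Proto    => 'tcp',
        ) or die "Can't connect to server: $!";
        return bless { sock => $sock }, $class;
    }

    # Send one command line to the server and return its reply as a
    # list of whitespace-separated tokens.
    sub command {
        my ($self, $line) = @_;
        my $sock = $self->{sock};
        print $sock "$line\n";
        defined(my $reply = <$sock>) or die "Server closed the connection";
        chomp $reply;
        return split ' ', $reply;
    }

    1;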

Navigation

Leon found a Java implementation of A* and got to work porting it to Perl, while I wrote up a class to read the map from the server and return a data structure from it - these ended up as Maze::AStar and Maze::Map respectively. In a very short amount of time (around two hours from our start time) we had a decent traversal algorithm going, and moved on to looking at packages.
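
A grid A* is simple enough to sketch in a few dozen lines of Perl. This isn't Maze::AStar itself, just a compact illustration of the idea, assuming the map is an arrayref of strings in which '#' marks a wall:

    use strict;
    use warnings;

    # Grid A* with a Manhattan-distance heuristic. $map is an arrayref
    # of strings ('#' = wall); $start and $goal are [row, col] pairs.
    # Returns the path as a list of [row, col] steps, or an empty list
    # if no path exists.
    sub astar {
        my ($map, $start, $goal) = @_;
        my $h = sub { abs($_[0] - $goal->[0]) + abs($_[1] - $goal->[1]) };

        my %g    = ("@$start" => 0);           # cost from the start square
        my %came;                              # back-pointers for the path
        my @open = ([$h->(@$start), @$start]); # [f, row, col]
        my %closed;

        while (@open) {
            # take the open square with the lowest f score
            @open = sort { $a->[0] <=> $b->[0] } @open;
            my (undef, $r, $c) = @{ shift @open };
            next if $closed{"$r $c"}++;

            if ($r == $goal->[0] && $c == $goal->[1]) {
                # walk the back-pointers from the goal to the start
                my @path = ([$r, $c]);
                unshift @path, $came{"@{$path[0]}"}
                    while exists $came{"@{$path[0]}"};
                return @path;
            }

            for my $d ([-1, 0], [1, 0], [0, -1], [0, 1]) {
                my ($nr, $nc) = ($r + $d->[0], $c + $d->[1]);
                next if $nr < 0 || $nc < 0 || $nr > $#$map
                     || $nc >= length $map->[$nr];
                next if substr($map->[$nr], $nc, 1) eq '#';
                my $cost = $g{"$r $c"} + 1;
                if (!exists $g{"$nr $nc"} || $cost < $g{"$nr $nc"}) {
                    $g{"$nr $nc"}    = $cost;
                    $came{"$nr $nc"} = [$r, $c];
                    push @open, [$cost + $h->($nr, $nc), $nr, $nc];
                }
            }
        }
        return;
    }

Re-sorting the open list every iteration is crude, but as mentioned above, speed never became a problem on contest-sized maps.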

Packages

Here's an overview of how packages in the contest worked. You get told the location of 'home bases' on each map by the server. These may or may not contain package(s). On top of that, robots on the map may drop packages when they're pushed. In both cases, you know where the packages are, and when you're on top of them you know their weight, and the co-ordinates of their destination.

The problem of choosing which packages to pick up from a base is representative of two larger problems, which seem to be the real guts of the competition - the Travelling Salesman and Knapsack problems. Travelling Salesman is involved because the order in which you pick up packages affects the route you take, and whether that route is an optimal one; how long you stay in the game depends on how well you use your fuel, and so there's a correlation to your score. As each package has a weight, and you have a maximum capacity, the Knapsack problem is involved in choosing which packages to take so as to make the best use of that capacity. Both of these problems are NP-hard, and the differences we see between competition bots will come down to the different heuristics used to simplify them.

As for Marvin, we went the very simple way. When the package pile contains packages bound for several destinations, we only pick up packages that are all going to the same place. In the best case this means a number of trips equal to the number of destinations in the pile, rather than to the number of packages.
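
In code terms that grouping step is about as simple as it sounds. Here's a sketch, assuming each package is a hashref with hypothetical 'weight' and 'dest' fields; picking the destination with the most packages waiting is my guess at a sensible tie-break, not necessarily what Marvin does:

    # Group the packages at a base by destination and return the group
    # we should deliver next. Field names here are illustrative.
    sub pick_destination {
        my (@packages) = @_;
        return unless @packages;

        my %by_dest;
        for my $pkg (@packages) {
            my $key = join ',', @{ $pkg->{dest} };   # "x,y"
            push @{ $by_dest{$key} }, $pkg;
        }

        # choose the destination with the most packages waiting for it
        my ($best) = sort { @{ $by_dest{$b} } <=> @{ $by_dest{$a} } }
                     keys %by_dest;
        return @{ $by_dest{$best} };
    }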

Deciding how to fill our capacity with packages bound for the same place was a much harder problem. We looked at using a Monte Carlo simulation before having an epiphany about what we really wanted from package selection. Here's what we decided:

Having a small number of 'heavy' packages that optimally fill your capacity is not as valuable as having a large number of less heavy packages.

The rationale is that you can get pushed by other robots during the game. When this happens, you drop one random package. We'd rather drop a light package than a heavy one, since the weight is proportional to points.
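
One plausible way to turn that rule into code is a greedy pass over the group, lightest package first. The sketch below uses the same illustrative field names as above; it's a reading of the heuristic rather than Marvin's actual selection routine:

    # Prefer many light packages over a few heavy ones: sort by weight
    # ascending and take packages while they still fit in our capacity.
    sub choose_packages {
        my ($capacity, @packages) = @_;
        my @chosen;
        for my $pkg (sort { $a->{weight} <=> $b->{weight} } @packages) {
            last if $pkg->{weight} > $capacity;   # nothing heavier will fit either
            $capacity -= $pkg->{weight};
            push @chosen, $pkg;
        }
        return @chosen;
    }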

We also pick up packages we run over while on a route somewhere if we have extra capacity.

Bidding

Bidding was a late addition to our bot. Marvin bids 1 most of the time, and bids higher when dropping a package on its destination. If (and only if) there's another robot in one of the four squares surrounding us, we bid the value of the packages we're carrying, and we also bid high if there is both water and a robot around us.
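
The bidding logic is only a handful of lines. Here's a rough reconstruction of the decision; the helper methods are hypothetical names for the underlying board queries, and the constants are made up for illustration:

    # Decide what to bid this turn. adjacent_robot(), adjacent_water(),
    # carried_value() and about_to_drop() are hypothetical helpers.
    sub bid {
        my ($self) = @_;

        # bid the full value of what we're carrying if a robot next to
        # us could push us, more so if there's water it could push us into
        if ($self->adjacent_robot()) {
            my $value = $self->carried_value();
            $value *= 2 if $self->adjacent_water();
            return $value || 1;
        }

        # bid a little over the default when dropping at a destination
        # (the 5 here is an invented figure, not Marvin's real number)
        return 5 if $self->about_to_drop();

        return 1;   # the default bid
    }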

ASCII Art Movies

So you want to see how Marvin works but you can't be bothered to download the server and try it yourself? Well, through the wonders of Perl, the Imager module, netpbm, and gifsicle, we can present animated GIFs of Marvin running.
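
The pipeline itself is nothing clever. Roughly - and this is a sketch of the approach rather than our actual script - we render one Imager image per game turn, write each frame out as a PPM, convert the frames with netpbm's ppmtogif, and stitch them together with gifsicle:

    use strict;
    use warnings;
    use Imager;

    my @turns;       # one entry per game turn, filled in from a recorded game
    my $cell = 8;    # pixels per map square
    my $n    = 0;

    # Render each turn as a PPM frame. The shape of @turns and the
    # colours are illustrative only.
    for my $turn (@turns) {
        my $img = Imager->new(
            xsize => $cell * $turn->{width},
            ysize => $cell * $turn->{height},
        );
        for my $obj (@{ $turn->{objects} }) {    # walls, water, robots, packages
            $img->box(
                filled => 1,
                color  => $obj->{color},
                xmin   => $obj->{x} * $cell,
                ymin   => $obj->{y} * $cell,
                xmax   => ($obj->{x} + 1) * $cell - 1,
                ymax   => ($obj->{y} + 1) * $cell - 1,
            );
        }
        $img->write(file => sprintf('frame%04d.ppm', $n++), type => 'pnm')
            or die $img->errstr;
    }

    # netpbm turns each PPM into a GIF, and gifsicle glues them together
    system('for f in frame*.ppm; do ppmtogif "$f" > "${f%.ppm}.gif"; done');
    system('gifsicle --delay 10 --loop frame*.gif > marvin.gif');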

The first movie (11K, 290 frames) shows Marvin's A*-based pathfinding. Marvin starts at the bottom right and attempts to find a path to the "T". We put a 'v' for each location that the path finding algorithm has visited.

The second movie (86K, 296 frames) shows Marvin trundling around towards home bases, picking up packages, going to their destination, dropping them, and starting all over again. Note how the score goes up when we drop packages, and that Marvin can carry more than one package at a time.

In the final movie (168K, 222 frames) I've pitted Marvin against some other bots from the competition: entries by Bernd Paysan, Berkeley State Machines Incorporated and Simon Funk. Marvin is 'M', other robots are 'R'. They have to pick packages up in the middle and deposit them at the bottom left and top right while avoiding the water '~'. Note that when a robot gets bumped it can drop a package, which another robot may then pick up. Also note that some of the other robots seem to have less effective path finding, and that sometimes robots get stuck in a loop. One robot has overbid by the end, run out of fuel and is out of the game. Cute, huh?

Todo

If we'd had more time or more people on our team there are a few more things we would have added. These include: keeping track of dropped packages, keeping track of home bases that still hold packages, more intelligent pathfinding (avoiding water edges and other robots), pushing robots into the water when we get the chance, minimax searching to guess what the robots around us will do, and more. The whole point of Marvin is to be simple, and not waste fuel trying to be clever.

Why We Suck

I've been told I didn't make this clear enough. The reason our bot sucks is that we wimped out of the real challenge - working out particularly clever solutions to the two main problems of base selection and package packing. Another reason is that we ignore combat completely, and there's plenty of room for strategy there: not going near water when there's a danger of being pushed into it, for example, or just avoidance techniques in general.

The code

The code comes in two versions - the light version that was submitted, and our development version, which includes a UI. To use the development version, you'll need Perl's Storable module and Curses wrapper installed. Be warned that they unpack in the current directory, since that was the judges' preferred archive layout. If that all sounds like too much work, I have a screenshot of an early version of marvin running, and some of the code. In the shot, the v characters represent A*'s search path, and the P characters represent the critical path found.

I think that's about everything. Our bot represents somewhere under 20 combined man-hours of work, so it's well prepared to be beaten by just about everyone. It's been a fun experience all the same.
