Simplicity is king, and JavaSpaces proves that point. The interface consists
of only seven methods, four of which form two pairs: `write()`,
`read()`/`readIfExists()`, `take()`/`takeIfExists()`, `notify()` and `snapshot()`. Still,
the possibilities are dazzling.

The basis for JavaSpaces is the concept of "tuplespace computing", pioneered
at Yale with the Linda project. Assume you have a big cluster of processors,
an algorithm, and a way to slice this algorithm in pieces so these processors
can all work on a piece, in parallel. How do you distribute the work and get
the results without tightly coupling "worker" nodes and the "master"
controller?

The solution is to have a basket of stuff that the master fills with work
assignments:

(compute, 0, 1)
(compute, 1, 2)
...

A worker asks this basket for a `(compute, _, _)` entry (where the underscore
stands for a wildcard) and gets a random one.
It makes the computation, and writes back

(result, 0.5)

(the Big Computation here is calculating the mean of two numbers - sorry, I'm
short on great examples today). The master, in the meantime, asks the basket
for `(result, _)` entries - it gets the first one, processes it, immediately
asks for the next one, et cetera. The desired decoupling has been achieved: any
number of workers can attack the problem, and there is no dependency between
the master and the workers; each node proceeds at its own pace.
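To make the decoupling concrete, here is a minimal, self-contained sketch of the pattern - not JavaSpaces itself, just two `BlockingQueue`s standing in for the basket (the class name and queue layout are my own invention for illustration):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Toy stand-in for the tuplespace: one queue of (x1, x2) work
// assignments, one queue of results. Not JavaSpaces - just an
// illustration of how master and workers stay decoupled.
public class TupleSpaceSketch {
    public static void main(String[] args) throws Exception {
        BlockingQueue<int[]> work = new LinkedBlockingQueue<>();     // (compute, x1, x2)
        BlockingQueue<Double> results = new LinkedBlockingQueue<>(); // (result, value)

        // The master fills the basket with work assignments.
        work.put(new int[] {0, 1});
        work.put(new int[] {1, 2});

        // A worker: take an entry (blocking, like space.take()),
        // do the computation, write the result back.
        Thread worker = new Thread(() -> {
            try {
                for (int i = 0; i < 2; i++) {
                    int[] t = work.take();
                    results.put((t[0] + t[1]) / 2.0); // the "Big Computation"
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.start();

        // The master collects results at its own pace, in arrival order.
        System.out.println(results.take()); // 0.5
        System.out.println(results.take()); // 1.5
        worker.join();
    }
}
```

Any number of such worker threads (or, in the real thing, worker machines) could drain the same queue; neither side ever refers to the other.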

JavaSpaces extends this concept with the power of Java and Jini technology. In
the above example, the algorithm `result = (compute[1] + compute[2]) / 2`
needed to be available a priori on every worker. With JavaSpaces, you don't
store data in the basket, but you store objects - code+data. The above example
would be handled like ("like", because I'm simplifying - see the Jini spec for
the full details):

space.write(new MeanComputation(0, 1), ...)
space.write(new MeanComputation(1, 2), ...)

with MeanComputation defined as

class MeanComputation extends Computation {
    int x1, x2;

    MeanComputation(int x1, int x2) {
        this.x1 = x1;
        this.x2 = x2;
    }

    ComputationResult execute() {
        return new MeanComputationResult((x1 + x2) / 2.0);
    }
}

The fun thing here is that the workers will ask for the generic interface:

Computation template = new Computation();
while (true) {
    Computation next = space.take(template, ...);
    space.write(next.execute(), ...);
}

Voila - instant generic compute servers: without modifying any code on the
workers, we can start entering `StandardDeviationComputation` objects, or
`ReallyHardGeophysicsComputation` objects. Next week, I'll explain the trick.