Getting even more off topic, here's a random tidbit that for some reason I remember ...
There was this mathematician who thought that if a theoretical computer used all the energy in the world to power itself, it would eventually run out of energy, and he calculated the maximum number it could count to. He then argued that, theoretically, there was no use for numbers larger than that. Seems kind of fishy to me, but someone might find it interesting.
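I don't know which bound he actually used, but a back-of-envelope version of that kind of calculation might go through Landauer's limit (the minimum energy to flip one bit, which is my assumption here, not part of the original story). Treating each increment of a counter as at least one bit operation, and plugging in a rough guess for the world's yearly energy output:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Landauer's limit: minimum energy to erase/flip one bit at temperature T
energy_per_op = K_B * T * math.log(2)   # ~2.9e-21 J per bit operation

# Very rough assumption: world annual energy consumption, in joules
WORLD_ANNUAL_ENERGY_J = 6e20

# If every count costs at least one bit operation, this caps the count
max_count = WORLD_ANNUAL_ENERGY_J / energy_per_op

print(f"max count: about 10^{math.log10(max_count):.0f}")
```

So under these (very hand-wavy) assumptions the counter tops out somewhere around 10^41, which at least shows how an energy budget puts a hard ceiling on counting, whatever the exact figures in the original argument were.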
Stephen
PS. If I'm motivated I'll rummage around my books and see if I can't get the whole story for you.