The history of null

I thought I'd do a quick post on the history of null as I happened to stumble across it a while ago.

Each programming language defines null slightly differently, but essentially they all boil down to the same thing.  Wikipedia defines null as being "used in computer programming for an uninitialized, undefined, empty or meaningless value".  MSDN defines null as follows:

The null keyword is a literal that represents a null reference, one that does not refer to any object. Null is the default value of reference-type variables. Ordinary value types cannot be null.
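
To make that definition concrete, here's a minimal sketch. The MSDN quote is about C#, but Java behaves the same way for this purpose, and the class and field names here are just made up for illustration: an unassigned reference-type field defaults to null, a primitive value type can never hold null, and dereferencing a null reference is what produces the exception mentioned at the end of this post.

```java
public class NullDemo {
    // A reference-type field with no assignment: its default value is null.
    static String greeting;

    // A primitive (value-type) field cannot hold null; its default is 0.
    static int counter;

    public static void main(String[] args) {
        System.out.println(greeting == null); // prints "true"
        // counter = null;                    // does not compile: value types cannot be null

        // Dereferencing a null reference is what gives you the infamous exception.
        System.out.println(greeting.length()); // throws NullPointerException
    }
}
```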

So now that we've defined null, let's get to the history.  It's a relatively well known fact that the Romans had no numeral to represent 0.  Any of you that have done the Roman numerals coding kata will most likely have come across this, as that's exactly where I stumbled across it.  This lack of a value for 0 caused no end of problems for medieval computists/mathematicians.  How do you represent nothing, or the absence of a number of things?  In lieu of a representation for this they started to use the Latin word for none, which just so happens to be nulla.  Through the years this mutated into the word null that we use today.  So not only can you thank the Latin language for having a huge influence on the language that I'm writing this in, but now you can thank it every time you get a null reference exception.