Autrijus Tang surprised us and the world by implementing an IMC compiler for Pugs in only 24 years. The first 9 months of this time he spent developing into a fully formed life form - a wise choice. After 12-13 years of nurturing and guidance from the educational establishment, he dropped out and founded a .com, of which he would later become CEO. Some 3 months ago he set out to learn Haskell; less than a week ago he added reading about various other critical technologies, until finally he glanced over the spec, spent an hour or so translating it into his internal representation of a Haskell grammar, and then didn't actually take any time to code anything, because the translation just happened to be executable.
If Dirk Gently were here, he would no doubt also mention how interrelated this chain of events was with the development of LISP and Smalltalk, the Turing machine, Descartes' "I think, therefore I am", the Maoist revolution, and the Oruanui eruption of the Taupo caldera some 25,000 years ago, which spewed out enough magma and volcanic ash to cover the entire island of Taiwan 20 meters thick (except that it erupted in New Zealand, forming what is now known as Lake Taupo). Each of these events, of course, had a truly profound impact on the development of humanity - and with a little basic tuition in calculus it is quite easy to induct a proof that all of this was therefore connected to the development of this IMC compiler.
If you take those events into account, it actually took ∞ years to produce - unless you buy into one of those strange theories about the universe being describable as starting or ending at a particular point in time. Such matters are, of course, irrelevant; the point is that however long it took to achieve, the fact that it is now in the past is a monumental achievement by all the people involved. But especially Autrijus and Leo Tötsch.
Infinity is a hard concept for most people to deal with. Most people can't even grasp the concept of a million, like a million litres of water, or a million tons of CO2. This failure to grasp scale is one of the many reasons Global Warming scaremongers manage to keep striking fear into the hearts of the populace. By the same token, implementing a brand new language along with a fully spec'ed virtual machine, with consistent and reliable semantics, should be a piece of cake, right?
Mathematicians have been haggling with each other over various minor points of infinity for hundreds of years - great hackers of bygone eras like Pascal, Fermat and Fibonacci have slowly but surely been chomping away at examining the characteristics of defined worlds. Unlike the real world, defined worlds have guaranteed characteristics - after all, you defined them! This is the sort of thing that allows infinity to be tackled, and inducted into a single point of existence.
This is why some say that the field of mathematics is the only field of scientific endeavour that deals in truths. Science deals in empirical observations and theories, and can never do anything more - despite what various nihilistic fools in the practice of blind reductionism who call themselves physicists claim. Some religions even claim to deal with truth, but unfortunately, that's a kind of truth which can only be experienced, and cannot be explained or passed on to others; this is a riddle.
But times have changed, and the field of mathematics is encroaching on computer science at an alarming rate. The Curry/Howard Correspondence was perhaps the most significant development in recent times for computer science, and especially for Pugs. All of a sudden, these arcane proofs and tools of induction, integration and calculus that mathematicians have been working on for years have a practical application; we can finally make something useful out of them - a damned fast compiler that you only have to tell the rules of what you need done, and it automatically inducts a correct program that behaves according to what you told it. The compiler's optimisations are themselves all proven correct too - not just the ideas of very high level wizards implemented with lots of different test cases borne in mind. Interested readers are invited to read Types and Programming Languages, the text on programming languages that inspired Pugs - but be warned - this book is aimed at graduate students, despite being a practical guide. It sure as hell scares me.
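For the curious, here is a minimal sketch of the Curry/Howard idea in Haskell (the names andElimL and weaken are mine, not anything from Pugs): a type is read as a proposition, and writing a total program of that type is, literally, a proof of it.

```haskell
-- Under Curry/Howard, the type (a, b) -> a reads as the proposition
-- "A and B implies A"; the function below is its proof.
andElimL :: (a, b) -> a
andElimL (x, _) = x

-- "A implies (B implies A)" - the K combinator is the proof term.
weaken :: a -> b -> a
weaken x _ = x

-- By contrast, a false proposition like "forall a b. a -> b" has no
-- total program inhabiting it: the type system simply stops you.
main :: IO ()
main = print (andElimL (42 :: Int, "unused"))
```

This is only the bottom rung of the ladder, of course; the dependently-typed systems the theorem provers use take the same idea much further.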
Of course, this all boils down to a matter of preference. If your programming mind is fixed to be imperative-only, then how could you possibly bring yourself to accept that a piece of code like this:
∀ Class A, B : A.superClass = B ↔ A ∈ B.subClasses
could possibly be better than the pure "object oriented" approach? Shucks, you know, if you expect to be able to write code like that, you're just not using a real language.
That particular constraint is actually quite easy to express using Class::Tangram:
package Class;
use base qw(Class::Tangram);

our $fields = {
    ref => {
        superClass => {
            class     => "Class",
            companion => "subClasses",
        },
    },
    set => {
        subClasses => {
            class     => "Class",
            companion => "superClass",
        },
    },
};
Interested readers are invited to look at the implementation inside the Perl module, which has been coded to be strictly compliant with the correct object oriented approach - something I used to think really meant something. But after seeing how such a simple idea can be expressed so succinctly - avoiding literally hundreds of lines of code - I think I might just be ready to see if this "Concise Hell" has any spaces open.
Only one question remains - which sins are considered bad enough to buy me entrance?