The Future of the Operating System

cbrooks on 2002-12-06T14:24:09

I just moved to NH and earned myself an excruciatingly long commute to work (approx 3 hours a day). The upside is that I have a lot of time to listen to books on tape and, most recently, MP3s. If you are in a similar situation, check out technetcast.com -- the site is presented by Dr. Dobb's Journal, and it has lots of (free) technical conference presentations available on MP3.

Yesterday, I listened to Richard Rashid, VP of Research at Microsoft, give a very interesting presentation on the future of the OS. He argues that the essential concepts of the modern OS (hierarchical file systems and user-initiated execution of programs) are artifacts of the computers that we had 30 years ago. Computers today are some 40 million times more powerful, and storage capacity is approaching the point where a person could record and store every conversation they have from birth to death on a single hard drive. Rashid envisions a shift towards interacting with your OS primarily through Google-style query mechanisms rather than a filesystem. The OS would be smart enough to initiate programs on its own, rather than waiting for user input. Obviously, OSes would have a host of new (or at least dramatically improved) input devices, such as speech recognition and handwriting recognition. Perhaps more dramatically, broad-spectrum language parsers would play an increasing role in the background, prioritizing and categorizing documents, email, spoken conversations, etc., and then reacting without an explicit request from the user.

Anyway, it is a thoughtful talk -- certainly worth a listen.


I used to have a long commute...

jordan on 2002-12-06T17:06:33

but, my flying car has taken care of that.

Seriously, talking about computers being 40 million times more powerful leads to some pretty silly conclusions.

While raw computing power may have improved 40 million times, programming has not advanced anywhere near as much. I would hardly trust a Windows (or even Unix) system to do a lot of launching programs based on what it "thinks" I need, or trust that it really gets the gist of my natural language request. I could see this really getting out of hand: ordering 50 million pizzas for tonight's party, or letting the dog out in sub-zero weather and forgetting to let the dog back in after 5 minutes.

We had a professor in the CS department at my university over 20 years ago who told us that only a few of us would have jobs in programming in five years, because automated AI systems generating programs from natural-language requirements would obviate the need for most programmers. AFAICT, we're no closer to that dream today than we were then.

I feel the same way about these kinds of predictions.

BTW, I do like to listen to a lot of audio (music, lectures, radio programs) while I work. This technetcast is quite a find! Thanks!

Re:I used to have a long commute...

cbrooks on 2002-12-06T19:08:37

>This technetcast is quite a find!

It's pretty darned cool.

>talking about computers being 40 Million times more powerful leads to some pretty silly conclusions.

Take a listen to the talk. Rashid is not making the argument that improved processing speed / storage capacity magically allow us to develop cool stuff. His argument (perhaps poorly summarized by me) is that the choices that OS designers made may have been reasonable 2-3 decades ago. However, partly because of processing and storage improvements, and partly because there have been substantial genuine improvements in our programming tools (notably in language parsing, machine learning and handwriting, speech and video recognition), we can do better today.



>I would hardly trust Windows (or even Unix) systems to do a lot of launching programs based on what it "thinks" I need, or trusting that it really gets the gist of my natural language request.



I understand your concern, and it seems likely that a certain class of applications will require human intervention for some time to come.

However, his claims are less outlandish than they might seem on the surface. The computer does not need to understand the "gist" of the request the way a human would. It needs to be able to parse the request to identify the core query and determine what a reasonable reply would look like. It would then search a database of categorized content for similar queries and replies. The "understanding" is "simply" the application of a set of associational rules.
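To make that concrete, here's a toy sketch of my own (nothing from the talk; the sample queries and replies are invented): score a new request against a small database of stored query/reply pairs by word overlap, and hand back the reply attached to the closest match. No "understanding" anywhere, just association.

```python
def similarity(a, b):
    """Jaccard word overlap between two pieces of text."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def best_reply(query, knowledge_base):
    """Return the stored reply whose recorded query most resembles
    the new one -- 'understanding' as purely associational lookup."""
    return max(knowledge_base, key=lambda pair: similarity(query, pair[0]))[1]

# A tiny database of previously categorized query/reply pairs.
kb = [
    ("what is the weather today", "Fetch the local forecast."),
    ("schedule a meeting with bob", "Open the calendar and invite Bob."),
    ("play some music", "Start the music player."),
]

print(best_reply("can you schedule a meeting", kb))
# -> Open the calendar and invite Bob.
```

Obviously a real system would use far richer features than word overlap, but the shape of the mechanism is the same.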

As an example, Mozilla 1.3 will include a spam filter that uses Bayesian statistics to determine whether or not an email is spam, and file it appropriately.
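For the curious, the core of such a filter fits in a page of code. This is a toy illustration of the Bayesian idea (naive Bayes with add-one smoothing), not Mozilla's actual implementation, and the training messages are made up:

```python
import math
from collections import Counter

class NaiveBayesSpamFilter:
    """Toy Bayesian spam filter: classifies mail by comparing per-word
    spam vs. ham frequencies learned from labeled examples."""

    def __init__(self):
        self.spam_words = Counter()
        self.ham_words = Counter()
        self.spam_msgs = 0
        self.ham_msgs = 0

    def train(self, text, is_spam):
        words = text.lower().split()
        if is_spam:
            self.spam_words.update(words)
            self.spam_msgs += 1
        else:
            self.ham_words.update(words)
            self.ham_msgs += 1

    def spam_probability(self, text):
        # Work in log space; add-one smoothing avoids zero probabilities
        # for unseen words. Assumes both classes have training data.
        total = self.spam_msgs + self.ham_msgs
        log_spam = math.log(self.spam_msgs / total)
        log_ham = math.log(self.ham_msgs / total)
        spam_total = sum(self.spam_words.values())
        ham_total = sum(self.ham_words.values())
        vocab = len(set(self.spam_words) | set(self.ham_words))
        for w in text.lower().split():
            log_spam += math.log((self.spam_words[w] + 1) / (spam_total + vocab))
            log_ham += math.log((self.ham_words[w] + 1) / (ham_total + vocab))
        return 1 / (1 + math.exp(log_ham - log_spam))

spam_filter = NaiveBayesSpamFilter()
spam_filter.train("buy cheap pills now", True)
spam_filter.train("cheap mortgage offer buy now", True)
spam_filter.train("meeting notes for the project", False)
spam_filter.train("lunch plans for the project team", False)

print(spam_filter.spam_probability("buy cheap pills"))   # well above 0.5
print(spam_filter.spam_probability("project meeting"))   # well below 0.5
```

The point is that "is this spam?" reduces to counting and arithmetic over past behavior, exactly the kind of associational rule described above.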



Rashid suggests examples such as improved categorization of email based on the computer's recognition of patterns in the way you've handled email in the past. For example, if you always respond to emails from a certain address as soon as you read them, the computer might learn from your behavior and automatically assign those emails a higher priority, or even page you when one arrives. Machine learning is at the core of many of the improvements that he discusses.
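A sketch of that email-priority idea (purely illustrative; the sender addresses and the 80% threshold are my own invention, not anything Rashid specified): the "learning" can be as simple as counting, per sender, how often you replied quickly.

```python
from collections import defaultdict

class PriorityLearner:
    """Toy sketch: learn which senders you tend to reply to quickly,
    and flag their future mail as high priority."""

    def __init__(self, threshold=0.8):
        self.quick_replies = defaultdict(int)  # sender -> fast-reply count
        self.total_seen = defaultdict(int)     # sender -> messages observed
        self.threshold = threshold

    def observe(self, sender, replied_quickly):
        # Record how you handled one message from this sender.
        self.total_seen[sender] += 1
        if replied_quickly:
            self.quick_replies[sender] += 1

    def is_high_priority(self, sender):
        seen = self.total_seen[sender]
        if seen == 0:
            return False  # no history, no special treatment
        return self.quick_replies[sender] / seen >= self.threshold

learner = PriorityLearner()
for _ in range(5):
    learner.observe("boss@example.com", replied_quickly=True)
learner.observe("newsletter@example.com", replied_quickly=False)

print(learner.is_high_priority("boss@example.com"))       # True
print(learner.is_high_priority("newsletter@example.com")) # False
```

A real system would weigh recency, message content, and more, but again the "prediction" is just bookkeeping over observed behavior.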

Re:I used to have a long commute...

jordan on 2003-01-22T11:36:58

>>This technetcast is quite a find!

>It's pretty darned cool.


Hey! I just went to technetcast.com to try and snag some stuff and it was gone!

It now points to some abandoned-site-turned-ad-portal.

Too bad...

Re:I used to have a long commute...

cbrooks on 2003-01-22T12:42:58

Apologies, I should have posted the full URL. TechNetCast is hosted by Dr. Dobb's Journal at http://technetcast.ddj.com/. Thankfully, it is still up and running... ;-)

Re:I used to have a long commute...

jordan on 2003-01-22T20:48:02

Forgot to follow up my comment with a note that I'd found it through Google.

I'm pretty sure that I'd tried it before and http://www.technetcast.com/ used to work.