Assuming that x86 CISC is faster and cheaper than RISC, and that consumer hardware has completely outstripped high-performance specialized computing hardware, we should be sad.
This means no one is doing research on
entirely new sorts of things that people can
have at home.
It means the child ate his mother.
Oh, but what could the home computer, fantastic as it is, possibly want from high performance computing?
Why not just kill it off?
In the past, the workstation makers gave us GL 3D graphics acceleration, floating point processors, and memory management units; they also popularized the GUI, Ethernet, multiheaded systems, remote access, TCP/IP, and, of course, Unix.
Without workstations, we
probably never would have thought to add any
of this to our computers.
There wouldn't have been a demonstrated use for any of it, and mass-produced consumer hardware doesn't get new features unless there's a demonstrated use and demand.
The only thing the home computer has done is make those innovations affordable and faster - more RAM; more disk; faster 3D; faster Ethernet.
This brings me to Apple.
Apple, being relatively high-end, has
thus far pushed innovation.
They introduced FireWire, demonstrating the utility of high-speed serial networking and its application to video editing.
Before then, PC video capture devices were unusable, and no one was talking about streamlining the interface between camcorders and computers.
Apple also had external drives (SCSI) ages before PCs did, and they brought dual-CPU systems into the home. Without that precedent, BIOS makers wouldn't have seen the utility of booting from USB devices (which were external hard drives at the time), and then the whole flash keychain dongle revolution never would have happened.
You might argue that Apple will continue
to innovate - and indeed they might.
Switching to x86 alone doesn't kill innovation (despite what the historical evidence suggests).
However, the reasons that prompt a company
to switch to x86 often do kill innovation.
The companies want to lower costs and go mainstream -- Apple, in this case, is trying to take on Microsoft's market more directly (Jobs said in his speech that he's ready to go after Microsoft).
That means getting rid of a lot of the overhead associated with doing things differently (heh, "Think Different" my ass). It means streamlining production and not offering things that can't be shown to pay for themselves.
It means cutting R&D off at the knees - just
like HP recently did.
And Apple has been having such great luck recently with just development (programming), so why bother with research?
Jobs said that the spirit of Apple lives in the operating system -- he wasn't saying that right before he ditched Mac OS 9.
It sounds an awful lot like Apple doesn't want
to be Apple any more.
And with things like the iPod, they don't
need to be the Apple of yore.
The problem is, no one wants to be who they were - Sun doesn't want to make RISC workstations, they want to sell x86 boxen and peddle Java-related goods. IBM doesn't want to make computers so much as they want to make guts for game systems (IBM, go ask Motorola how that went for them with Atari).
Everyone else - SGI, HP (including Compaq
and Digital), etc - just want to be resellers
for Intel now.
No high-end workstations means power users aren't dabbling with companies' ideas of the future and thereby supporting research. And of course we can't afford the quarter-million-dollar RISC starter systems that IBM, SGI, and Sun sell (okay, Sun is still a little better).
It's impossible to imagine what the future
would have brought us if we hadn't killed the
workstation.
But now what do we get?
Will Intel give us innovation? They're
offering us DRM - ooh, yippie.
What about Gateway? Gateway Country and
Windows ME - super, thanks.
We've fucked ourselves.
We have nowhere to turn.
And the few players left -- the ones
everyone delegated everything to --
are becoming increasingly hostile
to hobbyists who want to innovate
for themselves.
Cray is innovating more than the lot of them, and given the way we treat Cray, do we deserve it?
Will we even manage to keep them around?
Of course, all of this is the result of
turning the computing experience into a
mechanically produced, canned product for
old people and record industry execs,
cynically writing off the future as uninteresting or not immediately marketable.
You think HP's Carly Fiorina was bad? Well, Carly's views are about the same as the rest of the industry execs' -- short-sell the future.
-scott
P.S.: I'm leaving comments enabled in search of sympathy and insight, even though I've never in my life seen a worthwhile comment anywhere. Since I tend to berate commenters, I suggest you not post unless you're ready to be berated.
Is a hardware vendor -- they don't care what OS you run on the thing and happily release technical information (at one point, Sun was interesting because it was selling Unix cheaply for the day)
Has no conflicts of interest with its customers -- they've entered into no deals where they've promised to restrict what their customers can do, be it listen to music or attempt to use an ISP other than one of the three for which icons are on the desktop
Supports the hardware and operating system for any use -- be it commercial, industrial, hobby, research, or education -- and does so without feeling threatened that their customer is trying to compete with them and without feeling as though they're losing control of the customer
Responds to bug reports promptly and earnestly
Seeks standards and interoperability
Doesn't sell out to competitors who only seek to eliminate them from the market and cannibalize the corpse
Re:Unix workstation options
scrottie on 2005-06-10T16:29:59
Also, HP has PA-RISC workstations - I didn't realize!
c8000
j6750
c3750
Yet more Re: Unix workstation options
n1vux on 2005-06-10T18:55:03
HP apparently also still offers the last upgrades for Alpha workstations as well as the legacy OpenVMS Alpha servers (which run Linux just fine, thank you, Red Hat or Debian or Gentoo).
HP AlphaStation ES47
HP AlphaStation DS25
HP AlphaStation DS15
10-year-old departmental Alpha servers that are on clock par with the Sun Blade 150 are available used for cheap money...
... but won't take enough memory to do spiffy graphics workstation stuff.
Re:Graphics and innovation
scrottie on 2005-06-13T23:31:30
Wew, wew, wew! We have a winner! The "completely missed the point" award goes to n1vux. No, you dimwit. I was drawing a contrast between "brand new things" and "just speeding up what already exists". Graphics card makers, driven by gamers, aren't trying anything new. The basic architecture for 3D acceleration was laid down by SGI in the mid-1990s. Adding fans doesn't count as innovation. My question was, who is making entirely new computer architecture, outside of the mainstream, that might someday go mainstream after being an expensive high-end feature? To say the mass manufacturers will get there sooner or later is like saying we would have figured out anime eventually and it didn't have to come out of Japan. Anyway, even if you agree with that last absurdity, stay the hell off my blog.
-scott
Re:Graphics and innovation
n1vux on 2005-06-21T18:37:43
Who's calling whom names, youngster?
Why do you assume innovation is only possible on novel engineering workstations? Because that's all you've ever seen and you haven't read the history of this and other industries? Or are you just stupid?
Do you assume that progress can only come from cycling back around the "it's a new architecture" merry-go-round? CISC→RISC and "let's create another layer of cache" recapitulate the phylogeny as much as they break new ground, but the new engineers are quite impressed with their inventiveness at (re)discovering the old solution to a system imbalance. The PDP-8 was a RISC reaction to a CISC world, long before either acronym was invented, but it was not an engineering workstation.
Innovation is possible whenever some class of early adopters is willing to pay a premium to get bleeding-edge features (and bleed a little too) and thus indirectly subsidize the R&D. The gap is eventually closed and then the innovation moves elsewhere.
Engineers subsidized general computer architecture and peripheral development for years. That does not make it the only way.
The pr0n industry's early adopters subsidized VCR and CD-ROM development, and probably a few other peripherals since. Football and movie fans helped subsidize the VHS VCR (in preference to Betamax, which was superior in quality, but the first release of VHS had tapes long enough for a game or a movie, Beta didn't, and the market couldn't wait). Bible-study groups helped subsidize the CD-ROM infrastructure, as it enabled distribution of full-text-in-parallel and concordance without a *large* stack of floppies. These industries have moved on... to the internet. (And various other opportunities rather beyond the scope of this debate.)
Now that you can buy enough "engineering workstation" for most tasks as off-the-shelf components, innovation has moved elsewhere.
Much of the bleeding edge of hardware today is in the embedded space -- the better gaming consoles are better computing platforms, for less money, than commodity "computers" costing some factor of N more, hence the number of "Linux on <console>" hacks (and vendor countermeasures).
Clustering/interconnect & virtualization are other areas still rife with innovation today.
Large SMP systems are borrowing more from mainframe stacks than from workstations today; the early adopters are businesses, not engineers. This doesn't stifle innovation... just the innovation that might eventually trickle down to your personal computer museum.
Regarding your red herring on Anime... If Anime is about big eyes and sexual content, Betty Boop had it first. If it's about neat SciFi toys in cartoons, Dick Tracy had it first. And General MacArthur is still partly responsible for the twisted nature of Japanese pr0n to this day -- if you don't get that reference, look it up. I really don't care, however, what you do for your jollies.
As to your final rudeness regarding "my blog": if you want your own private blog, you'd better host it on 127.0.0.1... or, if vanity requires public airing without criticism, uncheck the [x] Allow Comments box.
I should have guessed by your nick "scrottie" that I didn't want to know you.
Systems Software Research is Irrelevant, according to Rob Pike, due to Unix. Similar perspective in a different area.
Guess the future is bland.
Also, strange how most everyone else seems to have more luck with their commenters.