Ted Leung picked up on something that Bill Pugh (yes, that Bill Pugh) is working on: code optimization in compilers.
Turns out, CPU power doubles every ~18 months (Moore's Law), but advances in optimizing compilers take ~18 years to double the performance of your compiled code (Proebsting's Law). On the one hand, if you're lazy and just buy new hardware early and often, your code will run fast enough if you wait long enough. On the other hand, if you spend huge amounts of money getting smart people to work on compiler optimization, you'll see roughly 4% speed increases annually.
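(If you want to check that 4% figure, here's a quick back-of-the-envelope sketch in Java; the class name is my own invention, not anything from Pugh. Doubling every N years compounds to an annual factor of 2^(1/N).)

    // Back-of-the-envelope: doubling every N years compounds to an
    // annual factor of 2^(1/N).
    public class DoublingRates {
        public static void main(String[] args) {
            double hardware  = Math.pow(2, 12.0 / 18.0); // doubles every 18 months
            double compilers = Math.pow(2, 1.0 / 18.0);  // doubles every 18 years
            System.out.printf("hardware:  +%.1f%% per year%n", (hardware - 1) * 100);
            System.out.printf("compilers: +%.1f%% per year%n", (compilers - 1) * 100);
            // prints roughly +58.7% vs. +3.9%, hence "about 4% a year"
        }
    }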
(Let's not forget: the classic text on compilers is ~15 years old. There is very little new work in the field, and certainly nothing so groundbreaking that it is supplanting the Dragon Book.)
That's not an excuse to be lazy and write bad code. Pugh's point is that we need to stop focusing on what instructions the CPU is executing and focus more on how we write code. For example, if you need your Java program running faster today, one very workable alternative is to translate it into C/C++. A better alternative is to find the stupid things you are doing that senselessly waste time and stop doing them! (Pugh uses the example of synchronized access to thread-local variables: if they're thread-local, then why are you synchronizing access to them??!?! See the sketch below.)
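Here's a minimal Java sketch of that antipattern (the class and method names are mine, not Pugh's actual example): a per-thread counter guarded by a lock it doesn't need.

    public class RequestStats {
        // Each thread gets its own private copy of this counter;
        // no other thread can ever see it.
        private static final ThreadLocal<Integer> hits =
                ThreadLocal.withInitial(() -> 0);

        // Wasteful: grabs the class-wide lock to touch data that only
        // the current thread can reach.
        public static synchronized void recordSlow() {
            hits.set(hits.get() + 1);
        }

        // Same effect, no lock: thread-local state needs no synchronization.
        public static void recordFast() {
            hits.set(hits.get() + 1);
        }
    }

The two methods do exactly the same work; the synchronized keyword on the first buys you nothing but contention on the class lock.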