Cutting edge? There have been so few real advances in computers over the last 10 years it is pathetic. Baby steps is all they are.
I remember reading a report about a major new idea in computing called "analytical algorithms" (I think) about 10 years ago. Its starting point was that to get a 50% chance of finding 1 particular item out of 100,000 in a database, you'd have to search half of those items, or N/2 (N being the number of items), but someone got one of these algorithms to work in about the square root of N steps, which is roughly 158 times faster. And at the time there was said to be a big push in this area at universities, but I cannot see that any of it was ever brought into personal computers, even if they got more of these algorithms working.
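For what it's worth, here is a quick back-of-the-envelope check of that 158x figure, assuming the claim is that the search cost drops from about N/2 steps to about sqrt(N) steps (the numbers and variable names are just for illustration):

```python
import math

# Rough arithmetic behind the "158 times faster" figure:
# a plain search through N items looks at about N/2 of them on average,
# while the faster method is assumed to need about sqrt(N) steps.
n = 100_000

classical_steps = n / 2        # ~50,000 items examined on average
sqrt_steps = math.sqrt(n)      # ~316 steps

speedup = classical_steps / sqrt_steps
print(f"average search: {classical_steps:.0f} steps, "
      f"sqrt(N) search: {sqrt_steps:.0f} steps, "
      f"speedup: ~{speedup:.0f}x")   # prints roughly 158x
```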
Newer programs run at the same speed or slower than the programs they replaced. New CPUs are hardly faster than ones rated 50% slower. I see the problem being how software is coded. There is no reason that an OPERATING SYSTEM should require 1GB of RAM as Vista does. An OS should just run OTHER programs, not have so many hidden internal programs running that it requires so many more resources. Programs have become so bloated because everyone wants every function available from a right-click menu, which requires more code to be kept in RAM for that quick access.
In order for software to become more user friendly for the common user, so much extra stuff has to be put into it, and hence into RAM and the swap file, that programs cannot run fast anymore.