I'm wondering if we missed a golden age of computing by upgrading every time our equipment turned three. Older machines work quite well, but they sure do like to use power. Some older computers I have (like the dual-Xeon workstation) can heat the room they live in. Yet, even though they burn through electricity, they still have raw power and still get work done.
Newer chips seem to have slower clocks but far better efficiency. My modern laptop has a four-core i5 that gets 8 to 10 hours of battery life, and that just would not have been possible with the chips that ruled the earth from 2005 to 2008. Some of my old laptops blow a lot of heat. I had an old red Gateway with a Core 2 Duo that heated up my lap quite a bit. I gave it to my son (a common fate for my computers), and as far as I know, he still uses it today.
I have an old Dell from 2007 that, according to the Windows Experience Index, delivers a better experience than my wife's six-month-old i5 gaming rig. Back in the day, that kind of overlap was not possible. Today, though, old computers perform comparably to new ones, as long as they were good hardware to begin with.
I'm thinking we're spending too much on new computers. Tablets are usually less than $200 and are quite handy. By the way, few tablets have a four-core processor. Some have two cores, but single-core processors are more common. This suggests that even a late-model Pentium 4 has enough computing power for almost anyone. (And power to burn if you put Linux on it.)