The problem isn't the hardware, it's the software running the site... The software:
1. Provides more features, which consume the available hardware... (A good thing, as long as the new features actually improve productivity.)
2. Pushes more work to the client in the form of JavaScript (and Flash / Java applets to a lesser extent). Rather than the server building a complete page and sending it to the client, it now sends itty-bitty pieces at a time and is expected to answer all the extra requests made by the JavaScript libraries that "power web 2.0 technologies". So you get a latency vs throughput problem in the communications (see the sketch after this list)... I think blogger.com sites are a good example - the entire client side is driven 100% by JavaScript (blogger.com sites give you a blank page if you turn JavaScript off). All their sites seem slow and respond to events with high latency...
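A rough sketch of that latency problem (the endpoints and field names below are invented purely for illustration, not any real site's API): when each small request depends on the previous one, every request pays its own network round trip, whereas the old server-built page pays the round-trip latency roughly once.

// Hypothetical endpoints, made up for this example.
// "Web 2.0" style: the page assembles itself from several small, dependent
// requests, and each one pays a full round trip before the next can start.
async function renderSpaStyle(): Promise<void> {
  const user = await fetch("/api/user").then(r => r.json());                              // round trip 1
  const posts = await fetch(`/api/posts?user=${user.id}`).then(r => r.json());            // round trip 2
  const comments = await fetch(`/api/comments?post=${posts[0].id}`).then(r => r.json());  // round trip 3
  document.body.textContent = `${posts.length} posts, ${comments.length} comments loaded`;
}

// Old style: the server builds the complete page and the browser fetches it
// in one go, so the user pays the round-trip latency only once plus transfer time.
async function renderServerBuiltPage(): Promise<void> {
  const html = await fetch("/page.html").then(r => r.text());                             // one round trip
  document.body.innerHTML = html;
}

On a connection with a 100ms round trip, the first version burns roughly 300ms of pure latency before anything useful appears, no matter how fast the server is; the second costs roughly one round trip plus transfer time.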
Part of this stems from a lot of newer "web developers" never having written code in a constrained environment, so they don't think much about resources. (Some developers don't realise that the new JavaScript library they just added as a requirement for one new function weighs in at over 1MB - yes, jQuery, I'm looking at your bloated arse - yes, I know you can be trim and lean, but tell people how to do it; see the sketch below for the "one function" case.) If it works fine on their one desktop PC, they don't give a sh*t about anyone else... (Mind you, the same can be said for a lot of new software developed these days, especially in the enterprise market.)
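A minimal sketch of that "one function" case (the element id is invented for the example): if all you need is to hide one element on the page, a few lines of plain DOM code do the job without shipping a whole library to every visitor.

// With the library, every visitor downloads and parses all of jQuery just so
// the page can call something like: $("#signup-banner").hide();
// The same effect in plain TypeScript/JavaScript, with no extra download:
function hideBanner(): void {
  const banner = document.getElementById("signup-banner"); // hypothetical element id
  if (banner) {
    banner.style.display = "none";
  }
}

That obviously doesn't replace a library when you genuinely need a lot of what it offers, but it's the trade-off worth weighing before adding a dependency for one call.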
Beta Slashdot is a good example of the new mentality vs the old one: lots of JavaScript, very dynamic, but slow and horridly broken... The classic site loads a bucketload of information very quickly and actually works (because it's not as dynamic as the new one). Classic Slashdot also works perfectly fine without JavaScript; the new one doesn't work at all...
There is also a push towards programming languages at a higher level of abstraction... Back in the really old days, if you wanted dynamic content you had to write your code in C and build it as a CGI program for your webserver; a little later, Perl became the usual choice for dynamic content... Newer sites tend to use PHP (which is heavily optimised for the web), but it is still slower than a CGI program written in C, since this is a native code vs interpreted code issue. A good webserver will, however, compile and cache the PHP/C#/Java code (opcode caches, JIT compilers), so for performance it's largely moot, but it is still an issue.