What's wrong with this?
I can see some potential problems here...
You gave the example of Usenet feeds. However, Usenet messages are static once they're sent - you can't "back up" in the news feed stream and edit a message in any way. Web pages, on the other hand, can change at any time. I could write a script that makes one trivial tweak to a (theoretical) gig of documents on my server, and your service would then have to re-propagate all of it throughout the network. Keeping up with the web would be a huge job.
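Just to make that concrete, here's a rough sketch of the kind of script I mean (Python; the document root and the edit itself are made up for illustration):

    import os

    DOCROOT = "/var/www/htdocs"  # hypothetical document root

    # One trivial pass over the whole tree. Every file this rewrites is
    # now "dirty" as far as any mirror or cache of it is concerned.
    for dirpath, dirnames, filenames in os.walk(DOCROOT):
        for name in filenames:
            if not name.endswith(".html"):
                continue
            path = os.path.join(dirpath, name)
            with open(path) as f:
                text = f.read()
            with open(path, "w") as f:
                # A cosmetic one-line change forces a full re-propagation.
                f.write(text.replace("(c) 2000", "(c) 2001"))

A few seconds of my CPU time, and your network has a gig of content to push around all over again.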
What you're talking about is, basically, indexing and caching the entire web, or at least the portion running your software - a job that huge companies with enormous server farms and massive budgets haven't come close to accomplishing.
And what about online scripts? Host sites would have to be able to generate dynamic content, which is a rather large part of the web today (tangent: are there any stats on how much of the web actually is dynamic? that could be interesting). This really isn't feasible unless you restrict users to one language and one feature set; most sites use a pretty wide variety of languages / Apache modules / add-ons / shiny super duper new scripting gadgets. And if you do allow unrestricted scripting, that opens up a whole set of security concerns.
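To see why dynamic pages break the whole caching idea, here's a toy CGI-style script (hypothetical, and any real site's scripts are far hairier). The output differs on every request, so a mirrored copy is stale the moment it's made:

    #!/usr/bin/env python
    import time

    # Minimal dynamic page: emit the CGI header, then content
    # that changes on every single hit.
    print("Content-Type: text/html")
    print()
    print("<html><body>")
    print("<p>Server time: %s</p>" % time.ctime())
    print("</body></html>")

And that's the trivial case - to reproduce anything real, a mirror host would also need my database, my session state, and my exact module setup.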
The benefits of this would be marginal; most sites don't max out their bandwidth / hardware capacity very often, and if they do, it's time for an upgrade. So this would only help with huge traffic spikes - say, the /. effect.
Instead of taking on this huge project, which you pitch as a way to combat the /. effect but which has limited usefulness otherwise, why don't you just whip up some scripts to go out and mirror all the static content from every page Slashdot links to? *grin*
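Half-joking, but here's roughly what such a script could look like (a sketch only - the crude regex link extraction and flat file naming are stand-ins for a real tool like wget -m):

    import re
    import urllib.request

    FRONT_PAGE = "https://slashdot.org/"

    # Grab the front page and pull out every absolute link.
    html = urllib.request.urlopen(FRONT_PAGE).read().decode("utf-8", "replace")
    links = re.findall(r'href="(https?://[^"]+)"', html)

    for url in links:
        if "slashdot.org" in url:
            continue  # only mirror the external stories
        filename = re.sub(r'\W+', '_', url)[:100]  # flatten URL to a local name
        try:
            urllib.request.urlretrieve(url, filename)
            print("mirrored", url)
        except OSError as err:
            print("skipped", url, err)

Run it the moment a story goes up, and the static parts are saved off before the poor server melts.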
And if you do try to pull this off... best of luck; it could be a neat project after some tweaking of the concept.