We have long had a culture of waste. Before the 1970s, the average appliance, toy, or tool was designed to be maintained. They used screws and removable covers, and spare parts were often easy to get, even if you had to buy a used unit to cannibalize for parts.
eBay is the modern version of this. People buy non-functioning units from their former owners, strip out the working parts, and sell them back to other users who know how to fix their own widgets. Until manufacturers start potting everything in epoxy (which they never will, because then even they would have to throw away units that fail QC instead of repairing them), they won't be able to kill the market for replacement parts.
Since the electronics revolution of the '70s, more and more devices have been transistorized and marked "No user-serviceable parts inside". Fuses went from little glass tubes on the back of the TV to rice-grain-sized devices on a circuit board. Replace instead of repair became the norm.
Be careful what you classify as "waste", now. Throwing away a burned-out IC is much less wasteful than throwing out a bin full of burned-out vacuum tubes. The problem, I suppose, is that nobody wants to learn how to solder anymore, or else they don't feel like shipping the dead unit back to the manufacturer so it can wave-solder a new IC into place and put the unit back into circulation. It's really the user's fault for being lazy, not the manufacturer's, at least insofar as the difficulty of servicing "comes with the turf" rather than being engineered in to discourage repairs.
While we think of the automobile as the poster boy for planned obsolescence, the computer is really the leader in this area. By design, each version of Microsoft's products requires better hardware. By design, they refuse to provide support for older versions and will not license them to anyone else. By design, they lobbied for the Digital Millennium Copyright Act, which makes it illegal for anyone else to provide fixes for their products.
I have two computers at home; one is a Sony laptop from 2004, and the other is a self-built desktop PC from 2003. (My work computer was built in 2001.) The Sony has been upgraded with a solid-state hard drive and an 802.11n wireless card (which required taking the entire lower half of the case off, yeesh); the old hard drive got put into a hot-swappable external enclosure and is now used for backing up files on my desktop PC. The desktop PC, on the other hand, has had virtually every original component replaced since I first bought it, and the originals are all still in use in computers I have since built for my family. A few months ago I took the remaining spare parts and built a clunker to donate to a charity; it works fine, though it's obviously pretty slow by today's standards. At the same time, computer speed has leveled off; the parts go as fast as they can now, give or take a little finessing.
The end result has been a self-perpetuating cycle of needing new computers to run the new software you have to buy in order to run current applications. The new computers, of course, have to have the new OS, ensuring that you have to upgrade your applications too.
To a certain extent that's true, because the engineers (such as myself) have been trying to push the technology to its limit, much like what happened with the combustion engine from the '70s to the early 2000s. Once that practical limit is reached, however, only incremental progress is made from then on, and the software running on that hardware will likewise reach a limit of what it can do in a reasonable amount of time.
This could only happen if we are already conditioned to the idea that things are disposable.
I see a different problem: most people are too damned lazy to figure out how they can use what they already have to suit their needs at any given moment. (It may be egotistical of me, but I'm going to exempt myself from this, because I reuse just about everything at least once.) To a certain extent, this can be blamed on slick marketing, as evidenced by the people standing in line to get a new Apple iPad when it doesn't even do anything their iPhones can't do; it's just a different size. (Woo, that's exciting. No, really.) But past that extent, it is, once again, the fault of the consumer for not looking at what they've got on hand and applying their ingenuity (or a smart friend's ingenuity) to turn it into what they want to have. I realize the human brain burns 20% of a person's daily calorie intake even when they're slacking off and watching TV, but calories are so cheap these days that there's no excuse to indulge the old instinct to think as little as possible.
Remember, when you point a finger at someone else, your other three fingers are pointing back at you. Fix your own bad habits first, then blame The Man and his Establishment for causing all your (remaining) problems.