I don't know that it's wasted power, exactly. The way I see it, there's something like a domino effect going on: FPGAs are uprooting GPUs for the sha256d proof of work, while GPUs have just made their debut on the scrypt proof of work. As for whether CPU-mining LTC is worth the trouble: look at the marginal power cost if the machine would be on anyway, or at the total power cost if not. Should they ever hit the market, ASICs might well clobber current FPGA board owners, and since the scrypt proof of work requires much more memory, those boards have no real scrypt fallback, so their owners might be SOL (barring the Butterfly Labs metamorphosis trade-up offer).
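For what it's worth, here's that rule as a back-of-envelope sketch in Python; the function names are mine and every input figure is a placeholder for illustration, not a measurement:

```python
def monthly_power_cost_usd(watts, usd_per_kwh, hours=24 * 30):
    """Cost of drawing `watts` continuously for roughly a month."""
    return watts / 1000.0 * hours * usd_per_kwh

def mining_nets_out(payout_ltc_per_month, ltc_price_usd,
                    marginal_watts, wall_watts, usd_per_kwh,
                    machine_on_anyway):
    # If the box would be powered up regardless, only the extra draw caused
    # by mining counts; otherwise the whole at-the-wall figure does.
    watts = marginal_watts if machine_on_anyway else wall_watts
    return payout_ltc_per_month * ltc_price_usd > monthly_power_cost_usd(watts, usd_per_kwh)
```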
An Atom D525 has a marginal power cost of only about 3W over idle for 5.5 kH/s, which (while small) is a net win. Extrapolating current values, that's an expected payback of 18.62 LTC a month, which should cover even fairly outrageous energy pricing if the machine is otherwise left on anyway. On the other hand, that same desktop with SSD and nVidia ION graphics has an active-idle draw of about 20W at the wall; this makes it a net loss even at 0.075 USD/kWh if the machine is not required to be left on. (You can squeeze ~1 MH/s out of the ION mining BTC, but that's not enough to change the equation.)
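Plugging the Atom numbers into that sketch (reusing monthly_power_cost_usd from above, and leaving the LTC exchange rate as the free parameter, since that's the part that moves):

```python
rate = 0.075  # USD/kWh, the pricing quoted above
print(monthly_power_cost_usd(3, rate))   # ~0.16 USD/month for the +3 W marginal draw
print(monthly_power_cost_usd(20, rate))  # ~1.08 USD/month for the 20 W wall draw
# So the 18.62 LTC/month payout only has to beat ~0.16 USD of electricity if the
# machine stays on anyway, but ~1.08 USD if it's powered up solely to mine.
```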
Then there's always the gaming potential of a good GPU. Let it mine for you when you're not keeping it busy blowing the living daylights out of something.
