According to the Litecoin wiki mining hardware comparison, an AMD Phenom X4 955 at 3.6 GHz gets 24 kH/s @ 125 watts. This translates to 0.192 kH/s per watt.
A GPU rig consisting of 69xx series GPUs can produce 998 kH/s @ 920 watts at the wall. This translates to 1.08 kH/s per watt.
So does an increase in *efficiency* by a factor of at least 5.6 qualify as "significantly more"?
Consider the Litecoin wiki entry for the Intel Core i7 860, which produces 25 kH/s at 153 watts (a believable power draw for the entire system). That gives a system efficiency of only 0.163 kH/s per watt, making the GPU example a factor of 6.6 more efficient.
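Just to make the arithmetic explicit, here is a minimal sketch of the calculation, using only the hashrates and wattages quoted above (nothing new is measured here):

```python
# Efficiency comparison in kH/s per watt, using the figures quoted above.

def efficiency(hashrate_khs: float, watts: float) -> float:
    """Hashing efficiency in kH/s per watt."""
    return hashrate_khs / watts

gpu_rig = efficiency(998, 920)   # 69xx GPU rig, measured at the wall
phenom  = efficiency(24, 125)    # AMD Phenom X4 955 @ 3.6 GHz
i7_860  = efficiency(25, 153)    # Intel Core i7 860, whole-system wattage

print(f"GPU rig: {gpu_rig:.3f} kH/s per watt")
print(f"Phenom : {phenom:.3f} kH/s per watt ({gpu_rig / phenom:.1f}x less efficient)")
print(f"i7 860 : {i7_860:.3f} kH/s per watt ({gpu_rig / i7_860:.1f}x less efficient)")
```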
To make a fair comparison:
These numbers are measured on my BTC mining rigs. The point is that I only count the extra CPU wattage of each rig, because the rigs are already mining BTC and a CPU coin is mined on top when one is available; when there is no coin worth mining, I save those watts.
core i3 / 2.4 GHz / 3 threads / ~8 W extra / 10 kH/s => 1.25 kH/W
CPU vs. GPU: the CPU is 1.157x as efficient (more efficient)

core i3 / 3.1 GHz / 3 threads / ~15 W extra / 12.5 kH/s => 0.83 kH/W
CPU vs. GPU: the CPU is 0.768x as efficient (less efficient)

core i7 / 3.6 GHz / 7 threads / ~46 W extra / 32 kH/s => 0.695 kH/W
CPU vs. GPU: the CPU is 0.644x as efficient (less efficient)

For me, I can't see the 5x or 6x higher efficiency factor of the GPU...
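A quick sketch of the marginal-wattage comparison above, against the same GPU baseline of 1.08 kH/s per watt (the wattages are only the extra CPU draw I quoted, not whole-system power):

```python
# CPU-vs-GPU efficiency, counting only the extra CPU wattage of each rig.

GPU_BASELINE = 998 / 920  # kH/s per watt for the 69xx rig at the wall

cpus = [
    ("core i3 @ 2.4 GHz, 3 threads", 10.0, 8.0),    # (label, kH/s, extra watts)
    ("core i3 @ 3.1 GHz, 3 threads", 12.5, 15.0),
    ("core i7 @ 3.6 GHz, 7 threads", 32.0, 46.0),
]

for label, khs, watts in cpus:
    eff = khs / watts
    print(f"{label}: {eff:.3f} kH/W, {eff / GPU_BASELINE:.3f}x the GPU's efficiency")
```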