That's not how scrypt GPU mining works. You are implying that the GPU memory is not used at all, but that's bullshit (just downclock the GPU memory and watch your hash rate drop). You are implying that memory latency is somehow the important factor, but that's also bullshit: memory bandwidth is the limiting factor, because the latency of any single access is hidden by the thousands of other hashes in flight. And you are implying that only a single 128 KiB scratchpad is used for the whole GPU (or per SIMD unit), which is also wrong. In fact thousands of hashes are calculated simultaneously, and each one of them needs its own scratchpad (of configurable size, not necessarily 128 KiB). You really have no idea what you are talking about.
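To make the "one scratchpad per hash in flight" point concrete, here is a minimal CUDA sketch. It is a toy, not a real miner: the sizes and thread counts are illustrative assumptions, and a cheap LCG stands in for the Salsa20/8 core. What it does show is the real memory layout and access pattern: each thread owns a private slice of global (off-die) memory, fills it sequentially, then reads it back at data-dependent offsets, which is exactly why scrypt on a GPU ends up bandwidth-bound when thousands of these run at once.

[code]
#include <cstdio>
#include <cstdint>
#include <cuda_runtime.h>

constexpr int SCRATCH_WORDS = 128 * 1024 / sizeof(uint32_t); // 128 KiB default (N=1024, r=1)
constexpr int NUM_HASHES    = 8192;                          // thousands of hashes in flight

__global__ void scratchpad_demo(uint32_t *scratch, uint32_t *out)
{
    int tid = blockIdx.x * blockDim.x + threadIdx.x;
    if (tid >= NUM_HASHES) return;

    // Each thread owns its own SCRATCH_WORDS-sized slice of global memory.
    uint32_t *pad = scratch + (size_t)tid * SCRATCH_WORDS;

    // Phase 1 (sequential writes): fill the scratchpad, as scrypt's first
    // SMix loop does with successive Salsa20/8 outputs.
    uint32_t v = 0x9E3779B9u ^ (uint32_t)tid;
    for (int i = 0; i < SCRATCH_WORDS; ++i) {
        v = v * 1664525u + 1013904223u;   // LCG stand-in for the real hash core
        pad[i] = v;
    }

    // Phase 2 (pseudo-random reads): the next index depends on the running
    // value, like scrypt's second loop. The latency of any one read hardly
    // matters because thousands of threads keep the memory bus saturated;
    // aggregate bandwidth is what limits the hash rate.
    uint32_t acc = v;
    for (int i = 0; i < SCRATCH_WORDS; ++i) {
        acc ^= pad[acc % SCRATCH_WORDS];
        acc = acc * 1664525u + 1013904223u;
    }
    out[tid] = acc;
}

int main()
{
    uint32_t *scratch = nullptr, *out = nullptr;
    size_t scratch_bytes = (size_t)NUM_HASHES * SCRATCH_WORDS * sizeof(uint32_t);

    // ~1 GiB of scratchpads in total: this can only live off-die.
    if (cudaMalloc(&scratch, scratch_bytes) != cudaSuccess ||
        cudaMalloc(&out, NUM_HASHES * sizeof(uint32_t)) != cudaSuccess) {
        fprintf(stderr, "allocation failed (needs ~1 GiB of device memory)\n");
        return 1;
    }

    scratchpad_demo<<<NUM_HASHES / 256, 256>>>(scratch, out);
    cudaDeviceSynchronize();

    printf("ran %d concurrent 'hashes' over %zu MiB of scratchpads\n",
           NUM_HASHES, scratch_bytes >> 20);
    cudaFree(scratch);
    cudaFree(out);
    return 0;
}
[/code]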
+1
I was a bit surprised and taken aback by DeathAndTaxes's description of scrypt mining on GPUs and the apparent misunderstanding of how it is accomplished, given his post history. The idea that GPU scrypt implementations do not store the scratchpads in external RAM and can instead fit them in on-die RAM (with more than a handful of shaders processing scrypt at once) is way, way off: a single default scratchpad is already 128 KiB, more than the local memory of an entire compute unit, and the thousands of concurrent hashes needed to keep a GPU busy add up to hundreds of megabytes of scratchpad space. See the back-of-the-envelope check below.
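For anyone who wants the arithmetic spelled out, here is the capacity check as a trivial program. The numbers are assumed, era-typical values (64 KiB of LDS per GCN compute unit, ~8192 hashes in flight), not measurements from any specific card or miner:

[code]
#include <cstdio>

int main()
{
    const double scratchpad_kib    = 128.0;  // default scrypt (N=1024, r=1)
    const double lds_per_cu_kib    = 64.0;   // typical GCN local data share per CU
    const double concurrent_hashes = 8192.0; // roughly what's needed to hide latency

    // Not even one scratchpad fits in a compute unit's on-die local memory.
    printf("scratchpads that fit in one CU's LDS: %.1f\n",
           lds_per_cu_kib / scratchpad_kib);

    // Total demand lands squarely in external-RAM territory.
    printf("total scratchpad demand: %.0f MiB\n",
           concurrent_hashes * scratchpad_kib / 1024.0);
    return 0;
}
[/code]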
EDIT - Reviewing DeathAndTaxes's post history going back to the early days of Litecoin, I'm stumped. Hey DeathAndTaxes, were you just trolling? Or has someone hacked your account and posted that as a joke at your expense?