It's news to me that scrypt with the default values is truly GPU resistant. As far as I understand, it's not that the operations themselves are hard for GPUs; it's that a certain amount of memory is required per scrypt hash (depending on the chosen parameters, of course). So with enough RAM, a GPU should outperform CPUs just as it does on SHA-256, no matter what parameter values are used. Correct me if I'm wrong.
Unless people start making custom boards for their GPUs (and RAM for GPUs is very expensive), high RAM usage is the way to make parallelization difficult.
Scrypt's RAM usage grows linearly with the number of rounds. Each round requires roughly 128 bytes of RAM (with the block size and parallelization parameters set to 1), so Litecoin's 1024 rounds require 128 KiB. Bump the number of rounds to 1,048,576 and it will require 128 MiB. A 7970 video card with 2048 shaders but only 3 GiB of RAM could then use only 24 of its shaders.
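A quick back-of-the-envelope check of those numbers (a sketch assuming scrypt's standard memory estimate of 128 · r · N bytes for the large array, with p = 1; the helper name is mine):

```python
# Approximate RAM needed by one scrypt instance: 128 * r * N bytes
# (r = block size parameter, N = number of rounds / cost parameter).
def scrypt_memory_bytes(n_rounds: int, block_size_r: int = 1) -> int:
    return 128 * block_size_r * n_rounds

litecoin = scrypt_memory_bytes(1024)      # Litecoin: N = 1024, r = 1
big      = scrypt_memory_bytes(1048576)   # N = 2^20

print(f"N=1024    -> {litecoin / 1024:.0f} KiB")    # 128 KiB
print(f"N=1048576 -> {big / 1024**2:.0f} MiB")      # 128 MiB

# A card with 3 GiB of RAM fits only this many parallel instances:
print(3 * 1024**3 // big)                           # 24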
+1. GPU coins aren't resistant to botnets either, so this is nothing different. I hope the coin has some genuinely unique features that will make it stand out from the rest.

Meh. Every computer has a CPU, but only a tiny fraction of them have dedicated GPUs.
Anyway: I think you guys are missing the point here. To get a pure proof-of-work mechanism, there is no need for an encryption/decryption function like scrypt; it can be done with simple hashing (which is what we are looking for). Of course, it's interesting to see how scrypt makes the processing cache-intensive, and that may be something we should include in our hashing function too.
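For what "simple hashing" proof of work means here, a minimal sketch (the header format and difficulty target are made up for the example; this is not any particular coin's scheme):

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Find a nonce so that SHA-256(header || nonce) has
    difficulty_bits leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

print("found nonce:", mine(b"example block header", 16))
```

Verifying takes one hash; finding the nonce takes about 2^difficulty_bits attempts on average, which is the asymmetry proof of work relies on.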
While scrypt uses Salsa20/8 (a reduced-round variant of the Salsa20 stream cipher) internally, scrypt itself is not reversible and is therefore not encryption. Scrypt is designed as a key derivation function and can be used as a deliberately slow hashing function.
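That key-derivation use looks like this in practice, via Python's standard library (hashlib.scrypt; the password, salt, and parameters below are illustrative):

```python
import hashlib

key = hashlib.scrypt(
    b"correct horse battery staple",  # password
    salt=b"some-random-salt",
    n=1024,    # cost parameter (number of rounds, a power of two)
    r=1,       # block size parameter
    p=1,       # parallelization parameter
    dklen=32,  # derived key length in bytes
)
print(key.hex())
```

There is no inverse operation: given the derived key, you cannot recover the password, which is exactly why scrypt is a KDF/slow hash rather than encryption.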