So I'm more or less a noob at all the ins and outs of cards, drivers and SDKs, and I've tried to follow what's said here and stick to the basics when setting up and running various miners. I went ahead and updated my drivers today while leaving the SDK at 2.4, and I'm seeing what I think is a small decrease in performance. I also updated cgminer from 2.3.1 to 2.3.6, and if anything I find it slower, but that could just be the drivers.
Anyway, that's not really my comment or question. I'm more interested in knowing where the -I (intensity) setting comes into play. My command line has been all default except for -I 9 forever, mainly because somebody else suggested that setting. Since I was playing with things today I thought I'd try other intensity levels to see what happens, and the difference in CPU use was HUGE. Just about every setting from 1 to 10 gave me the same 50% CPU use I'd been seeing and accepting as normal, but at -I 7 cgminer only uses 1-5% CPU. Why would CPU use be higher at 5 or 6 than at 7? I could understand 8, 9 and 10 if it were a rising scale. I'm seeing maybe 7 Mh/s less with 2.3.6 and intensity 7, but my CPU is idling while the GPU runs maybe a touch hotter, and overall machine heat is way down thanks to the lower CPU use.
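For reference, my command line looks roughly like this (the pool URL, worker name and password below are just placeholders, not my real details):

cgminer -o http://pool.example.com:8332 -u myworker -p mypass -I 7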
CARD: 5770
GPU/Mem: 975/350 MHz
OS: Win7X64
SDK: 2.4
Catalyst: 12.4