Post
Topic
Board Altcoin Discussion
Re: [XPM] Working on a GPU miner for Primecoin, new thread :)
by
bcp19
on 05/09/2013, 22:17:01 UTC
It seems like memory corruption, the most annoying kind of bug. Might take some time to fix, I haven't been able to reproduce it on my computer.
This is a serious problem that may not have a fix.  I have a GPU that I still have running on SHA256 because it starts throwing massive errors if I try to use the same intensity on scrypt coins.

I don't think this is a GPU error, but rather an error in the CPU code somewhere.
The reason I brought it up is that several people had memory problems on relatively new cards, and one of the posters explained the tolerances of the GDDR5 memory used in GPUs compared to the DDR3 used in computers.  http://www.mersenneforum.org/showthread.php?t=17598 is the initial discussion.
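The instability discussed in that thread is exactly what GPU memory testers (MemtestG80 and the like) look for: write a known bit pattern, read it back, and count flipped bits. A minimal host-side sketch of the idea (illustrative only; a real tester runs this loop in a GPU kernel over device memory, and the function name here is mine):

```python
# Host-side sketch of the write/readback check GPU memory testers perform.
# An ordinary Python list stands in for the test region, so on a healthy
# machine the error count should be 0.

def pattern_test(buf_len, pattern):
    """Fill a buffer with `pattern`, read it back, count flipped bits."""
    buf = [pattern] * buf_len              # write phase
    bad_bits = 0
    for word in buf:                       # readback phase
        bad_bits += bin(word ^ pattern).count("1")
    return bad_bits

# Alternating bit patterns help catch stuck or coupled memory cells.
errors = pattern_test(1 << 20, 0xAAAAAAAA) + pattern_test(1 << 20, 0x55555555)
print("bit errors:", errors)               # 0 on healthy memory
```

On a card with the "some percent of bad bits" tolerance described below, the device-memory version of this loop reports nonzero errors at stock clocks.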

The answer given to the question "Why is GPU memory weaker than RAM?" was:

Quote
That is normal for video cards. I have commented on this a few times already: the video card industry is the only one (of all the electronics-related branches) that accepts memory with "some percent" of unstable or bad bits. This is done to make cheap cards for people like you and me, and it is accepted because your eyes will see no difference (even with specialized instruments it is difficult to see) between a pixel on the screen which is Green-126 and one which is Green-127.

Your monitor has only 18 physical "wires" (6 each for Red, Green, and Blue) through which the color is transmitted, so it CAN'T show more than 2^18 = 262144 colors, and your eyes may distinguish more than 4000 only if you are a professional photographer or painter (or a woman, hehe; my wife always tells me that I only see 16 colors, and indeed, for me "peach" is a fruit, not a color, and "rose" is a flower). So displaying 24-bit or even 32-bit color exists only because old processors used a full byte for each of r/g/b, or because new processors can operate faster on 32-bit registers. Of the 8 bits of red, only the 6 most significant go to the monitor. When you use your monitor in 16-bit mode, there are 5 lines each for red and blue, and 6 for green (as the human eye is more sensitive to green); the rest of the color "bits" are "wasted" anyhow (physically, the lines are not connected to the LCD controllers; I am not talking about serialization, LVDS, and the other stuff your card uses to send signals to your monitor, only about what happens at the LCD controller level).

That is why a Tesla C/M2090 is 4-6 times the price of a gtx580; there is no difference between them except the ECC memory and intensive production testing for endurance.

--------------------------------------------------------------------------------
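The color-depth arithmetic in the quote checks out, and is easy to verify (a quick sketch; the `distinct_colors` helper is mine, nothing GPU-specific assumed):

```python
# Sketch of the color-depth arithmetic from the quote above.

def distinct_colors(bits_r, bits_g, bits_b):
    """Number of colors representable with the given bits per channel."""
    return 2 ** (bits_r + bits_g + bits_b)

# 6 physical lines per channel at the panel: 2^18 colors
print(distinct_colors(6, 6, 6))        # 262144

# 16-bit mode: 5 bits red, 6 green, 5 blue (the eye is most sensitive to green)
print(distinct_colors(5, 6, 5))        # 65536

# Only the 6 most significant of 8 bits reach the panel, so Green-126 and
# Green-127 collapse to the same displayed value:
print(126 >> 2 == 127 >> 2)            # True
```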

One possible remedy postulated in that thread is downclocking the memory.