
Showing 4 of 4 results by dogelimic
Board: Mining (Altcoins)
Re: [ANN] cudaMiner - a new litecoin mining application [Windows/Linux]
by dogelimic on 07/01/2014, 15:43:31 UTC

> Are you using the latest version from git or a binary release? I see a commit message on Dec 28th that says "add back support for chunked memory allocation and texture cache to Kepler kernel. Slight speed-ups with -C 1 are seen." which implies he removed it at one point, possibly when upgrading to CUDA 5.5.


Was using the latest binary, the 12-18 release. I'll try compiling from git, thanks.

edit: Is there a guide somewhere to compiling on windows? I got all the components but I'm unsure what to do with them.
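For reference, the typical Linux build from git looks roughly like the sketch below. The repository URL and the autotools steps are assumptions based on how similar miners build; on Windows the project is normally built with the Visual Studio solution shipped in the repo rather than these commands.

```shell
# Hedged sketch of a Linux build of cudaMiner from git (paths/steps may differ).
# Windows builds instead use the Visual Studio project files in the repository.
git clone https://github.com/cbuchner1/CudaMiner.git
cd CudaMiner
./autogen.sh    # generate the configure script (assumes autotools installed)
./configure     # assumes the CUDA toolkit is installed and on PATH
make
```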
Board: Mining (Altcoins)
Re: [ANN] cudaMiner - a new litecoin mining application [Windows/Linux]
by dogelimic on 07/01/2014, 09:34:08 UTC
Can someone please explain this to me? It looks contradictory.

-C, --texture-cache   comma separated list of flags (0/1) specifying which of the CUDA devices shall use the texture cache for mining. Kepler devices will profit.

This says Kepler devices will profit.

GPU #0: GeForce GTX 770 with compute capability 3.0
GPU #0: the 'K' kernel ignores the texture cache argument

My 770 is a Kepler device, but cudaMiner says that my kernel ignores the texture cache...

So which is it, and how will my device profit from a launch option that is ignored?
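As an aside, the help text describes -C as one 0/1 flag per device, comma-separated; whether a given kernel then honors the flag is a separate question. A minimal sketch of how such a per-device option can be parsed (a hypothetical helper for illustration, not cudaMiner's actual code):

```python
def parse_device_flags(arg: str, num_devices: int) -> list[bool]:
    """Turn a comma-separated 0/1 list (e.g. "0,1") into per-device booleans.

    A single value is broadcast to every device, mirroring how per-device
    cudaMiner-style options are commonly described. Hypothetical helper.
    """
    parts = [p.strip() for p in arg.split(",")]
    if len(parts) == 1:
        parts *= num_devices          # one value applies to all devices
    return [p == "1" for p in parts]
```

So `-C 0,1` would enable the texture cache only on the second device, while `-C 1` would request it on all of them.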
Board: Mining (Altcoins)
Re: [ANN] cudaMiner - a new litecoin mining application [Windows/Linux]
by dogelimic on 05/01/2014, 14:32:36 UTC
> What is the trick?
> The trick is to force computation onto the 3D side. You do this by starting cudaMiner while 2D is running, and then disabling 2D. This keeps 2D running for the program only and forces the 2D computation into the 3D program. It's like running an 8-bit program on a 32-bit system without emulation.
> I'd also like to understand how this works... disable 2D while it's running?
> Just turn on the overclock and then start cudaMiner. Then you have to disable 2D in GPU Tweak. That is the process. I would like to find out why the scrypt results are not being submitted or accepted. I would need someone who knows about programming and scrypt computations to find the issue and possibly fix it. This would be a huge boost if we can get it working. It could all be for naught, though.
This is what I get whenever I try. I've tried a number of launch configurations, but they don't seem to make any difference.

GPU #0: GeForce GTX 560 Ti result does not validate on CPU (i=5432, s=1)!
GPU #0: GeForce GTX 560 Ti result does not validate on CPU (i=5859, s=1)!
GPU #0: GeForce GTX 560 Ti result does not validate on CPU (i=4586, s=0)!
GPU #0: GeForce GTX 560 Ti result does not validate on CPU (i=923, s=1)!
GPU #0: GeForce GTX 560 Ti result does not validate on CPU (i=2834, s=0)!
GPU #0: GeForce GTX 560 Ti result does not validate on CPU (i=6629, s=0)!
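For background, a "result does not validate on CPU" line means the miner recomputed the scrypt hash on the CPU and it disagreed with what the GPU produced, so the share is discarded rather than submitted. Conceptually, the CPU-side cross-check looks like the sketch below; it is a hedged Python illustration using hashlib.scrypt with Litecoin's proof-of-work parameters (N=1024, r=1, p=1, header used as both password and salt), not the actual cudaMiner code.

```python
import hashlib

def cpu_validate(block_header: bytes, gpu_hash: bytes) -> bool:
    """Recompute the scrypt proof-of-work hash on the CPU and compare it with
    the hash the GPU produced. Assumes Litecoin-style parameters: N=1024,
    r=1, p=1, 32-byte output, with the header as both password and salt."""
    cpu_hash = hashlib.scrypt(block_header, salt=block_header,
                              n=1024, r=1, p=1, dklen=32)
    return cpu_hash == gpu_hash
```

A steady stream of mismatches like the log above usually points at the GPU computing garbage (e.g. from an unstable overclock) rather than at a pool or protocol problem, since the check never leaves the local machine.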
Board: Beginners & Help
Topic OP: Hello
by dogelimic on 05/01/2014, 07:58:08 UTC
Been lurking for a while; I'd like to participate in the altcoin mining forums.

dogecoin to the moon!!!!