Topic: Re: CCminer(SP-MOD) Modded NVIDIA Maxwell / Pascal kernels.
Board: Mining (Altcoins)
by abctoz on 30/11/2016, 18:34:07 UTC
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except for Ethereum.


Probably, but it's hard to gauge by just how much. I think the number is minuscule, and it's nice not to be locked into 3-4GB in the future, as I'm not planning on selling 1xxx series cards in less than well over a year.

I could only find estimates on memory power consumption which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (about a third of the way down the page)

Based on other sources, older cards consumed about 4.35 W per gigabyte of GDDR5 (17.4 W for 4GB and 34.8 W for 8GB), but the same source also says it's closer to 20 W or slightly more for 8GB.
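For what it's worth, the per-gigabyte figure and the 4GB/8GB totals quoted above are consistent with each other. A throwaway sketch (the 4.35 W/GB constant is just the estimate quoted here, not a measured value):

```python
# Scale the quoted ~4.35 W/GB GDDR5 estimate to common frame-buffer sizes.
# The constant is an estimate from the post, not a measured figure.
PER_GB_WATTS = 4.35

def memory_power(gigabytes: float, per_gb: float = PER_GB_WATTS) -> float:
    """Estimated GDDR5 power draw in watts for a given capacity."""
    return gigabytes * per_gb

for gb in (3, 4, 8):
    print(f"{gb} GB -> {memory_power(gb):.1f} W")
```

At that rate a 3GB card would spend roughly 13 W on memory versus roughly 35 W for 8GB, which is where the "1060 3GB is more efficient" intuition comes from.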

There are also much higher numbers floating around, such as 50 W for an R9 290X.



Isn't it watts per die rather than per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.

It would be per die... The amount of power memory consumes is pretty small compared to the card as a whole. Some people also forget that Ethereum is not the only coin in existence; building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.

Nvidia is also not responsible for the kind of memory board makers use. That would be up to manufacturers such as MSI/Gigabyte/Asus.

Ethereum continues to provide the largest share of mining revenue by far; without it we could all throw our cards away. And 30 W per card would not be negligible.
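To put that 30 W in perspective, a quick back-of-the-envelope for a card mining 24/7 (the electricity rate is an assumed example value, not from the thread):

```python
# Rough running cost of an extra 30 W per card, mining around the clock.
# The electricity price is a hypothetical example rate, not from the post.
WATTS = 30
PRICE_PER_KWH = 0.10  # USD per kWh, assumed for illustration

kwh_per_day = WATTS * 24 / 1000                      # energy per day
cost_per_month = kwh_per_day * 30 * PRICE_PER_KWH    # ~30-day month
print(f"{kwh_per_day:.2f} kWh/day, ~${cost_per_month:.2f}/month per card")
```

Across a multi-card rig that adds up, which is why the per-card memory draw is worth arguing about at all.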

Only NVIDIA pulls this memory-switching shit every generation; your argument fails because AMD and its AIB partners don't do this.