I think it's just that his math is bad. I'm doing about 8.5 G/s in Gminer; I think it's just reporting hashes per minute, not per second. If you divide 8.5 by 60 you get 0.141, and my accepted speed is hovering around 0.145 H/s on the pool.
The miner displays graphs per second, while the pool displays hashes per second. To produce one hash the miner has to find a solution, and to find a solution it needs to compute about 42 graphs on average. Thus H/s ≈ G/s / 42.
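If it helps, here is a minimal sketch of that conversion in Python, assuming the ~42 graphs-per-solution average quoted above; the constant name and helper function are just made up for illustration, and the sample rates are the numbers posted earlier in this thread:

# Rough conversion from the graph rate Gminer reports to the hash rate a pool reports,
# assuming (as explained above) ~42 graphs are searched per solution on average.
GRAPHS_PER_SOLUTION = 42  # average value, so the pool-side H/s will fluctuate around this

def graphs_to_hashes(graphs_per_second: float) -> float:
    """Estimate pool-visible H/s from the miner-reported G/s."""
    return graphs_per_second / GRAPHS_PER_SOLUTION

# Example miner-reported rates taken from this thread.
for gps in (6.5, 8.5):
    print(f"{gps} G/s  ->  ~{graphs_to_hashes(gps):.3f} H/s expected at the pool")

Since 42 is only an average, short-term pool numbers can land above or below the estimate.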
As a miner, should I care about graphs per second? Obviously I care a lot about the pool-accepted H/s, so I care about the miner-reported H/s. But do I need to care about G/s?
Algorithm:          Cortex
Stratum server:
  host:             mining.ctxcpool.com:8008
  user:             0x89c27a*****************
  password:         x
Power calculator:   on
Color output:       on
Watchdog:           on
API:                off
Log to file:        miner.log
Selected devices:   GPU0
Intensity:          100
Temperature limits: 90C
------------------------------------------------------------------
19:22:06 Failed to initialize miner on GPU0: MSI GeForce GTX 1070 Ti 8GB: out of memory
19:22:06 No devices for mining
GTX 1080 8GB works fine!
Win 7 or Win 10? I have no luck with 8GB cards on Win 10, but they all work on Win 7.
Does anyone know why my Cortex hashrate differs so much between the miner and the pool? I've tried out a few different pools.
Gminer reports 6.5G/s but pool reports 0.125H/s. Does G stand for something I've never noticed before?
Same here, Cortex in Gminer is broken.
Maybe the dev misplaced it: 5% to the miner and 95% to the dev fee, LOL.