Sorry if this has been addressed before, but I tried searching and came up empty. I'm trying to find out what the two hashrates shown for a particular GPU/ASIC/FPGA represent. The total hashrates at the top of the UI I understand: the one on the left is a 5-second average and the one on the right is the average since the app was started. For individual GPUs, however, I'm confused.
One of my GPUs is currently showing 724.1K/728.7K. The value on the left is very stable and rarely changes, but the value on the right changes constantly (±1K). It makes perfect sense to me that the value on the left is the 5s average for that particular card. However, if the value on the right is the "total" average since the card started mining, why isn't it even more stable than the value on the left? An average over the whole run should smooth fluctuations out, not amplify them.
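To show what I mean, here's a quick Python sketch of how I'd expect the two numbers to behave. It's purely a toy model with made-up noise figures, not the miner's actual code: it compares a 5-sample rolling average against a cumulative average over the same simulated per-second samples.

```python
import random

# Toy simulation: per-second hashrate samples (in K) jittering around 726K
random.seed(0)
samples = [726.0 + random.gauss(0, 5) for _ in range(600)]  # ~10 minutes

window = []   # last 5 samples -> the "5s average" (left value)
total = 0.0   # running sum    -> the "average since start" (right value)

for i, s in enumerate(samples, start=1):
    window.append(s)
    if len(window) > 5:
        window.pop(0)          # keep only the most recent 5 seconds
    total += s
    rolling = sum(window) / len(window)
    cumulative = total / i
    if i % 120 == 0:
        print(f"t={i:3d}s  5s avg = {rolling:6.1f}K  since-start avg = {cumulative:6.1f}K")
```

In that toy model the since-start column settles down as time goes on while the 5s column keeps jittering, which is the opposite of what my GPU's two values are doing, hence my confusion.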
Or have I missed something?