Hi, hoping for some help with one of the 3090s I use for mining. Using v5.5c.
The problem I'm having is that as soon as the card hits around 60C, performance drops drastically. For some reason it stops pulling its usual ~350 W and only draws ~270 W once the temp passes roughly 60C. My second 3090 pulls much higher, has no monitors plugged into it, and draws around 400 W; that card will happily climb to 66-67C and stay at its 400 W draw. I'm really scratching my head with the first 3090. It hurts watching its hashrate go from 120 MH/s to 90 MH/s just because the temp climbs from 56C to 60C lol. Meanwhile the second card doesn't seem to care about temperature and chugs away at 122 MH/s on 400 W.
If you need any extra information such as logs, OC settings, etc., please let me know and I can provide it.
Let me guess: the card in question is EVGA or Gigabyte and the second one is another brand? Most likely thermal throttling is the cause; 3080/3090 cards are known for it, and those two brands are the worst (in that respect only, LOL). Note: it's not the core temp you see in Afterburner, it's the memory temp, and you need a utility called HWiNFO64 to see it. Download it, look at your memory junction temperature sensor, and you'll most likely see 110-112 degrees Celsius. The common fix is to replace the thermal pads with good-quality ones rated 12-17 W/mK (I use these: https://cdn.alzashop.com/ImgW.ashx?fd=f4&cd=CB131w1&i=1.jpg) and do everything possible to improve ventilation.
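If you want to confirm it before repadding, here's a rough sketch that polls power draw, core temp, and the driver's throttle-reason flags via NVML (assumes Python with the nvidia-ml-py package installed, and that the problem card is device index 0 -- change that to match your rig). If "HW thermal slowdown" shows up while the core temp still looks fine, something other than the core is cooking. Caveat: NVML generally doesn't report the memory junction sensor on GeForce cards, so you still need HWiNFO64 for the actual junction reading.

```python
# Sketch: log power, core temp, and throttle-reason flags every 2 s.
# Assumption: the throttling 3090 is NVML device 0.
import time
import pynvml

DEVICE_INDEX = 0  # change to match the problem card

# Human-readable labels for the throttle-reason bitmask
REASONS = {
    pynvml.nvmlClocksThrottleReasonSwPowerCap: "SW power cap",
    pynvml.nvmlClocksThrottleReasonHwSlowdown: "HW slowdown",
    pynvml.nvmlClocksThrottleReasonSwThermalSlowdown: "SW thermal slowdown",
    pynvml.nvmlClocksThrottleReasonHwThermalSlowdown: "HW thermal slowdown",
}

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(DEVICE_INDEX)
try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        core_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)
        active = [label for bit, label in REASONS.items() if mask & bit] or ["none"]
        print(f"{power_w:6.1f} W  {core_c}C  throttle: {', '.join(active)}")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()
```

Run it while mining and watch what flips on right as the power draw falls off around 60C core.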
Both cards are EVGA FTW3 Ultra 3090s. Not using Afterburner, but EVGA's Precision X1 instead. 60C is way too low for thermal throttling when I have the temp target set at 91C lol.