That entire heatsink on each card is saturated at 50C! Holy hell on a crutch!
Are those IR pics accurate? That seems excessive... if it actually is 40 - 50C, what's going to happen to them in hot weather?
Maybe Jeff can heat up a room to 85F or so and let it run to see what happens.
Is 50C really a problem? I had the impression that even 80C isn't a problem. My GPU is at 62C right now while not doing anything. (The FOSS Linux Radeon driver has no power management.)
Are there components on the board that might not be able to withstand >50C temperatures?
Remember that the hotter the heatsinks get, the more heat they dissipate: partly by radiation, but mostly by warming up the exhaust air more. If Jeff's room is at 20C now and he heated it up to 35C, I don't think we would see an increase to more than 60C with the fans running at the same speed.
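As a sanity check, here's the naive constant-thermal-resistance model as a sketch. The power draw is an assumed number, not a measurement; note the linear model predicts the full 15C ambient rise carries straight through to the heatsink, so the argument above is really that radiation and warmer exhaust make the real rise somewhat less than this:

```python
# Simple first-order thermal model: at constant power and airflow,
# heatsink temperature tracks ambient plus a fixed rise.
# T_heatsink = T_ambient + P * R_th

def heatsink_temp(t_ambient_c, power_w, r_th_c_per_w):
    """Estimated steady-state heatsink temperature in Celsius."""
    return t_ambient_c + power_w * r_th_c_per_w

# Back out the implied thermal resistance from the observed case:
# 50C heatsink at 20C ambient, assuming ~30 W per card (a guess).
power_w = 30.0
r_th = (50.0 - 20.0) / power_w  # 1.0 C/W implied

print(heatsink_temp(35.0, power_w, r_th))  # -> 65.0
```

So the worst-case linear estimate for a 35C room is 65C, and anything that dissipates heat better as the delta grows would land below that.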
I know some computer components can run extremely hot. What I'd be more concerned about with high temperatures over long periods is the lifespan of the device. Just how long can these keep running at those temps? Three months? Six months? Three years? Hopefully we won't find out anytime soon.

Why would these temperatures break down the device, though, as you seem to suggest is possible? Which components slowly degrade at 80C? The PCB itself? Surely the silicon itself can withstand temperatures much higher than that. But of course there are capacitors, voltage converters, and all sorts of components whose function I don't know, and it's possible they somehow break down at higher temperatures.
I'm genuinely interested in an answer, because we hear all the time that, for example, running one's CPU/GPU at high temperatures shortens its lifespan, but I haven't actually seen any data to support this. Obviously it gets too hot at some point, where some of the components can't handle it, but whether that point is 60C, 80C, 120C, or 150C I don't know.
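One place there actually is a well-documented temperature/lifespan relationship is aluminum electrolytic capacitors: datasheets commonly apply a rule of thumb that rated life roughly doubles for every 10C below the rated temperature (an Arrhenius-style approximation). The numbers below are illustrative datasheet-style values, not measurements from these cards:

```python
# Rule-of-thumb derating for aluminum electrolytic capacitors:
# expected life roughly doubles for every 10C below the rated temp.
# L = L0 * 2 ** ((T_rated - T_actual) / 10)

def cap_life_hours(l0_hours, t_rated_c, t_actual_c):
    """Estimated capacitor life at t_actual_c, given a datasheet
    rating of l0_hours at t_rated_c."""
    return l0_hours * 2 ** ((t_rated_c - t_actual_c) / 10)

# A hypothetical part rated 2000 hours at 105C:
print(cap_life_hours(2000, 105, 85))  # -> 8000.0 hours
print(cap_life_hours(2000, 105, 65))  # -> 32000.0 hours
```

By this model the difference between running caps at 85C and 65C is roughly a 4x difference in expected life, which is the kind of data point the "heat shortens lifespan" claim usually rests on, even if the silicon itself is fine.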