And here he's correct. If the VRM is designed to withstand 100 °C, cooling it from 80 °C to 40 °C makes no difference.
Well, it depends. No device is fine at 99.99999 °C and then explodes at 100 °C. The hotter a component gets, the less efficient it becomes; that means more current is drawn to deliver the same power, and more current means more heat, which means higher temps, which means lower efficiency... and the cycle goes on.
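The feedback loop described above can be sketched with a toy model. All the numbers here (efficiency curve, thermal resistance) are made up for illustration, not real VRM data; the point is just that the temp → efficiency → waste heat → temp cycle settles at a higher temperature and lower efficiency when the starting point is hotter.

```python
def settle_temperature(ambient_c=40.0, out_power_w=100.0,
                       base_eff=0.90, eff_loss_per_c=0.001,
                       c_per_watt=0.5, steps=200):
    """Iterate temp -> efficiency -> waste heat -> temp until it settles.

    All parameters are assumed illustrative values, not measured data:
    - base_eff: efficiency at 25 C
    - eff_loss_per_c: efficiency lost per degree above 25 C
    - c_per_watt: thermal resistance (degrees of rise per watt of waste heat)
    """
    temp = ambient_c
    for _ in range(steps):
        # Efficiency falls linearly with temperature (clamped so the
        # toy model can't divide by absurd values).
        eff = max(0.5, base_eff - eff_loss_per_c * (temp - 25.0))
        # Waste heat needed to push out_power_w through at this efficiency.
        waste_w = out_power_w * (1.0 / eff - 1.0)
        # Steady-state temperature given ambient plus self-heating.
        temp = ambient_c + c_per_watt * waste_w
    return temp, eff

hot, eff_hot = settle_temperature(ambient_c=80.0)
cool, eff_cool = settle_temperature(ambient_c=40.0)
print(f"80 C ambient -> settles near {hot:.1f} C at {eff_hot:.1%} efficiency")
print(f"40 C ambient -> settles near {cool:.1f} C at {eff_cool:.1%} efficiency")
```

With these made-up numbers the loop converges rather than running away, but the hotter case still settles meaningfully above its ambient and at visibly worse efficiency, which is the "cycle goes on" effect in miniature.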
All the problems seem to be limited to the rushed "let's cut the number of VRMs in half" mid-production modification. Halving the phase count doubles the current through each remaining VRM, and conduction losses scale with the square of current, so each one dissipates roughly four times the heat. If the 4-VRM version (doing the job the 8-VRM design was meant to do) is running at high load, then even a modest amount of cooling could improve performance. Not saying that IS the case, just pointing out it could be.
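The arithmetic behind that claim is simple enough to show. The total current and on-resistance below are assumed illustrative values; the 4x ratio is what matters, and it follows directly from P = I²R.

```python
def per_phase_loss_w(total_current_a, phases, r_on_ohm=0.005):
    """Conduction loss per phase, assuming current splits evenly."""
    i_phase = total_current_a / phases   # current through each phase
    return i_phase ** 2 * r_on_ohm       # P = I^2 * R

# Assumed numbers: 80 A total load, 5 mOhm effective resistance per phase.
loss_8 = per_phase_loss_w(total_current_a=80.0, phases=8)   # 0.50 W each
loss_4 = per_phase_loss_w(total_current_a=80.0, phases=4)   # 2.00 W each
print(f"8 phases: {loss_8:.2f} W per phase")
print(f"4 phases: {loss_4:.2f} W per phase")
print(f"ratio:    {loss_4 / loss_8:.0f}x")                  # 4x
```

Total output current is unchanged, yet each surviving VRM runs 4x hotter in conduction loss, which is why cooling could plausibly matter on the cut-down board and not on the original.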