So pull the unit and debug it on a bench. This has been the traditional method of working on large-scale cluster deployments for decades; why do it any differently for mining? Commie's example is the ideal way of running a large group of mining rigs.
Because pulling a unit with two PSUs and (insert number of GPUs here) is a job for two people hehe.
Besides, pulling it out and putting it back in is a waste of time. Look at what I said above: I fixed my hanging rig faster than you could pull that barbell from a rack LOL.
Wrong. Hint: rails. It has rails.

Then it's just a matter of a few screws and the top cover.
So you pull it out like a drawer. Then screws? Screw that. It may be lighter with rails, but it is still a waste of time.
How about a rack with rails? You pull the whole rack out to troubleshoot and push it back (a column of racks) into the air pathway (vent).
A rack with rails is better than a drawer with rails hehe
I'm a bit lost now. A rack is shown in the first picture; what's the point of pulling it (yes, you can, it has wheels)? Or do you mean something else?
This is a classic server room setup: each rig equals a server. Sure, if you only have 4-5 rigs you don't need a rack, just use separate cases or hang them on the wall if you wish LOL. But when you run dozens of rigs, it's a completely different picture. Container setups for 500+ cards / 250+ ASICs with hot/cold zones, monitoring systems, security etc. are on the market for a reason.
It is a closed rack, that's why I called them drawers. If you want to mine in drawers in the tropics, then go ahead hehe.
What I mean by racks is open-air racks... just Google and YouTube big GPU farms, they don't mine in drawers with fans... aaaand they are not in the tropics.

Sure, the card cover panels can be removed without hurting efficiency, but that might cause additional problems with zoning (indoors) or dust (outdoors). In any case, if I decide to go ahead and get one for myself, I will test various configs and decide what's best for me.
Anyway, I'm reading a real use-case report right now. The guys switched from a classic rig setup to these cases about a year ago, managing to install 275 eight-card rigs (2,200 cards) on 70 sq. m, a 40% more efficient use of space. Just getting rid of the GPU and extra rig fans alone freed up 33.66 kW of power, saving almost $15,000 annually. Cooling air volume for the room was reduced more than 3x, allowing the use of less powerful equipment; the hot air extraction system consumes only 8 kW during the hot summer months, with intake air at 30°C and card temperatures at 55°C.
Pretty impressive I'd say!
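
For anyone who wants to sanity-check that savings figure, here's a minimal back-of-the-envelope sketch in Python. The assumptions are mine, not from the report: 24/7 uptime and a flat electricity rate; the ~$0.05/kWh value is back-calculated from the $15,000 claim, not quoted anywhere in the source.

# Sanity check of the fan-removal savings from the report above.
saved_kw = 33.66                # continuous power freed by ditching GPU/rig fans
hours_per_year = 24 * 365       # assumption: rigs run around the clock
rate_usd_per_kwh = 0.05         # assumption: flat rate implied by the $15k claim

energy_kwh = saved_kw * hours_per_year       # 33.66 * 8760 = 294,861.6 kWh/year
savings_usd = energy_kwh * rate_usd_per_kwh  # ~$14,743/year, i.e. "almost $15000"

print(f"{energy_kwh:,.0f} kWh/year saved -> ${savings_usd:,.0f}/year")

So the numbers are at least internally consistent, provided their electricity costs around five cents per kWh.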