Dug through the source, recompiled, made several cgminer modifications, and tried a bunch of tweaks.
Lessons learned, tested on a 13.5TH autofreq miner from Dec 2017, overclocked to ~15.5TH with AsicBoost enabled on Braiins OS while playing with the code:
1) The temperature displayed (at least for my model of S9) is a complete lie. The miner (this code comes from Bitmain) tries to read both the PCB and ASIC temps. The ASIC readings are deemed unreliable by the algorithm and/or can't be read on my model, so the code basically takes the actual PCB temp reading (generally 50-65C for me) and adds an arbitrary 30-35C to fake an estimate of the ASIC temp (see the temperature sketch after this list).
The Braiins web UI displays the faked temp including the offset (~90C); the stock Bitmain UI (autofreq, pre-AsicBoost) displays the original PCB reading (~60C).
2) Auto fan control works fine; it's just biased a little slower than Bitmain's -- mine hovered around 70% PWM on Braiins. I modified cgminer to push the fan curve higher to ensure good cooling while overclocking, and I'm now generally at 80-90% PWM (see the fan-curve sketch below). Going from 70% to 100% PWM costs at least an extra 20-40W just to run the fans.
3) For me, power usage on pre-AsicBoost Bitmain firmware at 13.5TH is roughly equivalent to Braiins with AsicBoost at 15TH (9V and 712 MHz set on each chain).
4) The 'recommended voltage' goes down as frequency goes up, based on Bitmain's internally coded tables, which are meant to keep the device within a given power usage envelope (see the voltage-table sketch below). Until you hit the point of frying the thing, more voltage is always going to help the chain run faster -- it just sucks more power (and possibly more power than your PSU or cooling can handle).
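
Temperature sketch for point 1: this is a minimal sketch of the behaviour as I read it, not code lifted from the Bitmain/Braiins source -- the function and constant names are mine.

    /* The ASIC sensor readings get discarded as unreliable (or can't be
     * read at all on my model), and the displayed "chip" temp is just the
     * PCB reading plus a fixed offset. */
    #include <stdio.h>

    #define FAKE_ASIC_TEMP_OFFSET 35   /* the arbitrary 30-35C bump */

    static int displayed_chip_temp(int pcb_temp_c)
    {
        return pcb_temp_c + FAKE_ASIC_TEMP_OFFSET;
    }

    int main(void)
    {
        int pcb = 60;   /* typical PCB reading on my unit */
        printf("PCB %dC -> displayed chip temp %dC\n",
               pcb, displayed_chip_temp(pcb));
        return 0;
    }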
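Fan-curve sketch for point 2: roughly the shape of the change I made, with a hypothetical piecewise-linear curve and a raised floor. The thresholds here are illustrative, not the stock values or my exact patch.

    #include <stdio.h>

    /* Map PCB temperature to a PWM percentage, with a raised floor so the
     * fans never idle too low while overclocked. */
    static int fan_pwm_from_pcb_temp(int pcb_temp_c)
    {
        if (pcb_temp_c <= 50)
            return 80;                              /* raised floor */
        if (pcb_temp_c >= 65)
            return 100;                             /* flat out */
        /* linear ramp from 80% at 50C to 100% at 65C */
        return 80 + (pcb_temp_c - 50) * 20 / (65 - 50);
    }

    int main(void)
    {
        for (int t = 45; t <= 70; t += 5)
            printf("PCB %dC -> %d%% PWM\n", t, fan_pwm_from_pcb_temp(t));
        return 0;
    }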
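Voltage-table sketch for point 4: the numbers below are made up purely to show the shape of the relationship (recommended voltage falling as frequency rises to hold a power envelope); they are not Bitmain's actual table values.

    #include <stdio.h>

    struct freq_volt { int freq_mhz; double volts; };

    /* Illustrative entries only -- higher frequency maps to a lower
     * "recommended" voltage. */
    static const struct freq_volt volt_table[] = {
        { 550, 9.4 },
        { 600, 9.2 },
        { 650, 9.0 },
        { 700, 8.8 },
        { 750, 8.6 },
    };

    static double recommended_voltage(int freq_mhz)
    {
        size_t n = sizeof(volt_table) / sizeof(volt_table[0]);
        double v = volt_table[0].volts;
        /* take the entry for the highest table frequency <= requested */
        for (size_t i = 0; i < n; i++)
            if (volt_table[i].freq_mhz <= freq_mhz)
                v = volt_table[i].volts;
        return v;
    }

    int main(void)
    {
        printf("712 MHz -> %.1fV 'recommended' (illustrative numbers only)\n",
               recommended_voltage(712));
        return 0;
    }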
I'm also working on an auto-frequency-adjuster that can lower the frequency per ASIC chip on each board (like that one with the cool red/green UI, but automated).
It works well for setting frequency on a per-ASIC basis, but the heuristic I was using to decide when to turn a chip's frequency down appears invalid, at least for some of the chips -- I've been flagging a chip when its reported per-ASIC hashrate isn't within 15% of the ideal per-ASIC hashrate (sketched below). Anyone know what that red/green UI uses to decide when to display red?
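
For reference, the heuristic boils down to something like this. The per-chip constant is my own rough estimate for a BM1387 (about 114 hashes per clock), not a value pulled from the cgminer source.

    #include <math.h>
    #include <stdbool.h>
    #include <stdio.h>

    #define GHS_PER_MHZ_PER_CHIP 0.114  /* rough BM1387 estimate, my assumption */
    #define TOLERANCE            0.15   /* the 15% band around ideal */

    /* "reported per-asic hashrate not within 15% of ideal per-asic hashrate" */
    static bool chip_out_of_band(double reported_ghs, int chain_freq_mhz)
    {
        double ideal_ghs = chain_freq_mhz * GHS_PER_MHZ_PER_CHIP;
        return fabs(ideal_ghs - reported_ghs) / ideal_ghs > TOLERANCE;
    }

    int main(void)
    {
        /* at 712 MHz the ideal works out to ~81 GH/s per chip; a chip
         * reporting 65 GH/s is ~20% low and would get stepped down */
        printf("ideal at 712 MHz: %.1f GH/s per chip\n",
               712 * GHS_PER_MHZ_PER_CHIP);
        printf("chip at 65 GH/s out of band? %s\n",
               chip_out_of_band(65.0, 712) ? "yes" : "no");
        return 0;
    }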