Edit: And the irony of this is that only ASICs will survive.
No. We'll see adaptive algorithms introduced that favor FPGAs. While FPGA programming is a specialized skill and it takes time to code up a new algorithm, the cost and time to market of an FPGA solution will always be far lower than designing and deploying a new ASIC, and its energy efficiency (operational cost) will always beat GPUs.
I see no reason why FPGAs can't be game changers.
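For context on why adaptive algorithms favor reprogrammable hardware: x16r (used by RVN, discussed below) re-derives its chain of 16 hash functions from the previous block hash on every block. A minimal Python sketch of that selection step, assuming the nibble-mapping described in the x16r whitepaper (real miners deal with header byte order and call native hash implementations):

```python
# Sketch of x16r's per-block algorithm selection. The 16-algorithm list
# is from the x16r spec; the exact byte ordering in production miners
# differs (endianness), so treat this as illustrative only.
ALGOS = ["blake", "bmw", "groestl", "jh", "keccak", "skein",
         "luffa", "cubehash", "shavite", "simd", "echo",
         "hamsi", "fugue", "shabal", "whirlpool", "sha512"]

def x16r_order(prev_block_hash_hex):
    """Each of the last 16 hex nibbles of the previous block hash
    selects one of the 16 algorithms, so the chain reshuffles every
    block (and algorithms can repeat)."""
    nibbles = prev_block_hash_hex[-16:]
    return [ALGOS[int(n, 16)] for n in nibbles]

# Example: a different previous-block hash yields a different chain.
print(x16r_order("00" * 24 + "1234567890abcdef"))
```

A fixed ASIC pipeline has to commit to hardware up front; an FPGA can just load a new bitstream when the algorithm shifts, which is the adaptive-algorithm advantage being argued here.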
I think you are not factoring in the cost of FPGAs.
I looked at the x16r / RVN numbers. I estimate about $25/day per FPGA with 10,000 on the network. That's 5+ months to recover the cost. And 10,000 is conservative, considering x16r currently has probably 4 times the GPUs on it compared to Phi.
Make it 20,000, and it's down to $12/card.
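The payback math here is just a fixed revenue pie split across more cards. A quick sketch using the post's own numbers ($25/day at 10,000 cards); the board price is an assumption, back-derived from the "5+ months to recover cost" figure rather than stated anywhere above:

```python
# Back-of-envelope x16r/RVN payback math. Only the $25/day-at-10,000-cards
# figure comes from the post; the board price is assumed (back-derived
# from the quoted "5+ months to recover cost").
NETWORK_DAILY_USD = 25.0 * 10_000   # total daily network revenue implied above
BOARD_COST_USD = 3_750.0            # assumed FPGA board price (VCU1525-class)

for cards in (10_000, 20_000, 40_000):
    daily = NETWORK_DAILY_USD / cards        # same pie, more mouths
    months = BOARD_COST_USD / daily / 30
    print(f"{cards:>6} cards: ${daily:5.2f}/day/card, payback ~{months:.1f} months")
```

At 10,000 cards this reproduces the $25/day and ~5-month payback; at 20,000 it gives the ~$12/card figure, and payback stretches to 10 months before you even price in electricity.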
Anyway, it's in motion; this is not going to end well for GPUs or FPGAs.