I am saying it makes no financial sense. The major GPU coins already have ASICs, with ETH being the biggest. And the coins that don't have an ASIC yet will get one before the FPGA comes close to ever breaking even.
And every time an ASIC takes over an alternate coin, all the GPUs and FPGAs that were mining that coin move to a non-ASIC coin. That means difficulty on the remaining non-ASIC coins skyrockets, and guess what: your GPUs and FPGAs earn less from mining. Every time a new ASIC comes out, the FPGAs make less money. So there is no niche. There is only a limited amount to be earned from mining, you see; if ASICs keep appearing to take some of it, the switchers (FPGAs/GPUs) will have less and less. The dilution is sketched below.
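A minimal sketch of that dilution, assuming the usual model where a miner's revenue is proportional to their share of network hashrate (all figures below are made-up illustrations, not real coin data):

```python
# Toy model: revenue scales with your share of network hashrate.
# DAILY_VALUE and the hashrate figures are invented for illustration.

def daily_revenue(my_hashrate, network_hashrate, daily_coin_value):
    """Expected daily payout for my_hashrate on a chain that pays out
    daily_coin_value in total (same hashrate units for both args)."""
    return daily_coin_value * my_hashrate / network_hashrate

DAILY_VALUE = 100_000.0  # assumed total USD paid to miners per day
MY_RIG = 1.0             # your rig, in arbitrary hashrate units

# Before an ASIC displaces miners elsewhere: 10,000 units on this coin.
print(daily_revenue(MY_RIG, 10_000, DAILY_VALUE))  # 10.0 USD/day

# After an ASIC takes over another coin, its GPU/FPGA miners migrate
# here and network hashrate triples; your revenue drops to a third.
print(daily_revenue(MY_RIG, 30_000, DAILY_VALUE))  # ~3.33 USD/day
```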
Why don't you calculate how much it would cost to build an FPGA setup that can match 504 MH/s like the L3+ or 14 TH/s like the S9?
Run your numbers and see the gap; a toy calculator is sketched below. While I don't have the numbers, I don't think it can work.
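A sketch of that calculation. The per-FPGA hashrate and price here are placeholders (the 10 MH/s Scrypt figure is an invented, probably optimistic assumption, not a benchmark); plug in your own measurements:

```python
# Hypothetical cost-gap calculator: how many FPGAs it takes to match
# one ASIC's hashrate. Fill in your own per-FPGA figures.
import math

def fpga_cost_to_match(asic_hashrate, fpga_hashrate, fpga_unit_cost):
    """Units needed and total hardware cost to equal asic_hashrate
    (asic_hashrate and fpga_hashrate in the same units)."""
    units = math.ceil(asic_hashrate / fpga_hashrate)
    return units, units * fpga_unit_cost

# Antminer L3+: 504 MH/s Scrypt. Assume an FPGA does 10 MH/s Scrypt
# at $4,000 each (placeholder numbers).
units, cost = fpga_cost_to_match(504, 10, 4_000)
print(f"{units} FPGAs, ${cost:,} to match one L3+")  # 51 FPGAs, $204,000
```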
Anyways, good luck in your endeavors. Maybe I am wrong.
Just my 2 cents
While your point of view is true in general, there are a couple of exceptions, mainly space and power constraints. Let's be very specific to avoid a meaningless comparison: XCVU9P ($4,000 FPGA) vs. GTX 1080 Ti ($800 GPU).
Now let's take the Phi1612 algorithm and give the FPGA a more realistic 5x performance advantage over the GPU at a power draw of 150 watts (0.150 kW) per device. I would rather run a 100-FPGA farm (17.5 kW) than a 500-GPU farm (87.5 kW) any day, and here's why: lower overall costs, if you believe the 5x advantage ascribed to the FPGA.
4x FPGAs and components, approx. $16K + $1.2K, in a server chassis; add 100 watts for overhead.
4x GPUs and components, approx. $3.2K + $0.65K, in a frame; add 100 watts for overhead.
$430K for the 100-FPGA farm vs. $481.25K for the 500-GPU farm (the arithmetic is reproduced in the sketch below). Can you at least agree that this makes financial sense?
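A minimal sketch reproducing those farm figures from the prices and wattages quoted above, with 4 devices per chassis/frame as stated:

```python
# Reproduces the farm arithmetic above. Prices, wattages, and the
# 4-per-box grouping are the figures quoted in this post.

def farm(units, unit_cost, per_box, box_cost, unit_watts, box_watts):
    """Return (total cost in USD, total power in kW) for a farm built
    from `units` devices grouped `per_box` to a chassis or frame."""
    boxes = units // per_box
    cost = boxes * (per_box * unit_cost + box_cost)
    power_kw = (units * unit_watts + boxes * box_watts) / 1000
    return cost, power_kw

# 100 XCVU9P FPGAs: $4,000 each, 4 per $1,200 chassis, 150 W each
# plus 100 W overhead per chassis.
print(farm(100, 4_000, 4, 1_200, 150, 100))  # (430000, 17.5)

# 500 GTX 1080 Ti GPUs: $800 each, 4 per $650 frame, same wattages.
print(farm(500, 800, 4, 650, 150, 100))      # (481250, 87.5)
```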
Please set the power usage aside here; let's consider only the hash power and the cost between the XCVU9P ($4,000 FPGA) and the GTX 1080 Ti ($800 GPU).
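Worth noting: on hash power and cost alone, the quoted numbers are a wash, since the assumed 5x hashrate comes at exactly 5x the price. A quick check (the 5x figure is the earlier poster's assumption, not a benchmark):

```python
# Cost per unit of hashrate, using the prices above and the assumed
# 5x FPGA advantage (GPU hashrate normalized to 1 unit).
GPU_COST, GPU_HASH = 800, 1.0      # GTX 1080 Ti
FPGA_COST, FPGA_HASH = 4_000, 5.0  # XCVU9P at the assumed 5x advantage

print(GPU_COST / GPU_HASH)    # 800.0 USD per hash unit
print(FPGA_COST / FPGA_HASH)  # 800.0 USD per hash unit -- identical
```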