Interesting. I try to present a similar concept when I get asked about operating expenses (hardware, power bill, etc.) versus BTC mined. For example: if mining one BTC ends up yielding a negative bottom line of -$100 (technically a loss) after all the operating expenses are applied, it can be argued that it is akin to buying one BTC at $100. Therefore, based on the current BTC-USD average exchange rate of $138, it would then be an excellent buy. It's all relative.
In that case, you have counted the coin's value twice.
Unless, of course, "mining one BTC ends up yielding a negative bottom line of -$100" means you have only counted the costs and haven't subtracted the coin's value yet.
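To make the two readings concrete, here's a minimal sketch with made-up numbers (the $238 expense figure is purely illustrative; only the $138 rate comes from this thread):

```python
# Illustrative numbers only: assume mining one BTC cost $238 all-in,
# and the coin currently trades at $138.
expenses = 238.0
btc_price = 138.0

# Reading 1: the "-$100 bottom line" already includes the coin's value as revenue.
net_including_coin = btc_price - expenses   # -100
# Selling the coin is the only revenue there is, so the true outcome stays -$100;
# saying "I bought at $100 and can sell at $138" counts the same coin twice.

# Reading 2: the "-$100" is pure cost, and the coin hasn't been sold yet.
expenses_only = 100.0
net_after_selling = btc_price - expenses_only   # +38, genuinely ahead
```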
But with a GPU, you definitely spend way more than $138 on energy per coin you mine.
The "I convert currencies" explanation is perfectly fine; it's just interesting that some people effectively pay $300 for a BTC via a GPU that is loud and hot in their room all the time, instead of buying from some guy at $138.

Not paying for energy could be one good reason, but there are stories of miners finding out that these kinds of deals can easily get cancelled once they start consuming far more power than before, all the time. The only explanation where I don't feel someone is really miscalculating is "it's my hobby, and I have no problem with my hobby costing money".
My example is just that, an example. I didn't specifically say that GPUs would yield a -$100 loss per BTC. Just in case you didn't get the gist of it, it's simply a different way of looking at it (hence I stated, "It's all relative"). Why would it be counting the value of the coin twice? Please explain in detail. What part of "after all the operating expenses are applied" makes it "counting the coin's value twice"? I reiterate: if I ended up with a -$100 (loss) after all the expenses are subtracted, it stands to reason that I have one BTC that cost me $100 (a debit) and is actually worth $138 or so on the market. Doesn't that mean I'm still ahead $38 if I exchange that one BTC for $138?
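Put another way, here's the framing I'm using as a quick sketch (the $100 per-coin expense and the $138 rate are just this example's figures, and the helper function is purely hypothetical):

```python
def mining_vs_buying(total_expenses_usd, coins_mined, market_price_usd):
    """Compare the all-in cost of a mined coin with simply buying it.

    This framing only holds if total_expenses_usd does NOT already
    net out the value of the coins themselves.
    """
    cost_per_coin = total_expenses_usd / coins_mined
    return {
        "effective_cost_per_coin": cost_per_coin,
        "market_price": market_price_usd,
        "ahead_per_coin_if_sold": market_price_usd - cost_per_coin,
    }

# The example's numbers: $100 of expenses for one mined coin, BTC at $138.
print(mining_vs_buying(100.0, 1, 138.0))
# -> effective cost $100, market price $138, ahead $38 if sold
```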