The math is simple: for example, in a range of 4096 keys, a given h160 prefix is found on average once. If matches were far more frequent (say 2 or 3 in every 4096-key range), the hash function would be broken. That's why this is a good guide for probabilistic searches. Anyone who says otherwise is wrong.
No, what's wrong is that you misinterpret the math and make some false statements. You are contradicting the very basics of probability theory.
No, the hash function ain't broke if you find some prefix zero times, once, twice, or a hundred times (and all of these will happen if you keep trying, or else reality is broken, not the hash).
Ranges with many matches will eventually be counter-weighted by their opposites (ranges with zero matches), so the average still comes out to one match per range.
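You don't have to take anyone's word for it. Here's a minimal sketch that simulates the situation under the standard assumption that hash outputs behave like uniform random values, so each key in a 4096-key range matches a given prefix independently with probability 1/4096 (the numbers 4096, 1/4096, and the sample count are just illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

# Matches per 4096-key range ~ Binomial(4096, 1/4096), mean = 1.
# Simulate 100,000 independent ranges and tally how many matches each got.
counts = rng.binomial(n=4096, p=1 / 4096, size=100_000)

for k in range(5):
    frac = np.mean(counts == k)
    print(f"{k} matches per range: {frac:.3f}")
```

The output lands near the Poisson(1) probabilities: roughly 37% of ranges contain zero matches, 37% exactly one, 18% two, and a shrinking tail beyond. So zero-hit ranges and multi-hit ranges are both routine, and the mean of one match per range only emerges across many ranges.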
You should straight up go to Princeton and give some lectures about this before they call the police.