No, because the assumptions I made become less accurate the more rounds are done (they may not even be accurate enough after a single round). The image of SHA-256^N shrinks as N grows until it converges to a fixed set (which is probably still very large), and on that set SHA-256 is a permutation (a one-to-one mapping). (This is true for every function from a finite set to itself.)
I thought it would be interesting to see what the entropy reduction is for multiple rounds. I assumed each round has its own independent random oracle which maps k * N input elements to N potential output elements, where 0 <= k <= 1 and N = 2^256. For each round, a given output has no preimage with probability (1 - 1/N)^(k * N) ≈ exp(-k), so on average exp(-k) * N output elements have no preimage. Therefore, each round maps k * N input elements to (1 - exp(-k)) * N distinct outputs.
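The exp(-k) figure can be sanity-checked numerically. This is a small Monte Carlo sketch (the value N = 200,000 is an arbitrary choice, since 2^256 is out of reach): throw k * N uniformly random "hash outputs" into N slots and count the fraction of slots that are never hit.

```python
import math
import random

def empty_fraction(k: float, N: int, seed: int = 0) -> float:
    """Throw k*N uniform random outputs into N slots and return
    the fraction of slots that receive no preimage."""
    rng = random.Random(seed)
    hits = {rng.randrange(N) for _ in range(int(k * N))}
    return (N - len(hits)) / N

# The model predicts exp(-k) empty outputs; compare for k = 0.5.
sim = empty_fraction(0.5, 200_000)
theory = math.exp(-0.5)
print(sim, theory)  # both should be close to 0.6065
```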
Iterating this, the entropy reduction (ignoring the non-uniform output distribution for now) is:
| Round | Cumulative entropy reduction (bits) |
|------:|------------------------------------:|
| 1     | 0.6617 |
| 2     | 1.0938 |
| 4     | 1.6800 |
| 8     | 2.4032 |
| 16    | 3.2306 |
| 32    | 4.1285 |
| 64    | 5.0704 |
| 128   | 6.0381 |
| 256   | 7.0204 |
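The table above can be reproduced by iterating k ← 1 - exp(-k) starting from k = 1 (the full input space) and reporting -log2(k) as the cumulative entropy reduction. A minimal sketch:

```python
import math

def cumulative_entropy_reduction(rounds: int) -> float:
    """Iterate k <- 1 - exp(-k) starting from k = 1 and return
    -log2(k): the cumulative entropy reduction in bits after the
    given number of rounds, under the independent-oracle model."""
    k = 1.0
    for _ in range(rounds):
        k = 1.0 - math.exp(-k)
    return -math.log2(k)

for r in (1, 2, 4, 8, 16, 32, 64, 128, 256):
    print(r, round(cumulative_entropy_reduction(r), 4))
```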
I don't observe any convergence, and indeed the fixed-point equation k = 1 - exp(-k) has only one solution, k = 0, so under this model the entropy reduction grows without bound. But this is probably an artifact of assuming that each round has its own independent random oracle. The results may be different for a fixed function like SHA-256, which, as noted above, eventually acts as a permutation on its core set.