My question is not about profitability. If we take your example (10 satoshi, 10 million times), we will see winning and losing streaks of 10 satoshi at each step. Obviously, with 1.01x you will see more winning streaks than losing ones, and vice versa (that doesn't really matter). So, will the variance in the length of, say, losing streaks remain the same for both of these options? To put it differently, we will inevitably see outliers (like 15 losses in a row at 1.01x), but will their regularity, i.e. how often they appear (if we measure them in relative terms that make them comparable across different multipliers), be the same?
You are right about losing streaks. There is no doubt that the probability of losing 15 times consecutively is much lower when you wager on 1.01x, and that's why I said:
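You can check that claim with a couple of lines of arithmetic. The sketch below assumes a 1% house edge, so the win probability at multiplier m is 0.99 / m (the exact edge varies by site, so treat the numbers as illustrative, not exact):

```python
# Probability of k consecutive losses at a given payout multiplier.
# Assumption: win probability = 0.99 / multiplier (a 1% house edge);
# real dice sites may use a slightly different edge.

def loss_streak_prob(multiplier, streak_len, house_edge=0.01):
    win_prob = (1.0 - house_edge) / multiplier
    loss_prob = 1.0 - win_prob
    # Independent rolls: a streak of k losses has probability loss_prob^k.
    return loss_prob ** streak_len

p_low  = loss_streak_prob(1.01, 15)   # betting on 1.01x
p_high = loss_streak_prob(2.00, 15)   # betting on 2x

print(f"P(15 losses in a row) at 1.01x: {p_low:.3e}")
print(f"P(15 losses in a row) at 2x:    {p_high:.3e}")
```

At 1.01x the per-roll loss probability is only about 2%, so a 15-loss streak is astronomically rare, while at 2x (loss probability about 50.5%) it is merely uncommon, on the order of a few in 100,000.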
Usually, the winners of these contests are gamblers who wager on small odds with high win probabilities. Since the total wagering amount determines the winners, this way they can play for a long time and increase their wagering volume. They try to play with low risk and lose as little as possible.
Mathematically, your chance of winning is the same in both cases. But betting on big odds can cause you to go bust earlier. As you can see in the posts I quoted, I believe that betting on small odds is better for increasing your wagering volume. But that doesn't mean it increases your chances in the long term.
So, in the short term, betting on small odds is better, because it lets you play for a longer time (especially when you don't have a big bankroll).
In the long term, there is no difference. (Gamblers who play for the long term usually have big bankrolls and can afford to lose even more than 20 times in a row.)
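The "small odds keep you in the game longer" point can be illustrated with a toy simulation. In the sketch below I assume a 1% house edge (win probability 0.99 / multiplier), a fixed 10-satoshi stake, a 1000-satoshi bankroll, and compare 1.01x with 10x as an arbitrary "big odds" example. Both options lose the same 0.1 satoshi per bet on average; only the variance differs:

```python
import random
import statistics

def bets_until_bust(multiplier, bankroll=1000, stake=10,
                    max_bets=100_000, rng=random):
    """Count bets placed before the bankroll drops below the stake.

    Assumes win probability 0.99 / multiplier (1% house edge).
    Capped at max_bets so lucky runs terminate.
    """
    win_prob = 0.99 / multiplier
    profit = stake * (multiplier - 1)
    bets = 0
    while bankroll >= stake and bets < max_bets:
        bets += 1
        if rng.random() < win_prob:
            bankroll += profit
        else:
            bankroll -= stake
    return bets

rng = random.Random(42)
trials = 300
low  = [bets_until_bust(1.01, rng=rng) for _ in range(trials)]
high = [bets_until_bust(10.0, rng=rng) for _ in range(trials)]

print("median bets survived at 1.01x:", statistics.median(low))
print("median bets survived at 10x:  ", statistics.median(high))
```

With these (assumed) parameters the 1.01x player typically grinds through far more bets before busting, which is exactly why low-odds players rack up more wagering volume, even though the expected loss per bet is identical.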