My question is not about profitability. If we take your example (10 satoshi wagered 10 million times), we will see winning and losing streaks, with 10 satoshi at stake at each step. Obviously, with 1.01x you will see more winning streaks than losing ones, and with a high multiplier the other way around (it doesn't really matter). So, will the variance in the length of, say, losing streaks remain the same for both of these options? To put it differently: we will inevitably see outliers (like 15 losses in a row at 1.01x), but will their regularity, i.e. how often they appear (measured in some relative terms that make them comparable across multipliers), be the same?
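To be explicit about the model I have in mind (my own assumption, not something from the site: every bet is an independent loss with probability q, so streak lengths should follow a geometric law):

```latex
% Losing-streak length L when each bet is an independent loss with probability q
P(L = k) = (1 - q)\,q^{\,k-1}, \qquad k \ge 1,
\qquad
E[L] = \frac{1}{1 - q}, \qquad
\operatorname{Var}(L) = \frac{q}{(1 - q)^{2}}, \qquad
P(L \ge k) = q^{\,k-1}.
```

My question is essentially whether the tail P(L >= k), once you express k relative to E[L] or to the spread, behaves the same way for a low and a high multiplier.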
You are right about losing streaks. There is no doubt that the probability of losing 15 times consecutively is much lower when you wager on 1.01x.
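To put a rough number on it (assuming the usual 1% house edge, i.e. a win chance of about 0.99/1.01 ≈ 98% at 1.01x and 0.99/9900 ≈ 0.01% at 9900x):

```latex
% 15 consecutive losses, assuming win chance = 0.99 / multiplier
P_{\text{1.01x}} = \left(1 - \tfrac{0.99}{1.01}\right)^{15} \approx 0.0198^{15} \approx 3 \times 10^{-26}
\qquad\text{vs.}\qquad
P_{\text{9900x}} = \left(1 - \tfrac{0.99}{9900}\right)^{15} \approx 0.9999^{15} \approx 0.9985
```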
Okay, I'll try a different approach to convey what I'm particularly curious about.
If you run 10 million bets on 1.01x, your average losing streak will be roughly 1: a streak can't have length 0 by definition, and the average can't exceed 1 by more than a small fraction since the chance of winning is so high. On the other hand, if you bet on 9900x, the average length of your losing streaks could be something like 1000 (I don't really know; it's just a hunch, a number to show what I mean). But neither average tells us anything about outliers. Is the frequency of outliers, normalized to some common denominator (i.e. the deviation from the mean, or the variance), going to be the same for these two multipliers? That's what interests me so much.
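Here is a quick simulation sketch of what I'm asking, under my own assumptions (independent bets, win chance taken as 0.99 / multiplier as on a typical 1%-edge dice site; the "more than 3x the mean" cutoff for an outlier is just an arbitrary choice to make the two multipliers comparable):

```python
# Monte Carlo sketch of losing-streak statistics for two multipliers.
# Assumptions (mine, not the site's): bets are independent and the win
# chance is 0.99 / multiplier, as on a typical dice site with a 1% edge.
import numpy as np

def losing_streaks(win_prob: float, n_bets: int, rng: np.random.Generator) -> np.ndarray:
    """Lengths of all losing streaks (runs of consecutive losses) in n_bets bets."""
    wins = rng.random(n_bets) < win_prob        # True = win, False = loss
    win_idx = np.flatnonzero(wins)
    gaps = np.diff(win_idx) - 1                 # losses between consecutive wins
    return gaps[gaps > 0]                       # keep only actual losing streaks

rng = np.random.default_rng(42)
n_bets = 10_000_000

for multiplier in (1.01, 9900.0):
    p_win = 0.99 / multiplier                   # assumed payout rule
    streaks = losing_streaks(p_win, n_bets, rng)
    mean, std = streaks.mean(), streaks.std()
    # "Outlier" here = a streak more than 3x the average streak for that multiplier.
    outlier_rate = (streaks > 3 * mean).mean()
    print(f"{multiplier:>7}x: mean streak {mean:8.2f}, std {std:9.2f}, "
          f"std/mean {std / mean:.3f}, share of streaks > 3x mean: {outlier_rate:.4%}")
```

Comparing the std/mean ratio and the outlier share between the two lines of output is exactly the "normalized" comparison I have in mind.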