If someone buys some scratch tickets and loses, they lost their money. If someone buys some scratch tickets and wins, they buy more scratch tickets and lose their money anyways.
You are not looking at the entire sample space of outcomes.
Let P(X) = probability of result X, E(X) = expected value, L = loss, W = win
You're outlining the following two cases: someone buys scratch tickets and loses, someone buys scratch tickets and wins (and then buys more scratch tickets).
Thus, under your framing we have (assuming a 50-50 game with a 0% house edge):
Case 1: P(L) = 0.5
Case 2: P(W->L) = 0.5
That's not the case.
In reality, once you let the player keep betting, which is exactly what your "greed" proposal does, there are infinitely many cases to account for.
After one bet:
Case 1: P(W) = 0.5 (Player wins 1 unit)
Case 2: P(L) = 0.5 (Player loses 1 unit)
E(X) = 0.5(+1) + 0.5(-1) = 0
After two bets:
Case 1.1: P(WL) = 0.25 (Player wins 0 units)
Case 1.2: P(WW) = 0.25 (Player wins 2 units)
Case 2.1: P(LW) = 0.25 (Player wins 0 units)
Case 2.2: P(LL) = 0.25 (Player loses 2 units)
E(X) = 0.25(0) + 0.25(+2) + 0.25(0) + 0.25(-2) = 0
We can continue this expansion forever, but no matter how many bets the player makes (i.e. no matter how much they wager in total), the expected profit is 0.
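And to hit the "greed" scenario directly, here's a rough simulation of a player who keeps betting after every win and walks away at the first loss (again assuming a fair 50-50 game with 1-unit stakes; greedy_session and max_bets are just names I picked):

```python
import random

def greedy_session(max_bets: int = 100) -> int:
    """One session: bet 1 unit at even odds, keep betting after
    every win, stop at the first loss (or after max_bets bets)."""
    profit = 0
    for _ in range(max_bets):
        if random.random() < 0.5:  # win: +1 unit, "greed" says bet again
            profit += 1
        else:                      # loss: -1 unit, session over
            profit -= 1
            break
    return profit

n_sessions = 1_000_000
mean = sum(greedy_session() for _ in range(n_sessions)) / n_sessions
print(f"average profit per session: {mean:+.4f}")  # hovers around 0
```

Run it and the average profit per session comes out around 0, within sampling noise: "bet again after winning" doesn't change the expected value of a fair game.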
You keep assuming that the players lose. That's the problem with your argument. It's like trying to prove that 1 = 2 by assuming that 1 = 2. That's begging the question.