I guess the lesson here is that <Space x Time> is always a constant!
....
Faster = Larger
Slower = Smaller
The only way to reduce this constant would be to find a property satisfied with different probabilities in 2 sets:
the set of the keys stored (Space)
the set of the keys generated (Time)
For example, if you have to work in a 2^40 interval,
usually you have to precompute 2^20 keys to store (space)
and generate 2^20 keys at run time (time),
so 2^20 * 2^20 = 2^40.
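To make that split concrete, here is a rough sketch of that kind of precompute-and-lookup search (not the actual setup from this thread: it uses plain modular exponentiation as a stand-in for EC point multiplication, a 2^20 interval instead of 2^40 so it runs instantly, and made-up names like find_key; needs Python 3.8+ for the negative-exponent pow):

```python
P = 0xFFFFFFFFFFFFFFC5          # 2^64 - 59, a prime modulus (toy group, not the real EC group)
G = 5                           # base element of the toy group
WIDTH = 1 << 20                 # interval width (2^40 in the post, shrunk for the demo)
M = 1 << 10                     # sqrt(WIDTH): 2^10 keys stored + up to 2^10 generated

def find_key(pub):
    # Space: precompute and store M "baby steps" G^0 .. G^(M-1)
    table = {pow(G, j, P): j for j in range(M)}
    # Time: take up to M "giant steps" of size M each
    step = pow(G, -M, P)                     # G^(-M) mod P
    gamma = pub
    for i in range(M):
        if gamma in table:
            return i * M + table[gamma]      # key = i*M + j
        gamma = (gamma * step) % P
    return None

secret = 123_456                             # some key inside the interval
assert find_key(pow(G, secret, P)) == secret
print("recovered:", find_key(pow(G, secret, P)))
```

The dict is the "space" half, the loop is the "time" half, and their product covers the whole interval.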
Now imagine you find an interval of 2^20 keys where a certain property occurs with probability 1/4, while the same property occurs with probability 1/2 in the rest of the space (example: points with X coordinate lower than 0.5 * 2^256),
i.e. a set where the property occurs less frequently than expected.
You can store only the keys that satisfy such a property, 2^20 / 2^2 = 2^18 keys,
and then you need to generate 2^20 * 2 = 2^21 keys (on average you need to generate 2 keys at each step, because only 1 out of 2 satisfies the property).
So: 2^18 (space) x 2^21 (time) = 2^39
But you have to find such a property.
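Just to spell out the bookkeeping (assumed Python, simply mirroring the numbers above, including the hypothetical X-coordinate property):

```python
# Plain split: 2^20 stored, 2^20 generated.
space_plain, time_plain = 2**20, 2**20
assert space_plain * time_plain == 2**40

# Filtered variant: the property holds for 1/4 of the stored set
# and 1/2 of the points generated at run time.
space_filtered = 2**20 // 4      # keep only keys with the property -> 2^18
time_filtered = 2**20 * 2        # ~2 candidates per step to hit the property -> 2^21
print(space_filtered * time_filtered == 2**39)   # True: the constant dropped by half
```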
By the time you're doubling the amount of work (time) just to halve the required space, you're already far behind a different approach, which uses the same amount of time (work) but a constant amount of required space...
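That "different approach" isn't named here; presumably something along the lines of a Pollard-style random walk (my assumption). The core trick those methods use is finding a collision via cycle detection instead of a big lookup table, so memory stays constant while the work stays around the square root of the space. A toy sketch:

```python
import hashlib

N = 1 << 20                              # toy search space (think: interval of keys)

def f(x: int) -> int:
    # deterministic "random-looking" walk step on [0, N)
    return int.from_bytes(hashlib.sha256(x.to_bytes(8, "big")).digest()[:8], "big") % N

def collide(start: int):
    # Phase 1: Floyd's tortoise-and-hare, detects the cycle with O(1) memory.
    slow, fast = f(start), f(f(start))
    while slow != fast:
        slow, fast = f(slow), f(f(fast))
    # Phase 2: walk from the start and from the meeting point in lockstep;
    # the step just before they merge gives two distinct inputs with f(a) == f(b).
    a, b = start, fast
    while f(a) != f(b):
        a, b = f(a), f(b)
    return a, b

for start in range(1, 10):               # retry: a start already sitting on its own cycle is degenerate
    a, b = collide(start)
    if a != b:
        print(a, b, f(a) == f(b))        # two different "keys" walking to the same point
        break
```

Nothing is stored except the two walkers, which is the whole point about constant space.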
The cost of insisting on a 100% guaranteed deterministic solution unfortunately stops scaling beyond a certain point. A cheap probabilistic method, on the other hand, does scale, and actually catching it NOT working is like expecting humanity to advance to a Type 4 civilization by tomorrow (possible? sure! likely? not so much!). In fact, it's more likely that something like that happens than that a probabilistic algorithm fails to find a solution. A probabilistic method is not like some lottery or whatever analogy people may try to draw; it is unimaginable to actually find a counter-example that beats those odds.