By the time you're doubling the amount of work (time) just to halve the required space, you're already far behind a different approach that takes the same amount of time (work) but only a constant amount of space...
The costs of insisting on a 100% guaranteed deterministic solution stop scaling beyond a certain point, unfortunately. But having a cheap probabilistic method that does scale, never once seeing it fail, and still insisting it doesn't work, is like pretending humanity can advance to a Type IV civilization by tomorrow (possible? sure! likely? not so much!). Something like that is more likely to happen than a probabilistic algorithm failing to find a solution. A probabilistic method is not some lottery, whatever analogy people reach for; actually hitting a counter-example that beats those odds is unimaginable.
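To make the time/space tradeoff above concrete, here is a toy sketch of both approaches on a small multiplicative group (the group, jump function, and walk lengths are all illustrative choices, not anyone's production parameters):

```python
import math

def bsgs(g, h, p, N):
    """Baby-step giant-step: deterministic, but needs O(sqrt(N)) MEMORY
    on top of O(sqrt(N)) time. Solves g^x = h (mod p) for x in [0, N)."""
    m = math.isqrt(N) + 1
    table = {}
    e = 1
    for j in range(m):                    # baby steps: remember g^j -> j
        table[e] = j
        e = e * g % p
    step = pow(g, -m, p)                  # giant-step factor g^(-m) mod p
    gamma = h
    for i in range(m):                    # giant steps: look up h * g^(-i*m)
        if gamma in table:
            return i * m + table[gamma]
        gamma = gamma * step % p
    return None

def kangaroo(g, h, p, a, b):
    """Pollard's kangaroo: probabilistic, O(sqrt(b-a)) expected time,
    but only a handful of variables of state. Solves g^x = h for x in [a, b].
    May (rarely) fail and return None; in practice you just rerun it."""
    k = max(1, (b - a).bit_length() // 2)
    f = lambda y: 1 << (y % k)            # deterministic pseudorandom jump size
    xT, yT = 0, pow(g, b, p)              # tame kangaroo starts at known g^b
    while xT < 8 * (b - a):               # let the tame one set a far trap
        j = f(yT)
        xT, yT = xT + j, yT * pow(g, j, p) % p
    xW, yW = 0, h                         # wild kangaroo starts at g^x
    while xW <= (b - a) + xT:
        if yW == yT:                      # wild landed in the trap:
            return b + xT - xW            # x + xW = b + xT  =>  x = b + xT - xW
        j = f(yW)
        xW, yW = xW + j, yW * pow(g, j, p) % p
    return None
```

Both recover the same exponent (when the kangaroo run succeeds), with wildly different footprints: the BSGS table holds ~257 entries even for this 16-bit group and grows with the square root of the interval, while the kangaroo only ever tracks two positions and two travelled distances.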
False. Once the limit is exceeded, kangaroo has no advantage, so the time spent generating the DB is worth it. If you would back up your claims with mathematical references, that would be great.
This is a method of improving scalability.
But hey, keep wasting your time with kangaroo; 3emi sends you his best wishes.
So what limit is that? The fact that #120 was just solved using "crazy fast kangaroos" doesn't help your case very much; it just proves again that they do work, exactly according to the math you keep dismissing as "false". Show me a BSGS run solving any 90+ bit puzzle, please, without months of "building a database" first, as if that shouldn't count as work at all.
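For scale, a quick back-of-envelope calculation of what "building a database" means at 90 bits (constants are omitted and the per-entry size is an assumption; real implementations vary):

```python
import math

bits = 90
N = 1 << bits                 # size of a 90-bit search interval
sqrt_N = math.isqrt(N)        # ~2^45: group operations needed by EITHER method,
                              # constants aside

# BSGS additionally has to STORE ~sqrt(N) baby-step entries.
bytes_per_entry = 16          # assumed: truncated point hash + index
table_bytes = sqrt_N * bytes_per_entry

print(f"group operations (either method): ~2^{sqrt_N.bit_length() - 1}")
print(f"BSGS baby-step table: ~{table_bytes >> 40} TiB")
```

That table is on the order of hundreds of tebibytes for a single 90-bit interval, and it doubles for every two extra bits, while the kangaroo's memory stays essentially flat.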