In my previous example I split 2^40 = 2^20 * 2^20, but you can split the same interval another way: 2^40 = 2^22 (space) * 2^18 (time).
We can never split 40 bits into two parts such that both parts are below half: if the space takes s bits, the time takes 40 - s bits, so one of them is always at least 20 bits. And the "space" part, no matter if big or small, needs to first be filled, using the respective amount of operations. It does not matter how well it is optimized, stored, scanned, or queried; the first thing required is that it needs to be created.
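The arithmetic behind this is easy to check. A throwaway Python sketch (the 40-bit figure is from my example above; `total_work` is just my name for the sum of the two costs):

```python
# Splitting a 2^40 interval as 2^s (space) * 2^(40-s) (time):
# the 2^s table must be created first, then the 2^(40-s) steps scan it,
# so the total number of group operations is at least 2^s + 2^(40-s).
def total_work(s, bits=40):
    return 2**s + 2**(bits - s)

print(min(range(41), key=total_work))  # -> 20: the even split, i.e. sqrt(2^40)
print(total_work(20))                  # -> 2097152, which is 2 * 2^20
print(total_work(18))                  # the 2^22 * 2^18 split: 2^22 + 2^18, worse
```

Shrink either side and the other side grows faster than what you saved; the minimum sits exactly at the square root.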
This is the issue the OP refuses to understand: the sqrt(n) bound. He basically proposes an idea that makes a good algorithm tens to hundreds of times slower than it should be (the higher the range, the bigger the slowdown), trading lower memory use for a hugely increased amount of work.
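For reference, the tradeoff we are arguing about is the classic baby-step giant-step structure. A minimal sketch over a toy prime-order multiplicative group (plain modular arithmetic, not the EC version anyone actually runs; the numbers are made up) shows where the sqrt table and the sqrt scan come from:

```python
from math import isqrt

def bsgs(g, h, p):
    """Find x with g^x = h (mod p), p prime, in ~sqrt(p) memory and time."""
    m = isqrt(p - 1) + 1
    # Baby steps: the "space" part. All m entries must be created first,
    # one multiplication each -- this is the cost that cannot be skipped.
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: the "time" part, up to m lookups against the stored table.
    step = pow(g, -m, p)  # modular inverse via 3-arg pow (Python 3.8+)
    gamma = h
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = (gamma * step) % p
    return None

# Toy example, nothing to do with secp256k1:
p = 1000003                          # a small prime
print(bsgs(2, pow(2, 12345, p), p))  # -> 12345
```

Same shape at 2^40 or 2^120: the dict is the space you must build before a single giant step can run, which is exactly why the memory does not magically become optional.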
If it's about precomputing and reuse, oh well, precomputation that reduces the work of every later run can already be applied to basically any other algorithm as well, with better results and with a much lower need for storage.
And yes, it was confirmed that #120 was solved using kangaroos. No one on Earth can stop someone from building a kangaroo ASIC that runs millions of times faster than any CPU, but also no one on Earth will ever be able to do the same with BSGS: they would pretty much have to change the basics of how a computer works to attach the insanely huge amount of memory it would need, memory that doesn't even exist. Only a flat-earther like the OP would refuse to understand this indeed.