You would have to give me concrete examples.
work with = I've tested with 60 million xpoints, so I know it can handle that many. How many are you wanting to run/test?
small key sizes = what do you mean? A search in a 40-bit range, 48-bit range, etc.? Are you talking about ranges, or small key sizes as in private key sizes?
addresses/xpoints = addresses would be converted to hash160s, which are smaller than xpoints, so I would imagine that on the same system, with the same amount of RAM, you could load more addresses/hash160s than xpoints.
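If it helps to see the size difference concretely, here is a rough Python sketch (the 33-byte compressed pubkey below is just a placeholder, and whether hashlib exposes ripemd160 depends on your OpenSSL build):

import hashlib

# A compressed public key is 33 bytes: a 02/03 prefix plus the 32-byte x coordinate (the "xpoint").
pubkey = bytes.fromhex("02" + "11" * 32)   # placeholder value, not a real key

sha = hashlib.sha256(pubkey).digest()
hash160 = hashlib.new("ripemd160", sha).digest()   # RIPEMD160(SHA256(pubkey))

print(len(pubkey[1:]), "bytes per xpoint")   # 32
print(len(hash160), "bytes per hash160")     # 20

# Rough raw-storage difference per 1 billion entries, ignoring the tool's own overhead:
print(32 * 1_000_000_000 / 2**30, "GiB of raw xpoints")    # ~29.8 GiB
print(20 * 1_000_000_000 / 2**30, "GiB of raw hash160s")   # ~18.6 GiB

That ignores whatever index or bloom-filter overhead the tool itself adds, so take it as a lower bound, not a promise of what fits.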
I want to test maybe 10-15x more, close to 1B addresses. Machine specs: 50 TB of SSDs, 128 GB of RAM. How should I proceed, just generate a large .txt file and see if it works? I doubt .txt files can handle that much data...
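Doing a quick back-of-envelope myself (assuming roughly 34-character Base58 addresses, one per line):

lines = 1_000_000_000
bytes_per_line = 34 + 1                                        # ~34 chars plus a newline
print(lines * bytes_per_line / 2**30, "GiB on disk")           # ~32.6 GiB, trivial for 50 TB of SSD
print(lines * 20 / 2**30, "GiB if reduced to raw hash160s")    # ~18.6 GiB

So the raw text file itself seems manageable; my real question is whether the tool can load that many entries into 128 GB of RAM, since I don't know what its in-memory format looks like.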
yes, small key sizes as in a 2^35 - 2^40 keyspace. How much of an impact does the size of the key have? As in, the smaller the key, the larger the number of xpoints or addresses the tool can work with? Or, even if the key is really small, like 123*G, is it still going to search forever within a large dataset of addresses?
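For what it's worth, my mental model of the question is something like this toy scan (just an illustration with a plain Python set, not how the actual tool searches; the real EC math is stubbed out):

import hashlib

# Toy stand-in for privkey -> hash160; real secp256k1 point multiplication omitted.
def derive(k: int) -> bytes:
    return hashlib.sha256(k.to_bytes(32, "big")).digest()[:20]

targets = {derive(123), derive(456)}        # the "dataset"; a real one could hold 1e9 entries

# The loop length depends on the range being scanned, while each membership test
# against the set costs roughly the same no matter how many targets it holds.
for k in range(1, 1_000):                   # tiny range so the toy finishes instantly
    if derive(k) in targets:
        print("hit at private key", k)

In other words, I'm trying to understand whether the dataset size mainly costs RAM while the key range mainly costs time, or whether a huge dataset also slows the search itself.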