That is a good question. Is it possible to speed that up, comparing two lists with millions of lines?
If you sort the file and use binary search to look for each item in your list, then each lookup costs O(log2(n)), so your worst case is O(NumberOfAddresses * log2(n)) overall. That's really not slow: searching a sorted list with 1,000,000,000 lines takes only about 30 comparisons per lookup.
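Here's a minimal sketch of that approach in Python, assuming both files hold one address per line (the filenames are placeholders). `bisect_left` does the binary search:

```python
import bisect

# Load and sort the reference file once: O(n log n).
with open("addresses_big.txt") as f:  # hypothetical filename
    sorted_addresses = sorted(line.strip() for line in f)

def contains(sorted_list, item):
    """Binary search: about log2(n) comparisons per lookup."""
    i = bisect.bisect_left(sorted_list, item)
    return i < len(sorted_list) and sorted_list[i] == item

# Check every address from the second file against the sorted list.
with open("addresses_to_check.txt") as f:  # hypothetical filename
    matches = [line.strip() for line in f
               if contains(sorted_addresses, line.strip())]
```

If everything fits in memory, a plain `set` gives O(1) average lookups instead, but sorting is what generalizes once the data no longer fits, as discussed next.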
Actually fitting all of that into memory is going to be a problem, though. Knuth covers external (on-disk) sorting in The Art of Computer Programming, Vol. 3, but those algorithms are old and were written with tape drives in mind, so they may need adapting for hard disks.
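The standard adaptation is an external merge sort: sort chunks that do fit in memory, spill each sorted run to a temporary file, then do a k-way merge. A minimal sketch under those assumptions (the chunk size is a placeholder you'd tune to your RAM):

```python
import heapq
import itertools
import os
import tempfile

def external_sort(in_path, out_path, lines_per_chunk=1_000_000):
    """Sort a file too big for memory: sorted runs on disk, then k-way merge."""
    runs = []
    with open(in_path) as f:
        while True:
            # Read a chunk small enough to sort in memory.
            chunk = [line.rstrip("\n")
                     for line in itertools.islice(f, lines_per_chunk)]
            if not chunk:
                break
            chunk.sort()
            run = tempfile.NamedTemporaryFile("w+", delete=False)
            run.write("\n".join(chunk) + "\n")
            run.seek(0)
            runs.append(run)
    with open(out_path, "w") as out:
        # heapq.merge streams the sorted runs; only one line per run
        # is held in memory at a time.
        out.writelines(heapq.merge(*runs))
    for run in runs:
        run.close()
        os.unlink(run.name)
```

Once both files are sorted this way, you can even skip the per-item binary searches and just stream the two sorted files side by side, comparing as you go in a single pass.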