Hmm, I expected that it would only take hours with your build.
Have you set a higher dbcache as I recommended in my previous reply?
Because its default setting of 450MiB isn't ideal for your machine's specs.
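For reference, the setting is in MiB, so something like this in bitcoin.conf (or -dbcache=16000 on the command line) gives it roughly 16 GB - the exact number is just an example, size it to whatever your machine can spare:

    # bitcoin.conf - raise the UTXO cache from the 450 MiB default to ~16 GB
    dbcache=16000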
Oh no, I missed that one. I just turned it up to 16GB, and it didn't seem to change the speed. I'm at about block 400,000 after clearing the tables and starting over 36 hours ago. (I noticed something in my code was failing to capture the addresses properly, so everything was being labeled as 'unknown'. I've fixed that, verified a few other issues with the data gathering, and started everything over.)
Checking the system, it appears RAM isn't being heavily used (only 3GB); the real culprit is Postgres, which is taking up 50% of the CPU while processing each block in about 2-3 seconds (sometimes it gets through 3-5 blocks very quickly). It will eventually catch up, but that's not ideal. Postgres is probably not the optimal DB choice here because of the indexing... I should probably insert records at full speed and deal with deconflicting and adding the indexes afterward? Or possibly move to a time series database? I guess this gets into what exactly I want to do with the data... I haven't quite sorted that out yet; I wanted to see it first...
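If I try the load-first, index-later route, I'm picturing something roughly like this (a sketch with psycopg2; the table and index names are placeholders, not my real schema):

    import psycopg2

    # Sketch: drop the secondary index before the bulk load, rebuild it afterward.
    # "outputs" / "idx_outputs_address" are placeholder names for illustration.
    conn = psycopg2.connect("dbname=chain user=me")
    with conn.cursor() as cur:
        # Without the index, each insert no longer has to update the btree.
        cur.execute("DROP INDEX IF EXISTS idx_outputs_address;")
    conn.commit()

    # ... run the whole block-by-block load here ...

    with conn.cursor() as cur:
        # One index build at the end instead of millions of incremental updates.
        cur.execute("CREATE INDEX idx_outputs_address ON outputs (address);")
    conn.commit()
    conn.close()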
I tried removing all my ON CONFLICT statements, but that didn't seem to improve things. I tried batching, and it didn't change the speed much either. I think this is just a Postgres insert issue. I should find a faster way to dump the data in, probably from a flat file?
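Something like Postgres's COPY is probably what I'm after - write each batch to an in-memory CSV and hand it straight to the server instead of running individual INSERTs. Rough sketch with psycopg2 (table/column names are placeholders):

    import csv
    import io
    import psycopg2

    def copy_rows(conn, rows):
        """Bulk-load (txid, vout, address, value) tuples via COPY instead of INSERTs."""
        buf = io.StringIO()
        csv.writer(buf).writerows(rows)
        buf.seek(0)
        with conn.cursor() as cur:
            cur.copy_expert(
                "COPY outputs (txid, vout, address, value) FROM STDIN WITH (FORMAT csv)",
                buf,
            )
        conn.commit()

Deconflicting could then happen once at the end (e.g. load into an unindexed staging table and dedupe into the final one) rather than per row.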
I don't have much experience with datasets this large; I've usually gotten away with inserts as I go...