Last scraped on 26/04/2025, 10:22:58 UTC
It doesn't use my cores. I only have 4% out of 100!!!

(4% out of 100%?) May I know how many cores you have?
The modification I suggested, which caps the multiprocessing.Pool at 60 workers, was necessary to avoid the ValueError: need at most 63 handles caused by the Windows API's limit of 63 wait handles per process.

Try this:
Code:
import multiprocessing

cpu_count = multiprocessing.cpu_count()
# Leave a couple of cores free, but never exceed 60 workers
# (the Windows API rejects waiting on more than 63 handles).
with multiprocessing.Pool(min(cpu_count - 2, 60), initializer=init_globals,
                          initargs=(shared_counter, shared_found_counter, total_seeds)) as pool:

Or try experimenting with a smaller chunk size, like 1 million or 100,000, to see if it improves utilization. Smaller chunks allow more frequent task distribution to workers, which can help keep all cores busy.
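To make the idea concrete, here is a minimal self-contained sketch (not the original poster's script; check_seed and run are hypothetical stand-ins for the real per-seed work) showing both suggestions together: capping the worker count below the Windows 63-handle limit, and passing an explicit chunksize so tasks are handed out in smaller batches:

```python
import multiprocessing

def check_seed(seed):
    # Placeholder for the real per-seed work; here it just
    # reports whether the "seed" is even.
    return seed % 2 == 0

def run(total_seeds=1000):
    cpu_count = multiprocessing.cpu_count()
    # Stay under the Windows 63-handle limit, leaving headroom for the OS.
    workers = min(cpu_count, 60)
    with multiprocessing.Pool(workers) as pool:
        # A smaller chunksize means workers come back for new batches
        # more often, which helps keep all cores busy on uneven workloads.
        results = pool.imap_unordered(check_seed, range(total_seeds),
                                      chunksize=50)
        return sum(results)

if __name__ == "__main__":
    print(run())
```

With total_seeds=1000 this counts the 500 even values in 0..999; in the real script the chunksize would be tuned (the 100,000 to 1 million range mentioned above) against the cost of inter-process communication.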
Original archived Re: Bitcoin puzzle transaction ~32 BTC prize to who solves it
Scraped on 19/04/2025, 10:22:52 UTC