My first reaction was "+1 for fast setup", but most of the 24hr delay I suffered was local disk, not network. Disabling fsync (?) on the database while in catch-up mode would help the most.
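For reference, a minimal sketch of what "disable fsync in catch-up mode" could look like, using the bsddb3 Python bindings to Berkeley DB as a stand-in for the client's own database layer (DB_TXN_NOSYNC is a real BDB flag; the environment setup and names are illustrative):

```python
from bsddb3 import db  # Python bindings for Berkeley DB

env = db.DBEnv()
# DB_TXN_NOSYNC: commits still go to the log buffer but skip the fsync()
# on every commit. A crash can lose the latest transactions, which is
# tolerable during catch-up since the blocks can simply be re-fetched.
env.set_flags(db.DB_TXN_NOSYNC, 1)
env.open("db_home",
         db.DB_CREATE | db.DB_RECOVER | db.DB_INIT_MPOOL |
         db.DB_INIT_LOCK | db.DB_INIT_LOG | db.DB_INIT_TXN)

# ... run the catch-up writes here, then restore durability and
# checkpoint before returning to normal operation:
env.set_flags(db.DB_TXN_NOSYNC, 0)
env.txn_checkpoint()
```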
Huh, isn't P2P supposed to be faster because you can download from many users at once instead of one source?
(also the reason why some gaming companies use BitTorrent to distribute updates)
Good point. But since the SHA-256 hash of the block is wired into the code, it is perfectly reasonable to ship the data too; anything shipped can be verified against that hash before it's trusted. When the blockchain is over 500 MB, I think transfer efficiency will become important.
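If we do ship it, the check is cheap. A minimal sketch, with a placeholder hash and file name standing in for whatever the client actually hardcodes:

```python
import hashlib

EXPECTED_SHA256 = "00000000...placeholder..."  # the hash wired into the code

def snapshot_ok(path):
    """Hash the shipped blockchain file and compare to the built-in value."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MB at a time
            h.update(chunk)
    return h.hexdigest() == EXPECTED_SHA256

if not snapshot_ok("blk0001.dat"):
    raise SystemExit("shipped blockchain failed its hash check; not using it")
```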
We have options:
- ship the blockchain from SF until it no longer sits politely within their AUP, then re-evaluate. I couldn't find a file size limit, even for the project website service (though I only took a quick surf through their docs).
- ship 'small' binaries from SF, and 'large' releases with data via BitTorrent
- ship a 'small' release, including the .torrent for the blockchain and a fetcher script. The script looks for one of three popular command-line BitTorrent clients for the platform and uses that to fetch the chain, or whinges if it can't find one (see the sketch after this list).
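To make that third option concrete, here is a rough sketch of the fetcher, in Python rather than shell for brevity; the three client names and their command lines are my assumptions about what's commonly installed, not a tested list:

```python
#!/usr/bin/env python3
import shutil
import subprocess
import sys

TORRENT = "blockchain.torrent"  # shipped in the release (name assumed)

# (client binary, invocation) pairs -- typical usage, adjust per platform
CLIENTS = [
    ("aria2c",           ["aria2c", "--seed-time=0", TORRENT]),
    ("transmission-cli", ["transmission-cli", "-w", ".", TORRENT]),
    ("ctorrent",         ["ctorrent", TORRENT]),
]

for name, cmd in CLIENTS:
    if shutil.which(name):              # is this client on the PATH?
        sys.exit(subprocess.call(cmd))  # hand off; exit with its status

sys.exit("No command-line BitTorrent client found; please install one of: "
         + ", ".join(name for name, _ in CLIENTS))
```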
http://sourceforge.net/apps/trac/sourceforge/wiki/Developer%20web says:
Note: All file releases should be a single file. Multiple files for the same release should be archived together (tar, deb, zip, etc.). We recommend using rsync for all uploads over 20 megabytes in size, as rsync allows for resuming canceled or interrupted transfers.
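So a combined release would need bundling into one archive anyway. A trivial sketch, with assumed file names:

```python
import tarfile

# one archive per release, per SF's single-file rule (names are examples)
with tarfile.open("bitcoin-linux-with-chain.tar.gz", "w:gz") as tar:
    tar.add("bitcoin")       # the client binary
    tar.add("blk0001.dat")   # the blockchain snapshot
```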
Hmm, though shipping a copy of the blockchain inside each binary arch's archive would be perverse duplication.
Then, who provides the tracker & seed for the data? Someone with an incentive, or community spirit? Well, this forum+wiki seem to live on
http://www.slicehost.com/ => min $20/month. The slice could probably share bandwidth without hurting the website, and (I think) the seed could be severely throttled so the other BT seeds pull more of the weight, as in the sketch below.
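For what throttling could look like, a sketch using the python-libtorrent bindings (older session API; the 10 KB/s cap and file names are arbitrary examples):

```python
import time
import libtorrent as lt

ses = lt.session()
ses.listen_on(6881, 6891)
ses.set_upload_rate_limit(10 * 1024)  # cap the seed at ~10 KB/s

ses.add_torrent({"ti": lt.torrent_info("blockchain.torrent"),
                 "save_path": "."})

while True:          # just stay alive and seed; healthier peers
    time.sleep(60)   # will carry most of the upload load
```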