Board: Project Development
Re: [ANNOUNCE] Abe 0.7: Open Source Block Explorer Knockoff
by notawake on 21/03/2013, 02:18:41 UTC
My Abe installation was stuck on Namecoin block 99501 (2013-03-10 13:15:28). Attempting to rescan the block chain yielded the following error:

Code:
block 326226 already in chain 3
commit
Exception at 463166665
Failed to catch up {'blkfile_number': 1, 'dirname': '/home/notawake/.namecoin', 'chain_id': None, 'id': Decimal('2'), 'blkfile_offset': 463114051}
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2428, in catch_up
    store.catch_up_dir(dircfg)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2493, in catch_up_dir
    store.import_blkdat(dircfg, ds, blkfile['name'])
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2626, in import_blkdat
    store.import_block(b, chain_ids = chain_ids)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 1662, in import_block
    tx['tx_id'] = store.import_and_commit_tx(tx, pos == 0)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2147, in import_and_commit_tx
    tx_id = store.import_tx(tx, is_coinbase)
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 2091, in import_tx
    store.binin(txout['scriptPubKey']), pubkey_id))
  File "/usr/local/lib/python2.7/dist-packages/Abe/DataStore.py", line 464, in sql
    store.cursor.execute(cached, params)
DataError: value too long for type character varying(20000)

The next Namecoin block, 99502, is 52606 bytes. To allow block 99502 (and subsequent blocks) to be imported into the Abe database, I dropped the txout_detail view (PostgreSQL does not allow altering the type of a column that is referenced by a view), removed the length restriction from the txout_scriptpubkey column of the txout table, and recreated the txout_detail view.

I'm sure this solution is dangerous if someone manages to stuff much larger scripts into a block chain, but I do not know a reasonable limit for this column yet.