I'm getting this error message:
no chain_id
catch_up_rpc: abort
Opened e:\datenbank\bitcoin\daten\blocks\blk00080.dat
Exception at 13228858152345733452
Failed to catch up {'blkfile_offset': 123332745, 'blkfile_number': 100080, 'chain_id': None, 'loader': None, 'dirname': 'e:\\datenbank\\bitcoin\\daten', 'id': Decimal('2')}
Traceback (most recent call last):
File "Abe\DataStore.py", line 2639, in catch_up
store.catch_up_dir(dircfg)
File "Abe\DataStore.py", line 2897, in catch_up_dir
store.import_blkdat(dircfg, ds, blkfile['name'])
File "Abe\DataStore.py", line 3024, in import_blkdat
b = store.parse_block(ds, chain_id, magic, length)
File "Abe\DataStore.py", line 3055, in parse_block
d['transactions'].append(deserialize.parse_Transaction(ds))
File "Abe\deserialize.py", line 92, in parse_Transaction
d['txOut'].append(parse_TxOut(vds))
File "Abe\deserialize.py", line 67, in parse_TxOut
d['value'] = vds.read_int64()
File "Abe\BCDataStream.py", line 72, in read_int64
def read_int64 (self): return self._read_num('<q')
File "Abe\BCDataStream.py", line 110, in _read_num
(i,) = struct.unpack_from(format, self.input, self.read_cursor)
OverflowError: Python int too large to convert to C long