If you're talking about several kB, that's going to be very annoying to do. At least add partial checksums per line or multiple lines.
Yeah, maybe! Although you seem to forget that my file is almost 90 kilobytes. Checksums are additional overhead and software complexity that most people probably don't have the technical ability for. But I'm intrigued.
This makes it much faster to check where you have a reading error.
I wish I could test out this method, but I'm skeptical that it's going to be worth the additional hassle and storage space consumed. But I could be wrong, especially with a 90,000-character string that needs to be scanned in correctly!
A simple md5sum (https://en.wikipedia.org/wiki/Md5sum) per line would be:
Code:
43dbfbbc3fe9eecccc313b5ed4707bec -
7d2f3295028a1dfb41df0c9e696d9d9b -
b75dbc5c69502db35d76028643314996 -
24d8f8420aaf27e25d93787cd434a7b9 -
5833cf59445a9657c2da7088ae7a4119 -
5952f16a3f33d8c87e5846605cc95cac -
6989ed58a32c2622dda4418eab730c65 -
844a3c31527f88588c3dd7f22ccdf883 -
c5dca4c2ee7ac2d7c0c1874d267a8b7a -
32b5dc97587bd5e7e34f58f55f90353d -
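Generating a listing like that programmatically could look like the sketch below. It hashes each line with the trailing newline included, which matches what `echo "$line" | md5sum` would produce; the filename `backup.txt` is just a stand-in for whatever your 90 kB file is called.

```python
import hashlib

def per_line_md5(lines):
    """Return a 32-hex-digit MD5 digest for each line.

    The newline is appended before hashing to match the output of
    piping each line through `md5sum` (echo adds a trailing newline).
    """
    return [hashlib.md5(line.encode() + b"\n").hexdigest() for line in lines]

# Hypothetical usage -- "backup.txt" is an assumed filename:
# with open("backup.txt") as f:
#     for digest in per_line_md5(f.read().splitlines()):
#         print(digest, "-")
```

With per-line digests, a reading error narrows down to a single line instead of forcing you to re-scan the whole file.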
Yeah, I know about md5sum, but do you think that's the best type of checksum to be using? It doesn't attempt to fix any errors, and it's pretty huge.

I mean, I could legit see my 25-page file ballooning to 40 pages after this fiasco!
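On the size concern: if 32 hex characters per line is too much, a shorter error-detecting checksum like CRC32 prints at a quarter of the width. This is only a sketch of the trade-off, not a recommendation; note that CRC32, like MD5, can only detect errors, not correct them (error correction would need something like a Reed-Solomon code, which is a different beast entirely).

```python
import zlib

def per_line_crc32(lines):
    """Return an 8-hex-char CRC32 checksum per line.

    8 characters per line instead of MD5's 32, so the printed
    overhead is a quarter the size. Detects read errors on a line
    but, like MD5, cannot fix them.
    """
    return [format(zlib.crc32(line.encode()), "08x") for line in lines]
```

For a hand-keyed 90 kB file, CRC32 per line is plenty to localize a mistyped line; the full-file md5sum can still serve as the final end-to-end check.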