Phew... I just did a whole overhaul of the way stuff is saved, and it's not even done yet.
I went with the same kind of approach Notch did: multiple chunks are now saved in one file, and it's compressed! It's tiny.
I created a test case:
Old format:
- 325 files
- 25.3 MB
New format:
- 6 files
- 783 KB
The new format stores a maximum of 1024 chunks (32 by 32) in one file. Chunks are intelligently cached in compressed form in memory and decompressed when they are needed, which keeps RAM usage really low (expect 2-4 KB per unused chunk, ~80 KB per used chunk). In addition to that, it's REALLY fast, which is pretty cool.
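To make the caching idea concrete, here's a minimal sketch in Java using java.util.zip; the class, the index() helper, and the method names are my own placeholders, not the actual implementation:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.zip.DeflaterOutputStream;
import java.util.zip.InflaterInputStream;

// Sketch: keep chunks compressed in memory, decompress only on access.
// One region holds up to 32 x 32 = 1024 chunks, keyed by a local index.
class CompressedChunkCache {
    private final Map<Integer, byte[]> compressed = new HashMap<>();

    // Local index of a chunk inside its 32x32 region.
    static int index(int chunkX, int chunkZ) {
        return (chunkX & 31) + (chunkZ & 31) * 32;
    }

    // Store a chunk: deflate the raw data and keep only the compressed
    // bytes (a few KB instead of ~80 KB uncompressed).
    void put(int chunkX, int chunkZ, byte[] rawChunkData) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (DeflaterOutputStream z = new DeflaterOutputStream(out)) {
            z.write(rawChunkData);
        }
        compressed.put(index(chunkX, chunkZ), out.toByteArray());
    }

    // Fetch a chunk: inflate on demand and hand back the raw data.
    byte[] get(int chunkX, int chunkZ) throws IOException {
        byte[] data = compressed.get(index(chunkX, chunkZ));
        if (data == null) return null;
        try (InflaterInputStream z = new InflaterInputStream(new ByteArrayInputStream(data))) {
            return z.readAllBytes();
        }
    }
}
```

Keeping only the deflated bytes around for idle chunks is exactly what gets you from ~80 KB down to a few KB apiece.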
Data is compressed with zlib, which comes with its own corruption detection (each zlib stream ends with an Adler-32 checksum of the uncompressed data), so I no longer have to check for that manually.
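For anyone wondering what that detection looks like in practice: because of that trailing checksum, flipping a byte in the stream almost always surfaces as an error when you inflate. A self-contained illustration (plain java.util.zip, nothing project-specific):

```java
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class ZlibCheck {
    public static void main(String[] args) throws Exception {
        byte[] original = "some chunk data, repeated chunk data, repeated chunk data".getBytes();

        // Compress with zlib. The default Deflater wraps its output in the
        // zlib format, which ends with an Adler-32 checksum of the raw data.
        Deflater deflater = new Deflater();
        deflater.setInput(original);
        deflater.finish();
        byte[] packed = new byte[256];
        int packedLen = deflater.deflate(packed);

        // Flip one byte in the middle of the compressed stream.
        packed[packedLen / 2] ^= 0xFF;

        // Inflating now fails instead of silently returning garbage: either
        // the deflate data itself is invalid, or the Adler-32 check mismatches.
        Inflater inflater = new Inflater();
        inflater.setInput(packed, 0, packedLen);
        byte[] out = new byte[256];
        try {
            int n = inflater.inflate(out);
            if (inflater.finished()) {
                System.out.println("stream inflated cleanly (" + n + " bytes)");
            } else {
                System.out.println("stream incomplete, data is suspect");
            }
        } catch (DataFormatException e) {
            System.out.println("corruption detected: " + e.getMessage());
        }
    }
}
```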
PS:
Another awesome thing is that your old data is converted to the new format ON THE FLY, so you don't have to do anything, and you won't even notice it!
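In case you're curious how that kind of lazy migration is typically wired up, here's a rough sketch; the RegionFile interface and the old per-chunk file naming are hypothetical stand-ins, since the post doesn't describe the actual loader:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Sketch of on-the-fly migration: when a chunk is requested, prefer the
// new region file, but fall back to the old per-chunk file and convert it.
class ChunkLoader {
    private final File worldDir;
    private final RegionFile regions;

    ChunkLoader(File worldDir, RegionFile regions) {
        this.worldDir = worldDir;
        this.regions = regions;
    }

    byte[] loadChunk(int x, int z) throws IOException {
        // 1. Already migrated? Read straight from the region file.
        byte[] data = regions.read(x, z);
        if (data != null) return data;

        // 2. Old format on disk? Convert it now, then delete the old file.
        //    (The "c.<x>.<z>.dat" naming here is a made-up example.)
        File oldFile = new File(worldDir, "c." + x + "." + z + ".dat");
        if (oldFile.exists()) {
            data = Files.readAllBytes(oldFile.toPath());
            regions.write(x, z, data);   // migrate into the new format
            Files.delete(oldFile.toPath());
            return data;
        }

        // 3. Chunk never existed; let the caller generate it fresh.
        return null;
    }
}

// Minimal interface the sketch assumes for the new multi-chunk file.
interface RegionFile {
    byte[] read(int chunkX, int chunkZ) throws IOException;
    void write(int chunkX, int chunkZ, byte[] data) throws IOException;
}
```

The nice property of this pattern is that only chunks you actually visit get converted, so there's no long migration pause the first time you load an old world.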