I've often wondered whether you could effectively use this strategy to optimize a (braces for impact) blockchain. Presumably there are some relevant time spans:
- how long ago you need blocks for preventing alternate-history shenanigans
- how long ago you need blocks for nodes that may go offline for a while and come back confused
- how long ago you need fine granularity of records for accounting reasons
If none of those timespans is "forever", then maybe nodes could agree that at some point they start summarizing. So instead of deleting snapshots, you're merging adjacent blocks and purging the transitional states that aren't needed to tell a consistent story.
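As a toy illustration of what "merging adjacent blocks" could look like, here's a rough Python sketch that collapses a run of old blocks into one summary block carrying only the net balance deltas. The block structure, field names, and balance-delta model are all made up for the example, not any real chain's format.

```python
from dataclasses import dataclass
from collections import defaultdict

# Hypothetical, simplified block format: each block carries net
# balance deltas per account rather than full transaction data.
@dataclass
class Block:
    height_range: tuple[int, int]   # (first, last) original block heights covered
    prev_hash: str                  # link to the block (or summary) before this one
    deltas: dict[str, int]          # net balance change per account

def summarize(blocks: list[Block]) -> Block:
    """Merge a run of adjacent blocks into one summary block.

    Intermediate states are discarded; only the combined net effect
    survives, so the chain still tells a consistent story from the
    summary onward, just without per-block granularity.
    """
    merged: dict[str, int] = defaultdict(int)
    for block in blocks:
        for account, delta in block.deltas.items():
            merged[account] += delta
    return Block(
        height_range=(blocks[0].height_range[0], blocks[-1].height_range[1]),
        prev_hash=blocks[0].prev_hash,
        deltas={a: d for a, d in merged.items() if d != 0},  # drop no-op entries
    )

# Example: three old blocks collapse into one summary block.
old = [
    Block((100, 100), "abc", {"alice": -5, "bob": +5}),
    Block((101, 101), "def", {"bob": -3, "carol": +3}),
    Block((102, 102), "ghi", {"carol": -3, "alice": +3}),
]
print(summarize(old).deltas)   # {'alice': -2, 'bob': 2}
```

The consensus-hard part isn't in the sketch, of course: every node has to agree on exactly which ranges get summarized and when, or the merged histories stop matching.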