The `store` package provides efficient binary serialization. A couple of features particularly distinguish it from most prior Haskell serialization libraries:
Its primary goal is speed. By default, direct machine representations are used for things like numeric values (`Word32`, etc.) and buffers (`Vector`, etc.). This means that much of serialization uses the equivalent of `memcpy`.
- Another way that the serialization behavior can vary is if `integer-simple` is used instead of GHC’s default of using GMP. `Integer`s serialized with the `integer-simple` flag enabled are not compatible with those serialized without the flag enabled.
Instead of implementing lazy serialization / deserialization involving multiple input / output buffers, `peek` and `poke` always work with a single buffer. This buffer is allocated by asking the value for its size before encoding. This simplifies the encoding logic and allows for highly optimized tight loops.
`store` can optimize size computations by knowing when some types always use the same number of bytes. This allows us to compute the byte size of a `Vector Int32` by just doing `length v * 4`.
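The idea can be sketched in a standalone toy version. The constructor names below mirror the shape of store's size representation, but this sketch uses lists instead of `Vector` so it runs without the `store` package; it is an illustration, not the library's actual code:

```haskell
import Data.Int (Int32)

-- Either every value of the type encodes to the same number of bytes,
-- or the size must be computed per value.
data Size a = ConstSize Int | VarSize (a -> Int)

sizeInt32 :: Size Int32
sizeInt32 = ConstSize 4

-- For a container of constant-size elements, the total is just
-- elemSize * count (plus an assumed 8-byte length prefix), with no
-- per-element traversal needed.
listSize :: Size a -> Size [a]
listSize (ConstSize n) = VarSize (\xs -> 8 + n * length xs)
listSize (VarSize f)   = VarSize (\xs -> 8 + sum (map f xs))

byteSize :: Size a -> a -> Int
byteSize (ConstSize n) _ = n
byteSize (VarSize f)   x = f x
```

For example, `byteSize (listSize sizeInt32)` applied to a three-element list yields `8 + 4 * 3 = 20` bytes without looking at any element.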
It also features:
- Optimized serialization instances for many types from `base`, `vector`, `bytestring`, `text`, `containers`, `time`, `template-haskell`, and more.
- TH and GHC Generics based generation of `Store` instances for datatypes.
- TH generation of test cases.
- Utilities for streaming encoding / decoding of `Store`-encoded messages.
Store is best used for communication between trusted processes and local caches. It can certainly be used for other purposes, but the built-in set of instances has some gotchas to be aware of:
- Store’s built-in instances serialize in a format which depends on machine endianness.
- Store’s built-in instances trust the data when deserializing. For example, the deserialization of `Vector` will read the vector’s length from the first 8 bytes and then allocate enough memory to store all the elements. Malicious or malformed input could therefore cause the allocation of large amounts of memory. See issue #122.
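When input is untrusted, a caller can defend against this class of problem by sanity-checking a decoded length prefix against the bytes actually available before allocating. The following is a hypothetical guard, not part of store's API, written against an assumed 8-byte little-endian length prefix like the one described above:

```haskell
import Data.Bits (shiftL)
import Data.Word (Word8, Word64)

-- Hypothetical defensive decoder (not store's actual code): read an
-- 8-byte little-endian length prefix, but reject any length that
-- exceeds the bytes actually present, instead of allocating based on
-- an untrusted value.
readLenPrefix :: [Word8] -> Maybe Int
readLenPrefix bytes
  | length bytes < 8 = Nothing
  | otherwise =
      let (prefix, rest) = splitAt 8 bytes
          -- assemble the little-endian Word64 length
          n = sum [ fromIntegral w `shiftL` (8 * i)
                  | (i, w) <- zip [0 ..] prefix ] :: Word64
      in if n <= fromIntegral (length rest)
           then Just (fromIntegral n)
           else Nothing
```

A prefix claiming three elements followed by three bytes is accepted, while a prefix claiming billions of elements over an empty remainder is rejected up front.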
- Fixes test suite compilation with `network >= 3.1.2`. See #159.
- `Store` instances for all serializable datatypes exported by the `time` library. See #158.
- Attempts to fix build on ghc-7.8.4. See #157.
- Adds a `Store` instance for `Natural`. See #154.
- Tests now compile with `smallcheck >= 1.2` and `base >= 4.14`. See #153.
- Now only depends on `ghc < 8`.
- Fix for compilation: 0.7.3 did not use enough CPP, and so broke builds for older versions. This release fixes that.
- Fixes compilation with a newer `template-haskell`. See #149.
- Fixes compilation with `vector >= 0.12.1.1` by making `deriveManyStoreUnboxVector` capable of handling more complex instance constraints. In particular, it now correctly generates the instances `Store (Vector (f (g a))) => Store (Vector (Compose f g a))` and `Store (Vector (f a)) => Store (Vector (Alt f a))`.
- Fixes compilation with GHC-7.10 due to it not defining `Identity`. See #142.
- Documents some gotchas about using store vs other libraries.
- Fixes a bug where `Storable` superclasses were referenced instead of `Store`. See #143.
- Can now optionally be built without `integer-gmp`, via the `integer-simple` cabal flag. Note that the serialization of `integer-simple` `Integer`s differs from what is used by the GMP default. See #147.
- Now builds with GHC-7.10; compatibility was broken in 0.6.0 due to the fix for GHC-8.8. See [#146](https://github.com/fpco/store/issues/146).
- Now builds with GHC-8.8. This is a major version bump because `MonadFail` constraints were added to some functions, which is potentially a breaking change.
- Fixes compilation with GHC < 8.0. See #142.
- Updates the generics instances to improve error messages for sum types with more than 255 constructors. See #141.
- Update to TH to support sum types with more than 62 constructors.
- Uses TH to derive the `Either` instance, so that it can sometimes have `ConstSize`. See #119.
- Updates to the test suite enabling `store` to build with newer dependencies.
- `Data.Store.Streaming` moved to a separate package, `store-streaming`.
- Buildable with GHC 8.2.
- Fix to haddock formatting of the `Data.Store.TH` code example.
- Fixed compilation on GHC 7.8
- Less aggressive inlining, resulting in faster compilation / simplifier not running out of ticks
- Fixed testsuite
- Breaking change in the encoding of `Map` / `Set` / `IntMap` / `IntSet`, to use ascending key order. Attempting to decode data written by prior versions of store (and vice versa) will almost always fail with a decent error message. If you’re unlucky enough to have a collision in the data with a random `Word32` magic number, then the error may not be so clear, or in extremely rare cases the decode may succeed, yielding incorrect results. See #97 and #101.
- Performance improvement of the `Peek` monad, by introducing more strictness. This required a change to the internal API.
- API and behavior of `Data.Store.Version` changed. Previously, it would check the version tag after decoding the contents. It now also stores a magic `Word32` tag at the beginning, so that it fails more gracefully when decoding input that lacks encoded version info.
- Deprecated in favor of 0.4.1.
- Fix to derivation of primitive vectors, only relevant when built with `primitive-0.6.2.0` or later.
- Removes `INLINE` pragmas on the generic default methods. This dramatically improves compilation time on recent GHC versions. See #91.
- Adds `instance Contravariant Size`.
- Uses `store-core-0.3.*`, which has support for alignment-sensitive architectures.
- Adds support for streaming decode from a file descriptor, not supported on Windows. As part of this addition, the API for `Data.Store.Streaming` has changed.
- Fixes a bug that could result in attempting to malloc a negative number of bytes when reading corrupted data.
- Fixes a bug that could result in segfaults when reading corrupted data.
- Adds experimental `Data.Store.Version` and deprecates `Data.Store.TypeHash`. The new functionality is similar to `TypeHash`, but there are far fewer false positives of hashes changing.
- Now exports types related to generics.
- Core functionality split into the `store-core` package.
- Streaming support now prefixes each Message with a magic number, intended to detect misalignment of data frames. This is worth the overhead, because otherwise serialization errors could be more catastrophic: interpreting some bytes as a length tag and attempting to consume many bytes from the source.
- `weigh`-based allocations benchmark.
- Streaming support now has checks for over/undershooting the buffer.
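The magic-number framing described above can be sketched as follows. The magic constant here is made up and the encoding is simplified; the real `Data.Store.Streaming` wire format differs in detail:

```haskell
import Data.Bits (shiftL, shiftR)
import Data.Word (Word8, Word32)

-- Illustrative framing only: each frame starts with a fixed magic
-- number, so a reader that lands mid-stream detects misalignment
-- instead of interpreting arbitrary bytes as a length tag.
frameMagic :: Word32
frameMagic = 0xFEEDFACE  -- invented value, not store's actual magic

-- Serialize a Word32 as four little-endian bytes.
word32LE :: Word32 -> [Word8]
word32LE w = [ fromIntegral (w `shiftR` (8 * i)) | i <- [0 .. 3] ]

-- A frame is: magic, then a 4-byte payload length, then the payload.
frame :: [Word8] -> [Word8]
frame payload =
  word32LE frameMagic ++ word32LE (fromIntegral (length payload)) ++ payload

unframe :: [Word8] -> Maybe [Word8]
unframe bytes =
  let (magicB, rest1) = splitAt 4 bytes
      (lenB, rest2)   = splitAt 4 rest1
      n = sum [ fromIntegral w `shiftL` (8 * i)
              | (i, w) <- zip [0 ..] lenB ] :: Word32
  in if magicB /= word32LE frameMagic || length lenB < 4
       then Nothing  -- misaligned stream or truncated header
       else if fromIntegral (length rest2) < n
              then Nothing  -- fewer bytes than the advertised payload
              else Just (take (fromIntegral n) rest2)
```

The up-front magic check is what turns a catastrophic misread of a length tag into an immediate, local decode failure.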
- First public release