A simple Hadoop streaming library
Module documentation for 0.2.0.3
A simple Hadoop streaming library based on conduit, useful for writing mapper and reducer logic in Haskell and running it on AWS Elastic MapReduce, Azure HDInsight, GCP Dataproc, and so forth.
Word Count Example
See the Haddock in the HadoopStreaming module for a simple word-count example.
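For a rough idea of what word-count logic looks like in the Hadoop streaming style, here is a minimal sketch using plain stdin/stdout: the mapper emits "word&lt;TAB&gt;1" lines, and the reducer sums the counts for one key's group. The function names are illustrative only; this is not the library's Mapper/Reducer API (see the Haddock for that).

```haskell
-- A word-count sketch in the Hadoop streaming style. Illustrative
-- names only; not the hadoop-streaming library API.
import qualified Data.Text as T

-- Mapper logic: one input line to zero or more "key\t1" output lines.
mapLine :: T.Text -> [T.Text]
mapLine line = [ w <> T.pack "\t1" | w <- T.words line ]

-- Reducer logic: sum the counts belonging to one key's group.
reduceGroup :: T.Text -> [Int] -> T.Text
reduceGroup key counts = key <> T.pack "\t" <> T.pack (show (sum counts))

-- A streaming mapper executable simply maps over stdin lines:
main :: IO ()
main = interact (unlines . map T.unpack . concatMap (mapLine . T.pack) . lines)
```

Hadoop streaming sorts the mapper output by key before the reduce phase, so the reducer only ever sees one key's values at a time.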
A Few Things to Note
ByteString vs Text
The HadoopStreaming module provides the general Mapper and Reducer data types, whose input and output types are abstract. They are usually instantiated with either ByteString or Text. ByteString is more suitable if the input/output needs to be decoded/encoded, for instance using the base64-bytestring library. On the other hand, Text could make more sense if decoding/encoding is not needed, or if the data is not UTF-8 encoded (see below regarding encodings). In general I’d imagine Text being used much more often than ByteString.

The HadoopStreaming.ByteString and HadoopStreaming.Text modules provide some utilities for working with ByteString and Text, respectively.
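As an illustration of the ByteString case, one might base64-decode each input record before processing it, using the base64-bytestring library mentioned above (a sketch; decodeRecord is a hypothetical name):

```haskell
-- Sketch: decoding base64-encoded records, the kind of situation where
-- ByteString input is the better fit. Uses the base64-bytestring package.
import qualified Data.ByteString.Base64 as B64
import qualified Data.ByteString.Char8 as BC

-- Decode one base64-encoded input record; Left carries a parse error.
decodeRecord :: BC.ByteString -> Either String BC.ByteString
decodeRecord = B64.decode

main :: IO ()
main = print (decodeRecord (BC.pack "aGVsbG8="))  -- Right "hello"
```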
Encoding

It is highly recommended that your input data be UTF-8 encoded, as this is the default encoding Hadoop uses. If you must use other encodings such as UTF-16, keep in mind the following gotchas:
It is not enough that your code can work with the encoding you choose to use:
- By default, if any of your input files does not end with a UTF-8 representation of newline, i.e., a 0x0A byte, Hadoop streaming will add a 0x0A byte at the end of the file.
- Likewise, if any line in your mapper output does not contain a UTF-8 representation of tab (0x09), Hadoop streaming will add one at the end of the line.
This will almost certainly break your job. It may be possible to configure Hadoop streaming to use other encodings, so that the above behavior is consistent with the encoding you choose, but I don’t know whether that is the case. I tried -D mapreduce.map.java.opts="-Dfile.encoding=UTF-16BE", but that doesn’t seem to work.
If you use ByteString as the input type and use Data.ByteString.hGetLine to read lines from the input, be aware that it only treats 0x0A bytes as line breaks, so it doesn’t work properly for non-UTF-8 encoded input. For example, in UTF-16BE and UTF-16LE, the newline character is encoded as 0x00 0x0A and 0x0A 0x00, respectively.
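The byte-level difference is easy to check with the text package's UTF-16 encoders (a quick sketch; encodeUtf16BE and encodeUtf16LE live in Data.Text.Encoding):

```haskell
-- Demonstrates why splitting on 0x0A bytes breaks UTF-16 input: the
-- newline character encodes to two bytes, only one of which is 0x0A,
-- and its position depends on endianness.
import qualified Data.ByteString as BS
import qualified Data.Text as T
import qualified Data.Text.Encoding as TE

main :: IO ()
main = do
  print (BS.unpack (TE.encodeUtf16BE (T.pack "\n")))  -- [0,10]
  print (BS.unpack (TE.encodeUtf16LE (T.pack "\n")))  -- [10,0]
```

A UTF-16LE file would thus be split in the middle of each newline character, leaving a stray 0x00 byte at the start of every subsequent line.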
Revision history for hadoop-streaming
0.2.0.3 – 2020-05-18
- Update text lower bound.
0.2.0.2 – 2020-04-06
- Add test files to the tarball.
0.2.0.1 – 2020-04-05
- Fix a broken link.
0.2.0.0 – 2020-04-05
- Make input and output types abstract.
0.1.0.0 – 2020-04-01
- First version. Released on an unsuspecting world.