HTTP downloader tailored for web-crawler needs.

Latest on Hackage: 1.0.33

This package is not currently in any snapshots. If you're interested in using it, we recommend adding it to Stackage Nightly. Doing so will make builds more reliable, and allow us to host generated Haddocks.

BSD-3-Clause licensed and maintained by Vladimir Shabanov

HTTP/HTTPS downloader built on top of http-conduit and used in a web crawler.

  • Handles all possible http-conduit exceptions and returns human-readable error messages.

  • Handles some web server bugs (e.g., returning deflate data instead of gzip, or invalid gzip encoding).

  • Uses OpenSSL instead of the tls package (since tls fails on some sites).

  • Ignores invalid SSL certificates.

  • Receives data in 32k chunks internally to reduce memory fragmentation on many parallel downloads.

  • Supports a download timeout.

  • Supports a total download size limit.

  • Returns HTTP headers for subsequent re-downloads and handles '304 Not Modified' results.

  • Can be used with external DNS resolver (e.g. concurrent-dns-cache).
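The features above combine into a small API. As a hedged sketch only (the `withDownloader`, `download`, and `DownloadResult` names are assumed from the package's Haddock documentation; verify exact signatures against the installed version), a typical download might look like:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Network.HTTP.Conduit.Downloader
import qualified Data.ByteString.Char8 as B

main :: IO ()
main = withDownloader $ \d -> do
    -- download takes the URL, an optional pre-resolved host address
    -- (useful with an external DNS resolver), and extra options/headers.
    r <- download d "https://example.com" Nothing []
    case r of
        DROK body _redownloadOpts ->
            -- Successful download; the extra result carries headers
            -- (e.g. ETag/Last-Modified) to pass on a later re-download.
            putStrLn ("downloaded " ++ show (B.length body) ++ " bytes")
        DRRedirect url ->
            putStrLn ("redirected to " ++ url)
        DRNotModified ->
            -- Returned when a re-download with cached headers hits a 304.
            putStrLn "not modified"
        DRError err ->
            -- Human-readable message covering http-conduit exceptions.
            putStrLn ("error: " ++ err)
```

The `Nothing` host-address argument is where a pre-resolved address from an external resolver such as concurrent-dns-cache would be supplied.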