HTTP downloader tailored for web-crawler needs. https://github.com/bazqux/http-conduit-downloader
Latest on Hackage: 1.0.33
HTTP/HTTPS downloader built on top of http-conduit and used in the https://bazqux.com crawler.
Handles all possible http-conduit exceptions and returns human readable error messages.
Handles some web server bugs (returning deflate data instead of gzip).
Uses OpenSSL instead of tls (tls doesn't handle all sites).
Ignores invalid SSL certificates.
Receives data in 32k chunks internally to reduce memory fragmentation on many parallel downloads.
Total download size limit.
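The combination of fixed 32k chunks and a total size limit can be sketched as a read loop that pulls one chunk at a time and aborts once a byte budget is exceeded. This is a minimal standalone illustration of the idea, not the package's actual internals; `readLimited` and `mkSource` are hypothetical names introduced here for the example.

```haskell
import qualified Data.ByteString as B
import Data.IORef

-- 32 KiB chunks, matching the chunk size the package description mentions.
chunkSize :: Int
chunkSize = 32 * 1024

-- | Read from a chunk source until EOF (empty chunk) or until the total
-- size limit is exceeded. Returns Nothing when the limit is hit, so the
-- caller can report an oversized download instead of buffering it all.
readLimited :: Int -> IO B.ByteString -> IO (Maybe B.ByteString)
readLimited limit getChunk = go 0 []
  where
    go total acc = do
      chunk <- getChunk
      if B.null chunk
        then return (Just (B.concat (reverse acc)))
        else do
          let total' = total + B.length chunk
          if total' > limit
            then return Nothing  -- total download size limit exceeded
            else go total' (chunk : acc)

-- | Hypothetical chunk source backed by an in-memory buffer, standing in
-- for a network socket so the loop above can be exercised.
mkSource :: B.ByteString -> IO (IO B.ByteString)
mkSource bs = do
  ref <- newIORef bs
  return $ do
    s <- readIORef ref
    let (h, t) = B.splitAt chunkSize s
    writeIORef ref t
    return h
```

Collecting many small fixed-size chunks and concatenating once at the end is what keeps memory fragmentation low when hundreds of downloads run in parallel.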
Returns HTTP headers for subsequent redownloads and handles 'Not modified' results.
Can be used with external DNS resolver (e.g. concurrent-dns-cache).
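The redownload handshake behind the 'Not modified' feature works by saving the validator headers (ETag, Last-Modified) from one download and sending them back as conditional headers on the next. The sketch below models that logic with plain base types; the types and the `conditionalHeaders` helper are hypothetical and do not reflect the package's real API.

```haskell
-- Hypothetical types modeling the redownload handshake; the package's
-- actual result type differs in its details.
type Header = (String, String)

data FetchResult
  = Fetched String [Header]  -- ^ body plus validators to save for next time
  | NotModified              -- ^ server answered 304; cached copy is current
  deriving (Eq, Show)

-- | Turn the validators returned by a previous download into the
-- conditional request headers for the next one. A server that still has
-- the same entity then answers 304 instead of resending the body.
conditionalHeaders :: [Header] -> [Header]
conditionalHeaders saved =
  [ ("If-None-Match", v)     | Just v <- [lookup "ETag" saved] ] ++
  [ ("If-Modified-Since", v) | Just v <- [lookup "Last-Modified" saved] ]
```

A crawler stores the headers alongside each fetched page; on the next pass it sends `conditionalHeaders saved` and skips reprocessing whenever the result comes back as not modified.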