tokenify

A regex lexer

https://github.com/AKST/tokenify

Latest on Hackage: 0.1.2.0

This package is not currently in any snapshots. If you're interested in using it, we recommend adding it to Stackage Nightly. Doing so will make builds more reliable, and allow stackage.org to host generated Haddocks.

MIT licensed by Angus Thomsen
Maintained by [email protected]

A lexer that splits input text into tokens using regex-based rules.
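To illustrate the general idea of a lexer that splits text into tokens, here is a minimal, self-contained sketch in Haskell. This is not tokenify's actual API; the `Token` type, the rule set, and the use of character predicates (in place of compiled regexes) are all assumptions made for the example.

```haskell
-- Hypothetical sketch of regex-style lexing; tokenify's real API may differ.
import Data.Char (isAlpha, isDigit, isSpace)

-- An assumed token type for the example, not from tokenify.
data Token = Ident String | Number String | Symbol Char
  deriving (Show, Eq)

-- Longest-prefix character-class matchers stand in for regex rules:
-- each branch consumes the longest prefix matching its class.
lexText :: String -> [Token]
lexText [] = []
lexText s@(c:cs)
  | isSpace c = lexText cs                       -- skip whitespace
  | isAlpha c = let (lexeme, rest) = span isAlpha s
                in Ident lexeme : lexText rest   -- e.g. /[a-zA-Z]+/
  | isDigit c = let (lexeme, rest) = span isDigit s
                in Number lexeme : lexText rest  -- e.g. /[0-9]+/
  | otherwise = Symbol c : lexText cs            -- single-char symbol

main :: IO ()
main = print (lexText "foo 42 + bar")
-- prints [Ident "foo",Number "42",Symbol '+',Ident "bar"]
```

A real regex lexer would compile each pattern once and try the rules in priority order at every position; the sketch above hard-codes the character classes only to keep the example dependency-free.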