Tokenize is a Julia package that serves a similar purpose, and offers a similar API, to the tokenize module in Python, but for Julia code. It takes a string or buffer containing Julia source code, performs lexical analysis, and returns a stream of tokens.
Features
- Fast: it currently lexes all of Julia's own source files (~580 files, ~2 million tokens) in about 0.25 seconds
- Round-trippable: from a stream of tokens, the original string can be recovered exactly
- The function tokenize is the main entry point for generating tokens
- Each Token records where it starts and ends, what string it contains, and what kind of token it is
- Documentation available
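To make the feature list above concrete, here is a minimal usage sketch. The function names (`tokenize`, `untokenize`, and the token-kind accessor) follow the package's described API, but the exact signatures and module paths are assumptions, not verified against a specific release:

```julia
using Tokenize

src = "x = foo(42) + 1"

# tokenize is the main entry point; collect the token stream into a vector
ts = collect(tokenize(src))

# each token records its span, its text, and its kind
for t in ts
    println(Tokenize.Tokens.kind(t), " => ", repr(untokenize(t)))
end

# round-trippable: concatenating the token texts recovers the input exactly
@assert join(untokenize(t) for t in ts) == src
```

The round-trip assertion at the end reflects the package's stated guarantee that the original string is recoverable exactly from the token stream.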
License
MIT License