Things left to consider:
- token::Type enum is getting fairly large.
Breaking it up could incur substantial code bloat, though (see the enum sketch after this list).
- Compound operators might make more sense at the parser level.
- Compound-assign operators are ripe for syntactic desugaring,
but there must be some reason other languages handle them separately (see the desugaring sketch after this list).
- Operators like FatArrow may still make sense at the tokenizer level, regardless.
- What is a lexer? A miserable pile of parsers!
- Operator overloading, or user-defined operators? Hmm...
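A rough sketch of what splitting the enum might look like, assuming Rust and a nested Operator sub-enum. Only token::Type and FatArrow come from the notes above; every other name here is a placeholder:

```rust
// Hypothetical split of a large token::Type into sub-enums.
mod token {
    #[derive(Debug, Clone, Copy, PartialEq, Eq)]
    pub enum Operator {
        Plus,
        PlusEq,   // compound-assign form, if it stays at the token level
        FatArrow, // `=>`
    }

    #[derive(Debug, Clone, Copy, PartialEq, Eq)]
    pub enum Type {
        Ident,
        Number,
        Op(Operator), // one variant wraps the whole operator family
    }
}

fn main() {
    let t = token::Type::Op(token::Operator::FatArrow);
    // The extra layer is where the bloat concern comes from: every match
    // on Type that cares about a specific operator now needs a nested pattern.
    if let token::Type::Op(op) = t {
        println!("operator token: {:?}", op);
    }
}
```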
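And a minimal sketch of desugaring compound assignment while building the AST, using hypothetical Expr/BinOp types rather than the project's actual ones. It also shows the usual catch: naive desugaring duplicates the target expression, so any side effects in it would run twice, which is one common reason other languages keep a dedicated node.

```rust
// Hypothetical AST types for illustrating `a += b`  ->  `a = a + b`.
#[derive(Debug, Clone)]
enum Expr {
    Ident(String),
    Number(i64),
    Binary { op: BinOp, lhs: Box<Expr>, rhs: Box<Expr> },
    Assign { target: Box<Expr>, value: Box<Expr> },
}

#[derive(Debug, Clone, Copy)]
enum BinOp {
    Add,
}

// Turn `target op= value` into `target = target op value`.
// Note the clone: if `target` were something like `arr[next_index()]`,
// the desugared form would evaluate it twice.
fn desugar_compound_assign(target: Expr, op: BinOp, value: Expr) -> Expr {
    Expr::Assign {
        target: Box::new(target.clone()),
        value: Box::new(Expr::Binary {
            op,
            lhs: Box::new(target),
            rhs: Box::new(value),
        }),
    }
}

fn main() {
    let desugared = desugar_compound_assign(
        Expr::Ident("a".to_string()),
        BinOp::Add,
        Expr::Number(1),
    );
    println!("{:?}", desugared);
}
```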