module volta.token.lexer

Code Map

module volta.token.lexer;


//! Tokenizes a source file.
fn lex(source: Source) TokenWriter { }
fn lex(source: Source) TokenWriter

Tokenizes a source file.

Side-effects: Advances the source location; on success it will be at EOF.

Throws: CompilerError on errors.

Returns: A TokenWriter filled with tokens.

fn match(tw: TokenWriter, c: dchar) bool

Advances the source and returns true if matched; adds an error and returns false otherwise.

Side-effects: If @src.current and @c match, advances the source to the next character.

fn match(tw: TokenWriter, s: string) bool

Calls match for every character in the given string. Returns false if any match fails, true otherwise.

Side-effects: Same as calling match repeatedly.

fn matchIf(tw: TokenWriter, c: dchar) bool

Returns true if c was matched, false otherwise. No errors are generated.
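For illustration, the difference between match and matchIf might be used like this when lexing a two-character operator. This is a hypothetical sketch against the API listed above; the LexStatus values, the token-adding comments, and the function name lexPlus are assumptions, not part of this module:

```volt
// Hypothetical sketch: lexing `+` vs `+=`.
fn lexPlus(tw: TokenWriter) LexStatus
{
	// A `+` must be present here; match() adds an error if it is not.
	if (!match(tw, '+')) {
		return LexStatus.Failed;  // assumed LexStatus member
	}
	// The trailing `=` is optional; matchIf() never generates an error.
	if (matchIf(tw, '=')) {
		// ... add a `+=` token to tw ...
	} else {
		// ... add a `+` token to tw ...
	}
	return LexStatus.Succeeded;  // assumed LexStatus member
}
```

The rule of thumb this sketch assumes: use match when the character is required (so failure is reported), and matchIf when it is optional.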

fn lexFailed(tw: TokenWriter, s: string) LexStatus

Adds a LexFailed error with the given string.

fn lexExpected(tw: TokenWriter, loc: Location, s: string) LexStatus

Adds an Expected error with the given string.

fn lexExpected(tw: TokenWriter, s: string) LexStatus

Calls lexExpected with tw.source.loc.

fn nextLex(tw: TokenWriter) NextLex

Returns which TokenType to try to lex next.
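A driver loop built on nextLex could look roughly like the following. This is a sketch under stated assumptions: the NextLex members, the eof check on the source, and the helper lexIdentifier are all hypothetical, not taken from this module:

```volt
// Hypothetical sketch of a main lexing loop using nextLex.
fn lexLoop(tw: TokenWriter)
{
	// Ask which token kind to try next until the source is exhausted.
	while (!tw.source.eof) {  // assumed eof accessor
		switch (nextLex(tw)) with (NextLex) {
		case Number:              // assumed NextLex member
			lexNumber(tw);
			break;
		case Identifier:          // assumed NextLex member
			lexIdentifier(tw);    // hypothetical helper
			break;
		default:
			lexFailed(tw, "unexpected character");
			return;
		}
	}
}
```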

fn consume(src: Source, characters: scope (const(scope (dchar)[]))) size_t

Consumes characters from the source that appear in the characters array, stopping at the first character that does not. Returns: the number of characters consumed, not counting underscores.
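As a sketch, consume is the kind of primitive a digit scanner would build on; underscores are swallowed as digit separators but excluded from the returned count. The digit table and the wrapper function below are assumptions for illustration only:

```volt
// Hypothetical digit table and wrapper built on consume.
enum HexDigits = "0123456789abcdefABCDEF_";  // assumed table

fn scanHexDigits(src: Source) size_t
{
	// For input like "DEAD_beef" this consumes all nine characters,
	// but the count excludes the underscore separator.
	return consume(src, HexDigits);
}
```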

fn lexNumber(tw: TokenWriter) LexStatus

Lexes an integer literal and adds the resulting token to tw. If it detects that the number is floating point, it calls lexReal directly.

fn lexReal(tw: TokenWriter) LexStatus

Lexes a floating point literal and adds the resulting token to tw.