LL1Checker 3.0
“Tool for verifying LL(1) grammars and validating input strings.”
Lex Class Reference

Classes

struct  Add
 Functor for adding tokens to the token list during tokenization. More...
 
struct  ParseInput
 Lexer functor for defining tokenization rules using Boost Spirit Lex. More...
 

Public Member Functions

 Lex (std::string filename)
 Constructs a lexer and tokenizes the specified input file.
 
std::string Next ()
 Retrieves the next token from the token vector.
 

Private Member Functions

void Tokenize ()
 Tokenizes the input file using Boost Spirit Lex.
 

Private Attributes

std::string filename_
 
std::vector< std::string > tokens_
 
unsigned current_
 

Constructor & Destructor Documentation

◆ Lex()

Lex::Lex(std::string filename) [explicit]

Constructs a lexer and tokenizes the specified input file.

Parameters
filename — Path to the input file containing the string to be validated.
Note
The program aborts if any errors occur during lexer creation or tokenization.

Member Function Documentation

◆ Next()

std::string Lex::Next()

Retrieves the next token from the token vector.

Returns
std::string The next token in the sequence; returns an empty string if the end of the line (EOL) is reached.

This function allows sequential access to tokens processed by the lexer.

◆ Tokenize()

void Lex::Tokenize() [private]

Tokenizes the input file using Boost Spirit Lex.

This function reads the content of the file specified by filename_, tokenizes it using Boost Spirit Lex, and stores the resulting tokens in the tokens_ member variable. If the tokenization process encounters an invalid token, a LexerError is thrown with an error message indicating the invalid token.

Exceptions
LexerError — If an invalid token is encountered during tokenization.

The function performs the following steps:

  1. Opens the file specified by filename_ and reads its content into a string.
  2. Converts the string into a C-style string (char array) for processing.
  3. Uses Boost Spirit Lex to tokenize the input string.
  4. If tokenization is successful, the tokens are stored in the tokens_ member variable.
  5. If tokenization fails (e.g., due to an invalid token), a LexerError is thrown.
Note
The function relies on the parse_input functor and the add function to handle the tokenization and token storage, respectively.
See also
LexerError
tokens_
filename_

The documentation for this class was generated from the following files: