struct Add
    Functor for adding tokens to the token list during tokenization. More...

struct ParseInput
    Lexer functor for defining tokenization rules using Boost Spirit Lex. More...
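As a rough illustration of what an `Add`-style functor looks like, here is a minimal sketch (the name `AddSketch` and the iterator-range signature are assumptions for illustration; the real functor is invoked by Boost Spirit Lex with the matched token range):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch of an Add-style semantic action: for each matched
// token range, append the matched text to the token vector.
struct AddSketch {
    template <typename Iterator>
    void operator()(Iterator first, Iterator last,
                    std::vector<std::string>& tokens) const {
        tokens.push_back(std::string(first, last));
    }
};
```

In Boost Spirit Lex, such a callable is typically attached to a token definition so it runs once per match, accumulating tokens as the input is consumed.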
Lex (std::string filename)
    Constructs a lexer and tokenizes the specified input file.

std::string Next ()
    Retrieves the next token from the token vector.

void Tokenize ()
    Tokenizes the input file using Boost Spirit Lex.

std::string filename_
std::vector< std::string > tokens_
unsigned current_
◆ Lex()

explicit Lex::Lex (std::string filename)

Constructs a lexer and tokenizes the specified input file.

Parameters
    filename    Path to the input file containing the string to be validated.

Note
    The program aborts if any errors occur during lexer creation or tokenization.
◆ Next()

std::string Lex::Next ()

Retrieves the next token from the token vector. This function allows sequential access to the tokens produced by the lexer.

Returns
    std::string The next token in the sequence, or an empty string once the end of the token list is reached.
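The empty-string sentinel described above can be sketched with a minimal stand-in class (`LexSketch` is hypothetical; `tokens_` and `current_` mirror the members listed in this reference, but here the tokens are supplied directly rather than produced by Boost Spirit Lex):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal sketch of the Next() contract: return tokens in order,
// then an empty string once the token list is exhausted.
struct LexSketch {
    std::vector<std::string> tokens_;
    unsigned current_ = 0;

    std::string Next() {
        if (current_ >= tokens_.size()) return "";
        return tokens_[current_++];
    }
};
```

A caller can therefore loop with `while (!(tok = lex.Next()).empty())` to consume all tokens without tracking an index itself.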
◆ Tokenize()

void Lex::Tokenize ()

Tokenizes the input file using Boost Spirit Lex.

This function reads the content of the file specified by filename_, tokenizes it using Boost Spirit Lex, and stores the resulting tokens in the tokens_ member variable. If the tokenization process encounters an invalid token, a LexerError is thrown with an error message naming the invalid token.

Exceptions
    LexerError    If an invalid token is encountered during tokenization.

The function performs the following steps:
- Opens the file specified by filename_ and reads its content into a string.
- Converts the string into a C-style string (char array) for processing.
- Uses Boost Spirit Lex to tokenize the input string.
- If tokenization succeeds, the tokens are stored in the tokens_ member variable.
- If tokenization fails (e.g., due to an invalid token), a LexerError is thrown.

Note
    The function relies on the parse_input functor and the add function to handle tokenization and token storage, respectively.

See also
    LexerError
    tokens_
    filename_
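The steps above can be sketched without Boost as follows. This is a deliberately simplified stand-in: whitespace splitting replaces the Boost Spirit Lex rules (parse_input / add), a non-printable character stands in for "invalid token", and the file-reading step is replaced by a string parameter. Only the name LexerError comes from this reference; its definition here is assumed.

```cpp
#include <cctype>
#include <sstream>
#include <stdexcept>
#include <string>
#include <vector>

// Hypothetical stand-in for the LexerError type referenced above.
struct LexerError : std::runtime_error {
    explicit LexerError(const std::string& msg) : std::runtime_error(msg) {}
};

// Simplified Tokenize(): split the input on whitespace, storing each
// token; throw LexerError when a token contains a non-printable
// character (standing in for a token the lexer rules reject).
std::vector<std::string> TokenizeSketch(const std::string& content) {
    std::vector<std::string> tokens;
    std::istringstream in(content);
    std::string tok;
    while (in >> tok) {
        for (char c : tok) {
            if (!std::isprint(static_cast<unsigned char>(c)))
                throw LexerError("invalid token: " + tok);
        }
        tokens.push_back(tok);
    }
    return tokens;
}
```

The real implementation differs in the tokenization rules themselves, but the control flow (accumulate tokens on success, throw LexerError on an invalid token) matches the description above.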
The documentation for this class was generated from the following files: