# Tokens

`bcql_py.parser.tokens`

## `Token` dataclass

A single token produced by the BCQL lexer.
Attributes:

| Name | Type | Description |
|---|---|---|
| `type` | `TokenType` | The `TokenType` category of this token. |
| `value` | `str` | The raw string content of the token. |
| `position` | `int` | The 0-based character offset in the source string. |
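For illustration, a minimal self-contained sketch of the documented shape. The enum members shown here are assumptions for the example; the real `TokenType` is defined in `bcql_py.parser.tokens`:

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical stand-in for the real TokenType enum (member names are assumptions).
class TokenType(Enum):
    LBRACKET = auto()
    IDENTIFIER = auto()
    EOF = auto()

@dataclass
class Token:
    type: TokenType   # lexical category of the token
    value: str        # raw string content
    position: int     # 0-based character offset in the source string

tok = Token(TokenType.IDENTIFIER, "word", 0)
```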
## `display_type`

Return a user-friendly representation of a `TokenType` for error messages.

Concrete punctuation and keywords are shown quoted (e.g. `"']'"`, `"'within'"`); literal categories are shown as a bare word (e.g. `string`, `identifier`). Falls back to the internal name if a future `TokenType` is added without a display entry.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `ttype` | `TokenType` | The token type to format. | *required* |
Returns:

| Type | Description |
|---|---|
| `str` | A short, human-readable label suitable for inclusion in a syntax error message. |
View source on GitHub: src/bcql_py/parser/tokens.py lines 166–179
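A rough, self-contained re-implementation of the behaviour described above. The display table and enum members here are assumptions for illustration, not the library's actual code:

```python
from enum import Enum, auto

class TokenType(Enum):          # hypothetical subset of the real enum
    LBRACKET = auto()
    WITHIN = auto()
    STRING = auto()
    FUTURE_TYPE = auto()        # simulates a member with no display entry

# Concrete symbols/keywords map to a quoted form; literal categories to a bare word.
_DISPLAY = {
    TokenType.LBRACKET: "'['",
    TokenType.WITHIN: "'within'",
    TokenType.STRING: "string",
}

def display_type(ttype: TokenType) -> str:
    # Fall back to the internal name when no display entry exists.
    return _DISPLAY.get(ttype, ttype.name)

display_type(TokenType.LBRACKET)   # quoted punctuation
display_type(TokenType.STRING)     # bare category word
display_type(TokenType.FUTURE_TYPE)  # falls back to the internal name
```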
## `display_token`

Return a user-friendly representation of a concrete `Token` for error messages.

For category tokens (string, identifier, integer) the actual value is appended so the user can see what they typed (`identifier 'word'`). For symbols and keywords the raw value is shown quoted (`'['`). EOF becomes `end of input`.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `tok` | `Token` | The token to format. | *required* |
Returns:

| Type | Description |
|---|---|
| `str` | A short, human-readable description of the token, suitable for inclusion in a syntax error message. |
View source on GitHub: src/bcql_py/parser/tokens.py lines 182–208
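The rules above can be sketched as a small stand-alone function. Everything here (the enum members, the category table, the exact formatting) is an assumption made for the example, not the library's implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto

class TokenType(Enum):          # hypothetical subset of the real enum
    STRING = auto()
    IDENTIFIER = auto()
    INTEGER = auto()
    LBRACKET = auto()
    EOF = auto()

# Category tokens whose value should be echoed back to the user.
_CATEGORIES = {
    TokenType.STRING: "string",
    TokenType.IDENTIFIER: "identifier",
    TokenType.INTEGER: "integer",
}

@dataclass
class Token:
    type: TokenType
    value: str
    position: int

def display_token(tok: Token) -> str:
    if tok.type is TokenType.EOF:
        return "end of input"
    if tok.type in _CATEGORIES:
        # Append the actual value so the user sees what they typed.
        return f"{_CATEGORIES[tok.type]} {tok.value!r}"
    # Symbols and keywords: just the raw value, quoted.
    return f"{tok.value!r}"

display_token(Token(TokenType.IDENTIFIER, "word", 3))  # "identifier 'word'"
display_token(Token(TokenType.LBRACKET, "[", 0))       # "'['"
display_token(Token(TokenType.EOF, "", 10))            # "end of input"
```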