package core:text/regex/tokenizer
Index
Types (4)
Constants (0)
Variables (0)
Procedure Groups (0)
Types
Error ¶
Error :: enum int {
	None,
	Illegal_Null_Character,
	Illegal_Codepoint,
	Illegal_Byte_Order_Mark,
}
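A minimal sketch of reacting to the tokenizer's error state. This assumes `error_state` is meant to be inspected directly after scanning; the zero value `.None` indicates no illegal input was seen.

```odin
package example

import "core:fmt"
import "core:text/regex/tokenizer"

main :: proc() {
	t: tokenizer.Tokenizer
	// A freshly zero-initialized Tokenizer starts in .None; after a
	// scan pass, error_state records the first illegal input found.
	switch t.error_state {
	case .None:
		fmt.println("pattern tokenized cleanly")
	case .Illegal_Null_Character, .Illegal_Codepoint, .Illegal_Byte_Order_Mark:
		fmt.println("bad input at offset", t.offset, ":", t.error_state)
	}
}
```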
Related Procedures With Returns
Token ¶
Token :: struct {
	kind: Token_Kind,
	text: string,
	pos:  int,
}
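A small sketch of the `Token` shape, using only the fields shown above: `text` is the slice of the source pattern the token covers, and `pos` is its byte offset into that pattern.

```odin
package example

import "core:fmt"
import "core:text/regex/tokenizer"

main :: proc() {
	// Construct a Token by hand for illustration; in normal use the
	// tokenizer produces these while scanning a pattern.
	tok := tokenizer.Token{
		kind = .Rune,
		text = "a",
		pos  = 0,
	}
	fmt.println(tok.kind, tok.text, tok.pos)
}
```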
Related Procedures With Returns
Token_Kind ¶
Token_Kind :: enum int {
	Invalid,
	EOF,
	Rune,
	Wildcard,
	Alternate,
	Concatenate,
	Repeat_Zero,
	Repeat_Zero_Non_Greedy,
	Repeat_One,
	Repeat_One_Non_Greedy,
	Repeat_N,
	Optional,
	Optional_Non_Greedy,
	Rune_Class,
	Open_Paren,
	Open_Paren_Non_Capture,
	Close_Paren,
	Anchor_Start,
	Anchor_End,
	Word_Boundary,
	Non_Word_Boundary,
}
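To make the kinds concrete, here is a plausible decomposition of a small pattern into these token kinds. This is illustrative only; in particular, whether this tokenizer emits an explicit `Concatenate` token between adjacent terms is an assumption to verify against the package source.

```odin
// Illustrative only: a plausible token stream for the pattern `a+b?`.
//
//   {kind = .Rune,        text = "a"}   // literal rune
//   {kind = .Repeat_One,  text = "+"}   // one or more, greedy
//   {kind = .Concatenate, text = ""}    // implicit joiner (assumed)
//   {kind = .Rune,        text = "b"}   // literal rune
//   {kind = .Optional,    text = "?"}   // zero or one, greedy
//   {kind = .EOF,         text = ""}    // end of pattern
```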
Tokenizer ¶
Tokenizer :: struct {
	flags:           bit_set[regex_common.Flag; u8],
	src:             string,
	ch:              rune,
	offset:          int,
	read_offset:     int,
	last_token_kind: Token_Kind,
	held_token:      Token,
	error_state:     Error,
	paren_depth:     int,
}
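A hedged sketch of driving the tokenizer over a pattern. The `init` and `scan` procedure names and signatures are assumptions (they are not shown in this excerpt); check the full package listing before relying on them.

```odin
package example

import "core:fmt"
import "core:text/regex/tokenizer"

main :: proc() {
	t: tokenizer.Tokenizer
	// Assumed API: initialize with the pattern source and a flag set,
	// then pull tokens one at a time until EOF.
	tokenizer.init(&t, "a|b*", {})
	for {
		tok := tokenizer.scan(&t)
		fmt.println(tok.kind, tok.text)
		if tok.kind == .EOF {
			break
		}
	}
}
```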
Related Procedures With Parameters
Constants
This section is empty.
Variables
This section is empty.
Procedures
scan_comment ¶
scan_comment :: proc(t: ^Tokenizer) {…}
Procedure Groups
This section is empty.
Source Files
Generation Information
Generated with odin version dev-2024-11 (vendor "odin") Windows_amd64 @ 2024-11-16 21:10:10.881839000 +0000 UTC