package core:encoding/csv

Overview

package csv reads and writes comma-separated values (CSV) files. This package supports the format described in RFC 4180.
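
A minimal sketch of typical use, with made-up sample data; read_all_from_string parses everything in one call, and both of its allocator parameters are pointed at the temporary allocator here so no manual cleanup is needed:

	package main

	import "core:encoding/csv"
	import "core:fmt"

	main :: proc() {
		data := "name,age\nAlice,30\nBob,25\n"

		// records_allocator and buffer_allocator both use the temporary allocator
		records, err := csv.read_all_from_string(data, context.temp_allocator, context.temp_allocator)
		if err != nil {
			fmt.eprintln("CSV error:", err)
			return
		}
		for record, i in records {
			fmt.println(i, record)
		}
	}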

Types

Error ¶

Error :: union {
	Reader_Error, 
	io.Error, 
}
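
Since Error is a union of Reader_Error and io.Error, a type switch can separate parse errors from I/O conditions. A small sketch (the proc name is made up; it assumes imports of core:encoding/csv, core:fmt, and core:io):

	report_error :: proc(err: csv.Error) {
		if err == nil {
			return
		}
		switch e in err {
		case csv.Reader_Error:
			fmt.eprintf("parse error (%v) at line %d, column %d\n", e.kind, e.line, e.column)
		case io.Error:
			if e != .EOF {
				fmt.eprintf("io error: %v\n", e)
			}
		}
	}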

Reader ¶

Reader :: struct {
	// comma is the field delimiter
	// reader_init will set it to be ','
	// A "comma" must be a valid rune, nor can it be \r, \n, or the Unicode replacement character (0xfffd)
	comma:               rune,
	// comment, if not 0, is the comment character
	// Lines beginning with the comment character, without preceding whitespace, are ignored
	comment:             rune,
	// fields_per_record is the number of expected fields per record
	//         if fields_per_record is >0, 'read' requires each record to have that field count
	//         if fields_per_record is  0, 'read' sets it to the field count in the first record
	//         if fields_per_record is <0, no check is made and records may have a variable field count
	fields_per_record:   int,
	// If trim_leading_space is true, leading whitespace in a field is ignored
	// This is done even if the field delimiter (comma) is whitespace
	trim_leading_space:  bool,
	// If lazy_quotes is true, a quote may appear in an unquoted field and a non-doubled quote may appear in a quoted field
	lazy_quotes:         bool,
	// multiline_fields, when set to true, will treat a field starting with a " as a multiline string
	// therefore, instead of reading until the next \n, it'll read until the next "
	multiline_fields:    bool,
	// reuse_record controls whether calls to 'read' may return a slice using the backing buffer
	// for performance
	// By default, each call to 'read' returns a newly allocated slice
	reuse_record:        bool,
	// reuse_record_buffer controls whether calls to 'read' clone the strings of each field or use
	// the data stored in the record buffer for performance
	// By default, each call to 'read' clones the strings of each field
	reuse_record_buffer: bool,
	// internal buffers
	r:                   bufio.Reader,
	line_count:          int,
	// current line being read in the CSV file
	raw_buffer:          [dynamic]u8,
	record_buffer:       [dynamic]u8,
	field_indices:       [dynamic]int,
	last_record:         [dynamic]string,
	sr:                  strings.Reader,
}
 

Reader is a data structure used for reading records from a CSV-encoded file

The associated procedures for Reader expect the input to conform to RFC 4180.
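
A hedged sketch of configuring these fields by hand: the input, delimiter, and values below are made up, and the imports are the same as in the overview example. The delimiter and comment character are assigned after initialization, since reader_init sets comma to ',' by default (see the field docs above):

	print_semicolon_csv :: proc() {
		r: csv.Reader
		csv.reader_init_with_string(&r, "# made-up data\na; b; c\n1; 2; 3\n")
		defer csv.reader_destroy(&r)

		r.comma               = ';'  // initialization set this to ','
		r.comment             = '#'  // lines starting with '#' are skipped
		r.trim_leading_space  = true
		r.reuse_record        = true // returned slices reuse the backing buffer
		r.reuse_record_buffer = true // field strings are not cloned

		for {
			record, err := csv.read(&r)
			if csv.is_io_error(err, .EOF) {
				break
			} else if err != nil {
				fmt.eprintln("CSV error:", err)
				break
			}
			fmt.println(record) // do not keep 'record': it is reused on the next call
		}
	}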

Reader_Error ¶

Reader_Error :: struct {
	kind:       Reader_Error_Kind,
	start_line: int,
	line:       int,
	column:     int,
	expected, got: int,
}

Reader_Error_Kind ¶

Reader_Error_Kind :: enum int {
	Bare_Quote, 
	Quote, 
	Field_Count, 
	Invalid_Delim, 
}

Writer ¶

Writer :: struct {
	// Field delimiter (set to ',' with writer_init)
	comma:    rune,
	// if set to true, \r\n will be used as the line terminator
	use_crlf: bool,
	w:        io.Writer,
}
 

Writer is a data structure used for writing records using a CSV-encoding.

Constants

DEFAULT_RECORD_BUFFER_CAPACITY ¶

DEFAULT_RECORD_BUFFER_CAPACITY :: 256

Variables

reader_error_kind_string ¶

reader_error_kind_string: [Reader_Error_Kind]string = …

Procedures

is_io_error ¶

is_io_error :: proc "odin" (err: Error, io_err: io.Error) -> bool {…}
 

is_io_error checks whether an Error is a specific io.Error kind
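
Since a reader typically signals end of input with io.Error.EOF, is_io_error is handy for filtering that out. A tiny helper sketch (the name is made up):

	is_real_error :: proc(err: csv.Error) -> bool {
		// EOF just means the input is exhausted; everything else is a real failure
		return err != nil && !csv.is_io_error(err, .EOF)
	}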

read ¶

read :: proc "odin" (r: ^Reader, allocator := context.allocator) -> (record: []string, err: Error) {…}
 

read reads a single record (a slice of fields) from r

All \r\n sequences are normalized to \n, including within multi-line fields
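
By default the returned record is newly allocated and its field strings are cloned (see the Reader flags above). One hedged sketch is to pass the temporary allocator so nothing needs to be freed by hand (the proc name is made up):

	read_one :: proc(r: ^csv.Reader) -> (record: []string, ok: bool) {
		rec, err := csv.read(r, context.temp_allocator)
		if err != nil {
			return nil, false
		}
		return rec, true
	}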

read_all ¶

read_all :: proc "odin" (r: ^Reader, allocator := context.allocator) -> ([][]string, Error) {…}
 

read_all reads all the remaining records from r. Each record is a slice of fields. read_all is defined to read until EOF, and does not treat EOF as an error
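
A common pattern sketch, assuming the first record is a header row (names made up; the caller owns the returned slices, allocated with context.allocator by default):

	header_and_rows :: proc(r: ^csv.Reader) -> (header: []string, rows: [][]string, err: csv.Error) {
		header = csv.read(r)     or_return
		rows   = csv.read_all(r) or_return
		return
	}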

read_all_from_string ¶

read_all_from_string :: proc "odin" (input: string, records_allocator := context.allocator, buffer_allocator := context.allocator) -> ([][]string, Error) {…}
 

read_all_from_string reads all the remaining records from the provided input.
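
A sketch showing the two allocator parameters used for different lifetimes: the parsed records are kept on context.allocator while the reader's internal buffers go on the temporary allocator (the proc name is made up):

	parse :: proc(input: string) -> ([][]string, csv.Error) {
		return csv.read_all_from_string(input, context.allocator, context.temp_allocator)
	}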

read_from_string ¶

read_from_string :: proc "odin" (input: string, record_allocator := context.allocator, buffer_allocator := context.allocator) -> (record: []string, n: int, err: Error) {…}
 

read_from_string reads a single record (a slice of fields) from the provided input.

reader_destroy ¶

reader_destroy :: proc "odin" (r: ^Reader) {…}
 

reader_destroy destroys a Reader

reader_init ¶

reader_init :: proc "odin" (reader: ^Reader, r: io.Reader, buffer_allocator := context.allocator) {…}
 

reader_init initializes a new Reader from r

reader_init_with_string ¶

reader_init_with_string :: proc "odin" (reader: ^Reader, s: string, buffer_allocator := context.allocator) {…}
 

reader_init_with_string initializes a new Reader from s

write ¶

write :: proc "odin" (w: ^Writer, record: []string) -> io.Error {…}
 

write writes a single CSV record to w, applying any necessary quoting. A record is a slice of strings, where each string is a single field.

If the underlying io.Writer requires flushing, make sure to call io.flush
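
A hedged sketch that writes into a strings.Builder, assuming strings.to_writer to obtain an io.Writer backed by the builder, with core:strings imported alongside the imports from the overview example:

	write_example :: proc() {
		b := strings.builder_make()
		defer strings.builder_destroy(&b)

		w: csv.Writer
		csv.writer_init(&w, strings.to_writer(&b))

		// Fields containing the delimiter, quotes, or newlines are quoted as needed
		csv.write(&w, []string{"name", "note"})
		csv.write(&w, []string{"Alice", "says \"hi\", twice"})
		csv.writer_flush(&w) // harmless if the underlying stream has no flush (see writer_flush)

		fmt.print(strings.to_string(b))
	}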

write_all ¶

write_all :: proc "odin" (w: ^Writer, records: [][]string) -> io.Error {…}
 

write_all writes multiple CSV records to w using write, and then flushes (if necessary).
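
Continuing the sketch above with the same Writer w and made-up records; write_all handles the per-record writes and the final flush:

	records := [][]string{
		{"name", "age"},
		{"Alice", "30"},
		{"Bob",   "25"},
	}
	if err := csv.write_all(&w, records); err != .None {
		fmt.eprintln("write error:", err)
	}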

writer_flush ¶

writer_flush :: proc "odin" (w: ^Writer) -> io.Error {…}
 

writer_flush flushes the underlying io.Writer. If the underlying io.Writer does not support flush, nil is returned.

writer_init ¶

writer_init :: proc "odin" (writer: ^Writer, w: io.Writer) {…}
 

writer_init initializes a Writer that writes to w

Procedure Groups

This section is empty.

Source Files

Generation Information

Generated with odin version dev-2023-03 (vendor "odin") Windows_amd64 @ 2023-03-29 21:09:05.404842200 +0000 UTC