package core:crypto/hash
Overview
package hash provides a generic interface to the supported hash algorithms.

A high-level convenience procedure group hash is provided to easily accomplish common tasks.
hash_string - Hash a given string and return the digest.
hash_bytes - Hash a given byte slice and return the digest.
hash_string_to_buffer - Hash a given string and put the computed digest in the third parameter. It requires that the destination buffer is at least as big as the digest size.
hash_bytes_to_buffer - Hash a given byte slice and put the computed digest in the third parameter. It requires that the destination buffer is at least as big as the digest size.
hash_stream - Incrementally fully consume an io.Stream, and return the computed digest.
hash_file - Takes a file handle and returns the computed digest. A third optional boolean parameter controls if the file is streamed (the default), or read at once.
Example:
	package hash_example

	import "core:crypto/hash"

	main :: proc() {
		input := "Feed the fire."

		// Compute the digest, using the high level API.
		returned_digest := hash.hash(hash.Algorithm.SHA512_256, input)
		defer delete(returned_digest)

		// Variant that takes a destination buffer, instead of returning
		// the digest.
		digest := make([]byte, hash.DIGEST_SIZES[hash.Algorithm.BLAKE2B]) // @note: Destination buffer has to be at least as big as the digest size of the hash.
		defer delete(digest)
		hash.hash(hash.Algorithm.BLAKE2B, input, digest)
	}
A generic low-level API is provided, supporting the init/update/final interface that is typical of cryptographic hash function implementations.
Example:
	package hash_example

	import "core:crypto/hash"

	main :: proc() {
		input := "Let the cinders burn."

		// Compute the digest, using the low level API.
		ctx: hash.Context
		digest := make([]byte, hash.DIGEST_SIZES[hash.Algorithm.SHA3_512])
		defer delete(digest)

		hash.init(&ctx, hash.Algorithm.SHA3_512)
		hash.update(&ctx, transmute([]byte)input)
		hash.final(&ctx, digest)
	}
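Because update may be called any number of times before final, input can be fed to the Context in arbitrary chunks. The following is a minimal sketch (the chunk boundary is illustrative) showing that an incremental digest matches the one produced by the high-level API:

	package hash_example

	import "core:crypto/hash"
	import "core:fmt"
	import "core:slice"

	main :: proc() {
		input := "Let the cinders burn."

		// One-shot digest, for comparison.
		oneshot := hash.hash(hash.Algorithm.SHA3_512, input)
		defer delete(oneshot)

		// Incremental digest, fed in two arbitrary chunks.
		ctx: hash.Context
		digest := make([]byte, hash.DIGEST_SIZES[hash.Algorithm.SHA3_512])
		defer delete(digest)

		data := transmute([]byte)input
		hash.init(&ctx, hash.Algorithm.SHA3_512)
		hash.update(&ctx, data[:7])
		hash.update(&ctx, data[7:])
		hash.final(&ctx, digest)

		// Both approaches produce the same digest.
		fmt.println(slice.equal(oneshot, digest))
	}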
Types
Algorithm ¶
Algorithm :: enum int {
	Invalid,
	BLAKE2B,
	BLAKE2S,
	SHA224,
	SHA256,
	SHA384,
	SHA512,
	SHA512_256,
	SHA3_224,
	SHA3_256,
	SHA3_384,
	SHA3_512,
	SM3,
	Legacy_KECCAK_224,
	Legacy_KECCAK_256,
	Legacy_KECCAK_384,
	Legacy_KECCAK_512,
	Insecure_MD5,
	Insecure_SHA1,
}
Algorithm is the algorithm identifier associated with a given Context.
Related Procedures With Parameters
- hash_bytes
- hash_bytes_to_buffer
- hash_file
- hash_stream
- hash_string
- hash_string_to_buffer
- init
- hash (procedure groups)
Related Procedures With Returns
- algorithm
Context ¶
Context :: struct {
	_algo: Algorithm,
	_impl: union {
		_blake2.Blake2b_Context,
		_blake2.Blake2s_Context,
		sha2.Context_256,
		sha2.Context_512,
		sha3.Context,
		sm3.Context,
		keccak.Context,
		md5.Context,
		sha1.Context,
	},
}
Context is a concrete instantiation of a specific hash algorithm.
Related Procedures With Parameters
- algorithm
- block_size
- digest_size
- final
- init
- reset
Constants
MAX_BLOCK_SIZE ¶
MAX_BLOCK_SIZE :: sha3.BLOCK_SIZE_224
MAX_BLOCK_SIZE is the maximum block size used by any of the Algorithms supported by this package.
MAX_DIGEST_SIZE ¶
MAX_DIGEST_SIZE :: 64
MAX_DIGEST_SIZE is the maximum digest size that can be returned by any of the Algorithms supported by this package.
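Because these bounds are compile-time constants, they can be used to size buffers that are large enough for any supported algorithm, even when the algorithm is only chosen at runtime. A minimal sketch (the algorithm choice is illustrative) using MAX_DIGEST_SIZE to back the digest with a stack buffer:

	package hash_example

	import "core:crypto/hash"

	main :: proc() {
		input := "Feed the fire."
		algo := hash.Algorithm.SHA256

		// A stack buffer of MAX_DIGEST_SIZE bytes is large enough for any
		// supported algorithm; slice it down to the actual digest size.
		buf: [hash.MAX_DIGEST_SIZE]byte
		digest := buf[:hash.DIGEST_SIZES[algo]]

		hash.hash(algo, input, digest)
	}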
Variables
ALGORITHM_NAMES ¶
ALGORITHM_NAMES maps each Algorithm to its name string.
BLOCK_SIZES ¶
BLOCK_SIZES maps each Algorithm to its block size in bytes.
DIGEST_SIZES ¶
DIGEST_SIZES maps each Algorithm to its digest size in bytes.
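All three tables are indexed by Algorithm, so the supported algorithms and their parameters can be enumerated at runtime. A minimal sketch (the output formatting is illustrative) that prints the name, block size, and digest size of every algorithm:

	package hash_example

	import "core:crypto/hash"
	import "core:fmt"

	main :: proc() {
		for algo in hash.Algorithm {
			if algo == .Invalid {
				continue
			}
			fmt.printf(
				"%s: block size %d bytes, digest size %d bytes\n",
				hash.ALGORITHM_NAMES[algo],
				hash.BLOCK_SIZES[algo],
				hash.DIGEST_SIZES[algo],
			)
		}
	}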
Procedures
algorithm ¶
algorithm returns the Algorithm used by a Context instance.
block_size ¶
block_size returns the block size of a Context instance in bytes.
digest_size ¶
digest_size returns the digest size of a Context instance in bytes.
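These introspection procedures make it possible to write helpers that work for any Algorithm by sizing buffers from a live Context. A minimal sketch (hash_message is a hypothetical helper, not part of this package):

	package hash_example

	import "core:crypto/hash"
	import "core:fmt"

	// hash_message is a hypothetical helper that queries the Context for its
	// digest size rather than hard-coding it per algorithm.
	hash_message :: proc(algo: hash.Algorithm, msg: []byte) -> []byte {
		ctx: hash.Context
		hash.init(&ctx, algo)

		digest := make([]byte, hash.digest_size(&ctx))
		hash.update(&ctx, msg)
		hash.final(&ctx, digest)
		return digest
	}

	main :: proc() {
		input := "Feed the fire."
		digest := hash_message(.SHA256, transmute([]byte)input)
		defer delete(digest)
		fmt.println(len(digest)) // 32 for SHA-256
	}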
final ¶
final finalizes the Context, writes the digest to hash, and calls reset on the Context.
If finalize_clone is set, final will work on a copy of the Context, which is useful for calculating rolling digests.
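A minimal sketch of the rolling-digest pattern described above (the data and algorithm are illustrative): passing finalize_clone = true leaves the original Context intact, so more data can be appended and finalized later.

	package hash_example

	import "core:crypto/hash"

	main :: proc() {
		part_1 := "Feed the fire."
		part_2 := " Let the cinders burn."

		ctx: hash.Context
		hash.init(&ctx, hash.Algorithm.SHA256)

		digest_1 := make([]byte, hash.DIGEST_SIZES[hash.Algorithm.SHA256])
		digest_2 := make([]byte, hash.DIGEST_SIZES[hash.Algorithm.SHA256])
		defer delete(digest_1)
		defer delete(digest_2)

		// Digest over part_1 only; finalize_clone = true preserves ctx.
		hash.update(&ctx, transmute([]byte)part_1)
		hash.final(&ctx, digest_1, finalize_clone = true)

		// Digest over part_1 followed by part_2; this call resets ctx.
		hash.update(&ctx, transmute([]byte)part_2)
		hash.final(&ctx, digest_2)
	}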
hash_bytes ¶
hash_bytes :: proc(algorithm: Algorithm, data: []u8, allocator := context.allocator) -> []u8 {…}
hash_bytes will hash the given input and return the computed digest in a newly allocated slice.
hash_bytes_to_buffer ¶
hash_bytes_to_buffer will hash the given input and write the computed digest into the third parameter. It requires that the destination buffer is at least as big as the digest size. The provided destination buffer is returned to match the behavior of hash_bytes.
hash_file ¶
hash_file :: proc(algorithm: Algorithm, hd: os.Handle, load_at_once: bool = false, allocator := context.allocator) -> ([]u8, io.Error) {…}
hash_file will read the file provided by the given handle and return the computed digest in a newly allocated slice.
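A minimal sketch of hashing a file by handle (the path is a placeholder, and the exact core:os error-handling shapes may differ between Odin versions):

	package hash_example

	import "core:crypto/hash"
	import "core:fmt"
	import "core:os"

	main :: proc() {
		// Open the file to hash; "input.txt" is just a placeholder path.
		hd, open_err := os.open("input.txt")
		if open_err != nil {
			fmt.eprintln("failed to open file")
			return
		}
		defer os.close(hd)

		// Stream the file through the hash (load_at_once defaults to false).
		digest, io_err := hash.hash_file(hash.Algorithm.SHA256, hd)
		if io_err != .None {
			fmt.eprintln("failed to hash file")
			return
		}
		defer delete(digest)

		fmt.println(digest)
	}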
hash_stream ¶
hash_stream :: proc(algorithm: Algorithm, s: io.Stream, allocator := context.allocator) -> ([]u8, io.Error) {…}
hash_stream will incrementally fully consume a stream, and return the computed digest in a newly allocated slice.
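hash_stream accepts any io.Stream. A minimal sketch (assuming core:bytes provides a Reader with reader_init and reader_to_stream, mirroring core:strings) that hashes an in-memory buffer through the stream interface:

	package hash_example

	import "core:bytes"
	import "core:crypto/hash"
	import "core:fmt"

	main :: proc() {
		input := "Feed the fire."

		// Wrap the input in a reader so it can be consumed as an io.Stream.
		r: bytes.Reader
		bytes.reader_init(&r, transmute([]byte)input)

		digest, err := hash.hash_stream(hash.Algorithm.SHA256, bytes.reader_to_stream(&r))
		if err != .None {
			fmt.eprintln("failed to hash stream")
			return
		}
		defer delete(digest)

		fmt.println(len(digest)) // 32 bytes for SHA-256
	}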
hash_string ¶
hash_string :: proc(algorithm: Algorithm, data: string, allocator := context.allocator) -> []u8 {…}
hash_string will hash the given input and return the computed digest in a newly allocated slice.
hash_string_to_buffer ¶
hash_string_to_buffer will hash the given input and assign the computed digest to the third parameter. It requires that the destination buffer is at least as big as the digest size. The provided destination buffer is returned to match the behavior of hash_string.
init ¶
init initializes a Context with a specific hash Algorithm.
reset ¶
reset :: proc(ctx: ^Context) {…}
reset sanitizes the Context. The Context must be re-initialized to be used again.
Procedure Groups
hash ¶
hash :: proc{ hash_stream, hash_file, hash_bytes, hash_string, hash_bytes_to_buffer, hash_string_to_buffer, }
Generation Information
Generated with odin version dev-2024-11 (vendor "odin") Windows_amd64 @ 2024-11-16 21:10:09.814414400 +0000 UTC