Removed context switching system from the crypto library to simplify the code

This commit is contained in:
zhibog
2021-11-09 16:50:13 +01:00
parent eb96f9677e
commit c24454ae70
32 changed files with 3124 additions and 7327 deletions


@@ -2,48 +2,43 @@
A crypto library for the Odin language
## Supported
This library offers various algorithms available in either native Odin or via bindings to the [Botan](https://botan.randombit.net/) crypto library.
This library offers various algorithms implemented in Odin.
Please see the chart below for the options.
**Note:** All crypto hash algorithms offered by [Botan's FFI](https://botan.randombit.net/handbook/api_ref/hash.html) have been added.
## Hashing algorithms
| Algorithm | Odin | Botan |
|:-------------------------------------------------------------------------------------------------------------|:-----------------|:---------------------|
| [BLAKE](https://web.archive.org/web/20190915215948/https://131002.net/blake) | ✔️ | |
| [BLAKE2B](https://datatracker.ietf.org/doc/html/rfc7693) | ✔️ | ✔️ |
| [BLAKE2S](https://datatracker.ietf.org/doc/html/rfc7693) | ✔️ | |
| [GOST](https://datatracker.ietf.org/doc/html/rfc5831) | ✔️ | ✔️ |
| [Grøstl](http://www.groestl.info/Groestl.zip) | ✔️ | |
| [HAVAL](https://web.archive.org/web/20150111210116/http://labs.calyptix.com/haval.php) | ✔️ | |
| [JH](https://www3.ntu.edu.sg/home/wuhj/research/jh/index.html) | ✔️ | |
| [Keccak](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf) | ✔️ | ✔️ |
| [MD2](https://datatracker.ietf.org/doc/html/rfc1319) | ✔️ | |
| [MD4](https://datatracker.ietf.org/doc/html/rfc1320) | ✔️ | ✔️ |
| [MD5](https://datatracker.ietf.org/doc/html/rfc1321) | ✔️ | ✔️ |
| [RIPEMD](https://homes.esat.kuleuven.be/~bosselae/ripemd160.html) | ✔️ | ✔️\* |
| [SHA-1](https://datatracker.ietf.org/doc/html/rfc3174) | ✔️ | ✔️ |
| [SHA-2](https://csrc.nist.gov/csrc/media/publications/fips/180/2/archive/2002-08-01/documents/fips180-2.pdf) | ✔️ | ✔️ |
| [SHA-3](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf) | ✔️ | ✔️ |
| [SHAKE](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf) | ✔️ | ✔️ |
| [Skein](https://www.schneier.com/academic/skein/) | | ✔️\*\* |
| [SM3](https://datatracker.ietf.org/doc/html/draft-sca-cfrg-sm3-02) | ✔️ | ✔️ |
| [Streebog](https://datatracker.ietf.org/doc/html/rfc6986) | ✔️ | ✔️ |
| [Tiger](https://www.cs.technion.ac.il/~biham/Reports/Tiger/) | ✔️ | ✔️ |
| [Tiger2](https://www.cs.technion.ac.il/~biham/Reports/Tiger/) | ✔️ | |
| [Whirlpool](https://web.archive.org/web/20171129084214/http://www.larc.usp.br/~pbarreto/WhirlpoolPage.html) | ✔️ | ✔️ |
\* Only `RIPEMD-160`
\*\* Only `SKEIN-512`
| Algorithm                                                                                                    | Implemented      |
|:-------------------------------------------------------------------------------------------------------------|:-----------------|
| [BLAKE](https://web.archive.org/web/20190915215948/https://131002.net/blake) | ✔️ |
| [BLAKE2B](https://datatracker.ietf.org/doc/html/rfc7693) | ✔️ |
| [BLAKE2S](https://datatracker.ietf.org/doc/html/rfc7693) | ✔️ |
| [GOST](https://datatracker.ietf.org/doc/html/rfc5831) | ✔️ |
| [Grøstl](http://www.groestl.info/Groestl.zip) | ✔️ |
| [HAVAL](https://web.archive.org/web/20150111210116/http://labs.calyptix.com/haval.php) | ✔️ |
| [JH](https://www3.ntu.edu.sg/home/wuhj/research/jh/index.html) | ✔️ |
| [Keccak](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf) | ✔️ |
| [MD2](https://datatracker.ietf.org/doc/html/rfc1319) | ✔️ |
| [MD4](https://datatracker.ietf.org/doc/html/rfc1320) | ✔️ |
| [MD5](https://datatracker.ietf.org/doc/html/rfc1321) | ✔️ |
| [RIPEMD](https://homes.esat.kuleuven.be/~bosselae/ripemd160.html) | ✔️ |
| [SHA-1](https://datatracker.ietf.org/doc/html/rfc3174) | ✔️ |
| [SHA-2](https://csrc.nist.gov/csrc/media/publications/fips/180/2/archive/2002-08-01/documents/fips180-2.pdf) | ✔️ |
| [SHA-3](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf) | ✔️ |
| [SHAKE](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf) | ✔️ |
| [SM3](https://datatracker.ietf.org/doc/html/draft-sca-cfrg-sm3-02) | ✔️ |
| [Streebog](https://datatracker.ietf.org/doc/html/rfc6986) | ✔️ |
| [Tiger](https://www.cs.technion.ac.il/~biham/Reports/Tiger/) | ✔️ |
| [Tiger2](https://www.cs.technion.ac.il/~biham/Reports/Tiger/) | ✔️ |
| [Whirlpool](https://web.archive.org/web/20171129084214/http://www.larc.usp.br/~pbarreto/WhirlpoolPage.html) | ✔️ |
#### High level API
Each hash algorithm contains a procedure group named `hash`, or if the algorithm provides more than one digest size `hash_<size>`\*\*\*.
Each hash algorithm contains a procedure group named `hash`, or if the algorithm provides more than one digest size `hash_<size>`\*.
Included in these groups are four procedures.
* `hash_string` - Hash a given string and return the computed hash. Internally just calls `hash_bytes`
* `hash_bytes` - Hash a given byte slice and return the computed hash
* `hash_stream` - Read an `io.Stream` and return the computed hash of its contents
* `hash_file` - Take a file handle and return the computed hash of the file. An optional second boolean parameter controls whether the file is streamed (the default) or read at once (set it to true)
\*\*\* On some algorithms, another part is appended to the name, since they offer control over additional parameters.
\* On some algorithms, another part is appended to the name, since they offer control over additional parameters.
For instance, `HAVAL` offers different digest sizes as well as three different round counts.
Computing a 256-bit hash with 3 rounds is therefore achieved by calling `haval.hash_256_3(...)`.
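The naming scheme can be sketched as follows. This is a minimal sketch; `hash_256_3` is named in the text above, while `hash_128_5` is an assumed variant following the same size/rounds pattern:

```odin
package crypto_example

import "core:fmt"
import "core:crypto/haval"

main :: proc() {
	// Digest size and round count are both encoded in the
	// procedure-group name: here, a 256-bit digest with 3 rounds.
	digest := haval.hash_256_3("foo")
	fmt.printf("%x\n", digest)

	// Hypothetical variant following the same pattern:
	// a 128-bit digest with 5 rounds.
	digest_small := haval.hash_128_5("foo")
	fmt.printf("%x\n", digest_small)
}
```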
@@ -51,13 +46,6 @@ Computing a 256-bit hash with 3 rounds is therefore achieved by calling `haval.h
The above-mentioned procedures internally call three procedures: `init`, `update` and `final`.
You may also call them directly, if you wish.
#### Context system
The library internally uses a context system so it can switch freely between the Odin and Botan implementations.
When an Odin implementation is available, it is the default.
You may change which one is used at runtime by calling `foo.use_botan()` or `foo.use_odin()`.
It is also possible to set this at compile time via `USE_BOTAN_LIB=true`.
Internally, a vtable is used to set the appropriate procedures when switching. This works for all the procedures mentioned in the APIs above.
#### Example
```odin
package crypto_example
@@ -67,12 +55,16 @@ import "core:crypto/md4"
main :: proc() {
input := "foo"
// Compute the hash via Odin implementation
// Compute the hash, using the high level API
computed_hash := md4.hash(input)
// Switch to Botan
md4.use_botan()
// Compute the hash via Botan bindings
computed_hash_botan := md4.hash(input)
// Compute the hash, using the low level API
ctx: md4.Md4_Context
computed_hash_low: [16]byte
md4.init(&ctx)
md4.update(&ctx, transmute([]byte)input)
md4.final(&ctx, computed_hash_low[:])
}
```
For example uses of all available algorithms, please see the tests within `tests/core/crypto`.
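The `hash_file` variant described above can be sketched as follows. The `"input.txt"` path is a placeholder, and the exact error-handling style is an illustrative assumption:

```odin
package crypto_example

import "core:fmt"
import "core:os"
import "core:crypto/md4"

main :: proc() {
	// "input.txt" is a placeholder path.
	hd, err := os.open("input.txt")
	if err != os.ERROR_NONE {
		return
	}
	defer os.close(hd)

	// Streams the file in chunks by default; pass `true` as the
	// second argument to read the entire file at once instead.
	if digest, ok := md4.hash_file(hd); ok {
		fmt.printf("%x\n", digest)
	}
}
```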


@@ -6,7 +6,6 @@ package _blake2
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the BLAKE2 hashing algorithm, as defined in <https://datatracker.ietf.org/doc/html/rfc7693> and <https://www.blake2.net/>
*/
@@ -76,7 +75,7 @@ BLAKE2B_IV := [8]u64 {
0x1f83d9abfb41bd6b, 0x5be0cd19137e2179,
}
init_odin :: proc(ctx: ^$T) {
init :: proc(ctx: ^$T) {
when T == Blake2s_Context {
block_size :: BLAKE2S_BLOCK_SIZE
} else when T == Blake2b_Context {
@@ -139,17 +138,17 @@ init_odin :: proc(ctx: ^$T) {
}
if len(ctx.cfg.key) > 0 {
copy(ctx.padded_key[:], ctx.cfg.key)
update_odin(ctx, ctx.padded_key[:])
update(ctx, ctx.padded_key[:])
ctx.is_keyed = true
}
copy(ctx.ih[:], ctx.h[:])
copy(ctx.h[:], ctx.ih[:])
if ctx.is_keyed {
update_odin(ctx, ctx.padded_key[:])
update(ctx, ctx.padded_key[:])
}
}
update_odin :: proc(ctx: ^$T, p: []byte) {
update :: proc "contextless" (ctx: ^$T, p: []byte) {
p := p
when T == Blake2s_Context {
block_size :: BLAKE2S_BLOCK_SIZE
@@ -161,7 +160,7 @@ update_odin :: proc(ctx: ^$T, p: []byte) {
if len(p) > left {
copy(ctx.x[ctx.nx:], p[:left])
p = p[left:]
blake2_blocks(ctx, ctx.x[:])
blocks(ctx, ctx.x[:])
ctx.nx = 0
}
if len(p) > block_size {
@@ -169,13 +168,22 @@ update_odin :: proc(ctx: ^$T, p: []byte) {
if n == len(p) {
n -= block_size
}
blake2_blocks(ctx, p[:n])
blocks(ctx, p[:n])
p = p[n:]
}
ctx.nx += copy(ctx.x[ctx.nx:], p)
}
blake2s_final_odin :: proc(ctx: $T, hash: []byte) {
final :: proc "contextless" (ctx: ^$T, hash: []byte) {
when T == Blake2s_Context {
blake2s_final(ctx, hash)
}
when T == Blake2b_Context {
blake2b_final(ctx, hash)
}
}
blake2s_final :: proc "contextless" (ctx: ^Blake2s_Context, hash: []byte) {
if ctx.is_keyed {
for i := 0; i < len(ctx.padded_key); i += 1 {
ctx.padded_key[i] = 0
@@ -193,7 +201,7 @@ blake2s_final_odin :: proc(ctx: $T, hash: []byte) {
ctx.f[1] = 0xffffffff
}
blake2_blocks(ctx, ctx.x[:])
blocks(ctx, ctx.x[:])
j := 0
for s, _ in ctx.h[:(ctx.size - 1) / 4 + 1] {
@@ -205,7 +213,7 @@ blake2s_final_odin :: proc(ctx: $T, hash: []byte) {
}
}
blake2b_final_odin :: proc(ctx: $T, hash: []byte) {
blake2b_final :: proc "contextless" (ctx: ^Blake2b_Context, hash: []byte) {
if ctx.is_keyed {
for i := 0; i < len(ctx.padded_key); i += 1 {
ctx.padded_key[i] = 0
@@ -223,7 +231,7 @@ blake2b_final_odin :: proc(ctx: $T, hash: []byte) {
ctx.f[1] = 0xffffffffffffffff
}
blake2_blocks(ctx, ctx.x[:])
blocks(ctx, ctx.x[:])
j := 0
for s, _ in ctx.h[:(ctx.size - 1) / 8 + 1] {
@@ -239,7 +247,7 @@ blake2b_final_odin :: proc(ctx: $T, hash: []byte) {
}
}
blake2_blocks :: proc(ctx: ^$T, p: []byte) {
blocks :: proc "contextless" (ctx: ^$T, p: []byte) {
when T == Blake2s_Context {
blake2s_blocks(ctx, p)
}
@@ -248,7 +256,7 @@ blake2_blocks :: proc(ctx: ^$T, p: []byte) {
}
}
blake2s_blocks :: #force_inline proc "contextless"(ctx: ^Blake2s_Context, p: []byte) {
blake2s_blocks :: #force_inline proc "contextless" (ctx: ^Blake2s_Context, p: []byte) {
h0, h1, h2, h3, h4, h5, h6, h7 := ctx.h[0], ctx.h[1], ctx.h[2], ctx.h[3], ctx.h[4], ctx.h[5], ctx.h[6], ctx.h[7]
p := p
for len(p) >= BLAKE2S_BLOCK_SIZE {
@@ -1404,7 +1412,7 @@ blake2s_blocks :: #force_inline proc "contextless"(ctx: ^Blake2s_Context, p: []b
ctx.h[0], ctx.h[1], ctx.h[2], ctx.h[3], ctx.h[4], ctx.h[5], ctx.h[6], ctx.h[7] = h0, h1, h2, h3, h4, h5, h6, h7
}
blake2b_blocks :: #force_inline proc "contextless"(ctx: ^Blake2b_Context, p: []byte) {
blake2b_blocks :: #force_inline proc "contextless" (ctx: ^Blake2b_Context, p: []byte) {
h0, h1, h2, h3, h4, h5, h6, h7 := ctx.h[0], ctx.h[1], ctx.h[2], ctx.h[3], ctx.h[4], ctx.h[5], ctx.h[6], ctx.h[7]
p := p
for len(p) >= BLAKE2B_BLOCK_SIZE {


@@ -1,79 +0,0 @@
package _ctx
/*
Copyright 2021 zhibog
Made available under the BSD-3 license.
List of contributors:
zhibog: Initial creation and testing of the bindings.
Implementation of the context, used internally by the crypto library.
*/
import "core:io"
import "core:os"
Hash_Size :: enum {
_16,
_20,
_24,
_28,
_32,
_40,
_48,
_64,
_128,
}
Hash_Context :: struct {
botan_hash_algo: cstring,
external_ctx: any,
internal_ctx: any,
hash_size: Hash_Size,
hash_size_val: int,
is_using_odin: bool,
using vtbl: ^Hash_Context_Vtable,
}
Hash_Context_Vtable :: struct {
hash_bytes_16 : proc (ctx: ^Hash_Context, input: []byte) -> [16]byte,
hash_bytes_20 : proc (ctx: ^Hash_Context, input: []byte) -> [20]byte,
hash_bytes_24 : proc (ctx: ^Hash_Context, input: []byte) -> [24]byte,
hash_bytes_28 : proc (ctx: ^Hash_Context, input: []byte) -> [28]byte,
hash_bytes_32 : proc (ctx: ^Hash_Context, input: []byte) -> [32]byte,
hash_bytes_40 : proc (ctx: ^Hash_Context, input: []byte) -> [40]byte,
hash_bytes_48 : proc (ctx: ^Hash_Context, input: []byte) -> [48]byte,
hash_bytes_64 : proc (ctx: ^Hash_Context, input: []byte) -> [64]byte,
hash_bytes_128 : proc (ctx: ^Hash_Context, input: []byte) -> [128]byte,
hash_file_16 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool),
hash_file_20 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([20]byte, bool),
hash_file_24 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([24]byte, bool),
hash_file_28 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool),
hash_file_32 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool),
hash_file_40 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([40]byte, bool),
hash_file_48 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool),
hash_file_64 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool),
hash_file_128 : proc (ctx: ^Hash_Context, hd: os.Handle, load_at_once := false) -> ([128]byte, bool),
hash_stream_16 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([16]byte, bool),
hash_stream_20 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([20]byte, bool),
hash_stream_24 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([24]byte, bool),
hash_stream_28 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([28]byte, bool),
hash_stream_32 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([32]byte, bool),
hash_stream_40 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([40]byte, bool),
hash_stream_48 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([48]byte, bool),
hash_stream_64 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([64]byte, bool),
hash_stream_128 : proc (ctx: ^Hash_Context, s: io.Stream) -> ([128]byte, bool),
hash_bytes_slice : proc (ctx: ^Hash_Context, input: []byte, out_size: int, allocator := context.allocator) -> []byte,
hash_file_slice : proc (ctx: ^Hash_Context, hd: os.Handle, out_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool),
hash_stream_slice : proc (ctx: ^Hash_Context, s: io.Stream, out_size: int, allocator := context.allocator) -> ([]byte, bool),
init : proc (ctx: ^Hash_Context),
update : proc (ctx: ^Hash_Context, data: []byte),
final : proc (ctx: ^Hash_Context, hash: []byte),
}
_init_vtable :: #force_inline proc() -> ^Hash_Context {
ctx := new(Hash_Context)
vtbl := new(Hash_Context_Vtable)
ctx.vtbl = vtbl
return ctx
}


@@ -6,7 +6,6 @@ package _sha3
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the Keccak hashing algorithm, standardized as SHA3 in <https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf>
To use the original Keccak padding, set the `is_keccak` bool to true; otherwise SHA3 padding is used.
@@ -115,14 +114,14 @@ keccakf :: proc "contextless" (st: ^[25]u64) {
}
}
init_odin :: proc "contextless" (c: ^Sha3_Context) {
init :: proc "contextless" (c: ^Sha3_Context) {
for i := 0; i < 25; i += 1 {
c.st.q[i] = 0
}
c.rsiz = 200 - 2 * c.mdlen
}
update_odin :: proc "contextless" (c: ^Sha3_Context, data: []byte) {
update :: proc "contextless" (c: ^Sha3_Context, data: []byte) {
j := c.pt
for i := 0; i < len(data); i += 1 {
c.st.b[j] ~= data[i]
@@ -135,7 +134,7 @@ update_odin :: proc "contextless" (c: ^Sha3_Context, data: []byte) {
c.pt = j
}
final_odin :: proc "contextless" (c: ^Sha3_Context, hash: []byte) {
final :: proc "contextless" (c: ^Sha3_Context, hash: []byte) {
if c.is_keccak {
c.st.b[c.pt] ~= 0x01
} else {
@@ -149,14 +148,14 @@ final_odin :: proc "contextless" (c: ^Sha3_Context, hash: []byte) {
}
}
shake_xof_odin :: proc "contextless" (c: ^Sha3_Context) {
shake_xof :: proc "contextless" (c: ^Sha3_Context) {
c.st.b[c.pt] ~= 0x1F
c.st.b[c.rsiz - 1] ~= 0x80
keccakf(&c.st.q)
c.pt = 0
}
shake_out_odin :: proc "contextless" (c: ^Sha3_Context, hash: []byte) {
shake_out :: proc "contextless" (c: ^Sha3_Context, hash: []byte) {
j := c.pt
for i := 0; i < len(hash); i += 1 {
if j >= c.rsiz {


@@ -6,7 +6,6 @@ package _tiger
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the Tiger hashing algorithm, as defined in <https://www.cs.technion.ac.il/~biham/Reports/Tiger/>
*/
@@ -291,7 +290,7 @@ Tiger_Context :: struct {
ver: int,
}
round :: #force_inline proc "contextless"(a, b, c, x, mul: u64) -> (u64, u64, u64) {
round :: #force_inline proc "contextless" (a, b, c, x, mul: u64) -> (u64, u64, u64) {
a, b, c := a, b, c
c ~= x
a -= T1[c & 0xff] ~ T2[(c >> 16) & 0xff] ~ T3[(c >> 32) & 0xff] ~ T4[(c >> 48) & 0xff]
@@ -300,7 +299,7 @@ round :: #force_inline proc "contextless"(a, b, c, x, mul: u64) -> (u64, u64, u6
return a, b, c
}
pass :: #force_inline proc "contextless"(a, b, c: u64, d: []u64, mul: u64) -> (x, y, z: u64) {
pass :: #force_inline proc "contextless" (a, b, c: u64, d: []u64, mul: u64) -> (x, y, z: u64) {
x, y, z = round(a, b, c, d[0], mul)
y, z, x = round(y, z, x, d[1], mul)
z, x, y = round(z, x, y, d[2], mul)
@@ -312,7 +311,7 @@ pass :: #force_inline proc "contextless"(a, b, c: u64, d: []u64, mul: u64) -> (x
return
}
key_schedule :: #force_inline proc "contextless"(x: []u64) {
key_schedule :: #force_inline proc "contextless" (x: []u64) {
x[0] -= x[7] ~ 0xa5a5a5a5a5a5a5a5
x[1] ~= x[0]
x[2] += x[1]
@@ -331,7 +330,7 @@ key_schedule :: #force_inline proc "contextless"(x: []u64) {
x[7] -= x[6] ~ 0x0123456789abcdef
}
compress :: #force_inline proc "contextless"(ctx: ^Tiger_Context, data: []byte) {
compress :: #force_inline proc "contextless" (ctx: ^Tiger_Context, data: []byte) {
a := ctx.a
b := ctx.b
c := ctx.c
@@ -346,13 +345,13 @@ compress :: #force_inline proc "contextless"(ctx: ^Tiger_Context, data: []byte)
ctx.c += c
}
init_odin :: proc(ctx: ^Tiger_Context) {
init :: proc "contextless" (ctx: ^Tiger_Context) {
ctx.a = 0x0123456789abcdef
ctx.b = 0xfedcba9876543210
ctx.c = 0xf096a5b4c3b2e187
}
update_odin :: proc(ctx: ^Tiger_Context, input: []byte) {
update :: proc(ctx: ^Tiger_Context, input: []byte) {
p := make([]byte, len(input))
copy(p, input)
@@ -380,7 +379,7 @@ update_odin :: proc(ctx: ^Tiger_Context, input: []byte) {
}
}
final_odin :: proc(ctx: ^Tiger_Context, hash: []byte) {
final :: proc(ctx: ^Tiger_Context, hash: []byte) {
length := ctx.length
tmp: [64]byte
if ctx.ver == 1 {
@@ -391,16 +390,16 @@ final_odin :: proc(ctx: ^Tiger_Context, hash: []byte) {
size := length & 0x3f
if size < 56 {
update_odin(ctx, tmp[:56 - size])
update(ctx, tmp[:56 - size])
} else {
update_odin(ctx, tmp[:64 + 56 - size])
update(ctx, tmp[:64 + 56 - size])
}
length <<= 3
for i := uint(0); i < 8; i += 1 {
tmp[i] = byte(length >> (8 * i))
}
update_odin(ctx, tmp[:8])
update(ctx, tmp[:8])
for i := uint(0); i < 8; i += 1 {
tmp[i] = byte(ctx.a >> (8 * i))


@@ -6,7 +6,6 @@ package blake
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the BLAKE hashing algorithm, as defined in <https://web.archive.org/web/20190915215948/https://131002.net/blake>
*/
@@ -14,102 +13,59 @@ package blake
import "core:os"
import "core:io"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_28 = hash_bytes_odin_28
ctx.hash_file_28 = hash_file_odin_28
ctx.hash_stream_28 = hash_stream_odin_28
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_48 = hash_bytes_odin_48
ctx.hash_file_48 = hash_file_odin_48
ctx.hash_stream_48 = hash_stream_odin_48
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan does nothing, since BLAKE is not available in Botan
@(warning="BLAKE is not provided by the Botan API. Odin implementation will be used")
use_botan :: #force_inline proc() {
use_odin()
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
@(private)
_create_blake256_ctx :: #force_inline proc(is224: bool, size: _ctx.Hash_Size) {
ctx: Blake256_Context
ctx.is224 = is224
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = size
}
@(private)
_create_blake512_ctx :: #force_inline proc(is384: bool, size: _ctx.Hash_Size) {
ctx: Blake512_Context
ctx.is384 = is384
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = size
}
/*
High level API
*/
// hash_string_224 will hash the given input and return the
// computed hash
hash_string_224 :: proc(data: string) -> [28]byte {
hash_string_224 :: proc "contextless" (data: string) -> [28]byte {
return hash_bytes_224(transmute([]byte)(data))
}
// hash_bytes_224 will hash the given input and return the
// computed hash
hash_bytes_224 :: proc(data: []byte) -> [28]byte {
_create_blake256_ctx(true, ._28)
return _hash_impl->hash_bytes_28(data)
hash_bytes_224 :: proc "contextless" (data: []byte) -> [28]byte {
hash: [28]byte
ctx: Blake256_Context
ctx.is224 = true
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_224 will read the stream in chunks and compute a
// hash from its contents
hash_stream_224 :: proc(s: io.Stream) -> ([28]byte, bool) {
_create_blake256_ctx(true, ._28)
return _hash_impl->hash_stream_28(s)
hash: [28]byte
ctx: Blake256_Context
ctx.is224 = true
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_224 will read the file provided by the given handle
// and compute a hash
hash_file_224 :: proc(hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
_create_blake256_ctx(true, ._28)
return _hash_impl->hash_file_28(hd, load_at_once)
if !load_at_once {
return hash_stream_224(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_224(buf[:]), ok
}
}
return [28]byte{}, false
}
hash_224 :: proc {
@@ -121,29 +77,53 @@ hash_224 :: proc {
// hash_string_256 will hash the given input and return the
// computed hash
hash_string_256 :: proc(data: string) -> [32]byte {
hash_string_256 :: proc "contextless" (data: string) -> [32]byte {
return hash_bytes_256(transmute([]byte)(data))
}
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_blake256_ctx(false, ._32)
return _hash_impl->hash_bytes_32(data)
hash_bytes_256 :: proc "contextless" (data: []byte) -> [32]byte {
hash: [32]byte
ctx: Blake256_Context
ctx.is224 = false
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_blake256_ctx(false, ._32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Blake256_Context
ctx.is224 = false
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_blake256_ctx(false, ._32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -155,29 +135,53 @@ hash_256 :: proc {
// hash_string_384 will hash the given input and return the
// computed hash
hash_string_384 :: proc(data: string) -> [48]byte {
hash_string_384 :: proc "contextless" (data: string) -> [48]byte {
return hash_bytes_384(transmute([]byte)(data))
}
// hash_bytes_384 will hash the given input and return the
// computed hash
hash_bytes_384 :: proc(data: []byte) -> [48]byte {
_create_blake512_ctx(true, ._48)
return _hash_impl->hash_bytes_48(data)
hash_bytes_384 :: proc "contextless" (data: []byte) -> [48]byte {
hash: [48]byte
ctx: Blake512_Context
ctx.is384 = true
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_384 will read the stream in chunks and compute a
// hash from its contents
hash_stream_384 :: proc(s: io.Stream) -> ([48]byte, bool) {
_create_blake512_ctx(true, ._48)
return _hash_impl->hash_stream_48(s)
hash: [48]byte
ctx: Blake512_Context
ctx.is384 = true
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_384 will read the file provided by the given handle
// and compute a hash
hash_file_384 :: proc(hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
_create_blake512_ctx(true, ._48)
return _hash_impl->hash_file_48(hd, load_at_once)
if !load_at_once {
return hash_stream_384(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_384(buf[:]), ok
}
}
return [48]byte{}, false
}
hash_384 :: proc {
@@ -189,29 +193,53 @@ hash_384 :: proc {
// hash_string_512 will hash the given input and return the
// computed hash
hash_string_512 :: proc(data: string) -> [64]byte {
hash_string_512 :: proc "contextless" (data: string) -> [64]byte {
return hash_bytes_512(transmute([]byte)(data))
}
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_blake512_ctx(false, ._64)
return _hash_impl->hash_bytes_64(data)
hash_bytes_512 :: proc "contextless" (data: []byte) -> [64]byte {
hash: [64]byte
ctx: Blake512_Context
ctx.is384 = false
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_blake512_ctx(false, ._64)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: Blake512_Context
ctx.is384 = false
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_blake512_ctx(false, ._64)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
hash_512 :: proc {
@@ -225,231 +253,188 @@ hash_512 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
hash: [28]byte
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([28]byte, bool) {
hash: [28]byte
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
init :: proc "contextless" (ctx: ^$T) {
when T == Blake256_Context {
if ctx.is224 {
ctx.h[0] = 0xc1059ed8
ctx.h[1] = 0x367cd507
ctx.h[2] = 0x3070dd17
ctx.h[3] = 0xf70e5939
ctx.h[4] = 0xffc00b31
ctx.h[5] = 0x68581511
ctx.h[6] = 0x64f98fa7
ctx.h[7] = 0xbefa4fa4
} else {
ctx.h[0] = 0x6a09e667
ctx.h[1] = 0xbb67ae85
ctx.h[2] = 0x3c6ef372
ctx.h[3] = 0xa54ff53a
ctx.h[4] = 0x510e527f
ctx.h[5] = 0x9b05688c
ctx.h[6] = 0x1f83d9ab
ctx.h[7] = 0x5be0cd19
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
if !load_at_once {
return hash_stream_odin_28(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_28(ctx, buf[:]), ok
}
}
return [28]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
hash_bytes_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
hash: [48]byte
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([48]byte, bool) {
hash: [48]byte
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
if !load_at_once {
return hash_stream_odin_48(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_48(ctx, buf[:]), ok
}
}
return [48]byte{}, false
}
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
if ctx.hash_size == ._28 || ctx.hash_size == ._32 {
_create_blake256_ctx(ctx.hash_size == ._28, ctx.hash_size)
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
init_odin(&c)
}
return
}
if ctx.hash_size == ._48 || ctx.hash_size == ._64 {
_create_blake512_ctx(ctx.hash_size == ._48, ctx.hash_size)
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
init_odin(&c)
}
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
#partial switch ctx.hash_size {
case ._28, ._32:
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
update_odin(&c, data)
}
case ._48, ._64:
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
update_odin(&c, data)
}
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
#partial switch ctx.hash_size {
case ._28, ._32:
if c, ok := ctx.internal_ctx.(Blake256_Context); ok {
final_odin(&c, hash)
}
case ._48, ._64:
if c, ok := ctx.internal_ctx.(Blake512_Context); ok {
final_odin(&c, hash)
}
}
}
/*
BLAKE implementation
*/
SIZE_224 :: 28
SIZE_256 :: 32
SIZE_384 :: 48
@@ -542,8 +527,8 @@ G512 :: #force_inline proc "contextless" (a, b, c, d: u64, m: [16]u64, i, j: int
return a, b, c, d
}
block256 :: proc "contextless" (ctx: ^Blake256_Context, p: []byte) {
i, j: int = ---, ---
v, m: [16]u32 = ---, ---
p := p
for len(p) >= BLOCKSIZE_256 {
@@ -595,7 +580,7 @@ block256 :: proc "contextless" (ctx: ^Blake256_Context, p: []byte) {
}
block512 :: proc "contextless" (ctx: ^Blake512_Context, p: []byte) #no_bounds_check {
i, j: int = ---, ---
v, m: [16]u64 = ---, ---
p := p
for len(p) >= BLOCKSIZE_512 {
@@ -646,189 +631,7 @@ block512 :: proc "contextless" (ctx: ^Blake512_Context, p: []byte) #no_bounds_ch
}
}
init_odin :: proc(ctx: ^$T) {
when T == Blake256_Context {
if ctx.is224 {
ctx.h[0] = 0xc1059ed8
ctx.h[1] = 0x367cd507
ctx.h[2] = 0x3070dd17
ctx.h[3] = 0xf70e5939
ctx.h[4] = 0xffc00b31
ctx.h[5] = 0x68581511
ctx.h[6] = 0x64f98fa7
ctx.h[7] = 0xbefa4fa4
} else {
ctx.h[0] = 0x6a09e667
ctx.h[1] = 0xbb67ae85
ctx.h[2] = 0x3c6ef372
ctx.h[3] = 0xa54ff53a
ctx.h[4] = 0x510e527f
ctx.h[5] = 0x9b05688c
ctx.h[6] = 0x1f83d9ab
ctx.h[7] = 0x5be0cd19
}
} else when T == Blake512_Context {
if ctx.is384 {
ctx.h[0] = 0xcbbb9d5dc1059ed8
ctx.h[1] = 0x629a292a367cd507
ctx.h[2] = 0x9159015a3070dd17
ctx.h[3] = 0x152fecd8f70e5939
ctx.h[4] = 0x67332667ffc00b31
ctx.h[5] = 0x8eb44a8768581511
ctx.h[6] = 0xdb0c2e0d64f98fa7
ctx.h[7] = 0x47b5481dbefa4fa4
} else {
ctx.h[0] = 0x6a09e667f3bcc908
ctx.h[1] = 0xbb67ae8584caa73b
ctx.h[2] = 0x3c6ef372fe94f82b
ctx.h[3] = 0xa54ff53a5f1d36f1
ctx.h[4] = 0x510e527fade682d1
ctx.h[5] = 0x9b05688c2b3e6c1f
ctx.h[6] = 0x1f83d9abfb41bd6b
ctx.h[7] = 0x5be0cd19137e2179
}
}
}
update_odin :: proc(ctx: ^$T, data: []byte) {
data := data
when T == Blake256_Context {
if ctx.nx > 0 {
n := copy(ctx.x[ctx.nx:], data)
ctx.nx += n
if ctx.nx == BLOCKSIZE_256 {
block256(ctx, ctx.x[:])
ctx.nx = 0
}
data = data[n:]
}
if len(data) >= BLOCKSIZE_256 {
n := len(data) &~ (BLOCKSIZE_256 - 1)
block256(ctx, data[:n])
data = data[n:]
}
if len(data) > 0 {
ctx.nx = copy(ctx.x[:], data)
}
} else when T == Blake512_Context {
if ctx.nx > 0 {
n := copy(ctx.x[ctx.nx:], data)
ctx.nx += n
if ctx.nx == BLOCKSIZE_512 {
block512(ctx, ctx.x[:])
ctx.nx = 0
}
data = data[n:]
}
if len(data) >= BLOCKSIZE_512 {
n := len(data) &~ (BLOCKSIZE_512 - 1)
block512(ctx, data[:n])
data = data[n:]
}
if len(data) > 0 {
ctx.nx = copy(ctx.x[:], data)
}
}
}
final_odin :: proc(ctx: ^$T, hash: []byte) {
when T == Blake256_Context {
tmp: [65]byte
} else when T == Blake512_Context {
tmp: [129]byte
}
nx := u64(ctx.nx)
tmp[0] = 0x80
length := (ctx.t + nx) << 3
when T == Blake256_Context {
if nx == 55 {
if ctx.is224 {
write_additional(ctx, {0x80})
} else {
write_additional(ctx, {0x81})
}
} else {
if nx < 55 {
if nx == 0 {
ctx.nullt = true
}
write_additional(ctx, tmp[0 : 55 - nx])
} else {
write_additional(ctx, tmp[0 : 64 - nx])
write_additional(ctx, tmp[1:56])
ctx.nullt = true
}
if ctx.is224 {
write_additional(ctx, {0x00})
} else {
write_additional(ctx, {0x01})
}
}
for i : uint = 0; i < 8; i += 1 {
tmp[i] = byte(length >> (56 - 8 * i))
}
write_additional(ctx, tmp[0:8])
h := ctx.h[:]
if ctx.is224 {
h = h[0:7]
}
for s, i in h {
hash[i * 4] = byte(s >> 24)
hash[i * 4 + 1] = byte(s >> 16)
hash[i * 4 + 2] = byte(s >> 8)
hash[i * 4 + 3] = byte(s)
}
} else when T == Blake512_Context {
if nx == 111 {
if ctx.is384 {
write_additional(ctx, {0x80})
} else {
write_additional(ctx, {0x81})
}
} else {
if nx < 111 {
if nx == 0 {
ctx.nullt = true
}
write_additional(ctx, tmp[0 : 111 - nx])
} else {
write_additional(ctx, tmp[0 : 128 - nx])
write_additional(ctx, tmp[1:112])
ctx.nullt = true
}
if ctx.is384 {
write_additional(ctx, {0x00})
} else {
write_additional(ctx, {0x01})
}
}
for i : uint = 0; i < 16; i += 1 {
tmp[i] = byte(length >> (120 - 8 * i))
}
write_additional(ctx, tmp[0:16])
h := ctx.h[:]
if ctx.is384 {
h = h[0:6]
}
for s, i in h {
hash[i * 8] = byte(s >> 56)
hash[i * 8 + 1] = byte(s >> 48)
hash[i * 8 + 2] = byte(s >> 40)
hash[i * 8 + 3] = byte(s >> 32)
hash[i * 8 + 4] = byte(s >> 24)
hash[i * 8 + 5] = byte(s >> 16)
hash[i * 8 + 6] = byte(s >> 8)
hash[i * 8 + 7] = byte(s)
}
}
}
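The branch structure of `final_odin` encodes the usual BLAKE padding rule: append 0x80, zero bytes up to offset 55 of a 64-byte block (111 of a 128-byte block for BLAKE-512), one domain byte (0x01 for the full-width variant, 0x00 for the truncated one, merged into a single 0x81/0x80 when only one padding byte fits), then the big-endian bit length. As an illustrative sketch only (the helper name is made up, not part of the library), the padded length for the 256-bit variants works out to:

```odin
package example

import "core:fmt"

// padded_len_256 returns the total size in bytes after BLAKE-256
// padding is applied to an n-byte message; the result is always a
// multiple of the 64-byte block size.
padded_len_256 :: proc(n: int) -> int {
	rem := n % 64
	switch {
	case rem == 55:
		return n + 1 + 8                   // 0x80 and domain byte merge into 0x81
	case rem < 55:
		return n + (55 - rem) + 1 + 8      // padding fits in this block
	case:
		return n + (64 - rem) + 55 + 1 + 8 // padding spills into the next block
	}
}

main :: proc() {
	fmt.println(padded_len_256(0))  // 64: empty message pads to one block
	fmt.println(padded_len_256(55)) // 64: the nx == 55 special case
	fmt.println(padded_len_256(56)) // 128: padding crosses a block boundary
}
```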
write_additional :: proc(ctx: ^$T, data: []byte) {
ctx.t -= u64(len(data)) << 3
update_odin(ctx, data)
}


@@ -6,7 +6,6 @@ package blake2b
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the BLAKE2B hashing algorithm.
BLAKE2B and BLAKE2S share the implementation in the _blake2 package.
@@ -15,49 +14,8 @@ package blake2b
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
import "../_blake2"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_64 = hash_bytes_odin
ctx.hash_file_64 = hash_file_odin
ctx.hash_stream_64 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_BLAKE2B)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -71,22 +29,50 @@ hash_string :: proc(data: string) -> [64]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [64]byte {
hash: [64]byte
ctx: _blake2.Blake2b_Context
cfg: _blake2.Blake2_Config
cfg.size = _blake2.BLAKE2B_SIZE
ctx.cfg = cfg
_blake2.init(&ctx)
_blake2.update(&ctx, data)
_blake2.final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
ctx: _blake2.Blake2b_Context
cfg: _blake2.Blake2_Config
cfg.size = _blake2.BLAKE2B_SIZE
ctx.cfg = cfg
_blake2.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_blake2.update(&ctx, buf[:read])
}
}
_blake2.final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [64]byte{}, false
}
hash :: proc {
@@ -100,87 +86,16 @@ hash :: proc {
Low level API
*/
Blake2b_Context :: _blake2.Blake2b_Context
init :: proc(ctx: ^_blake2.Blake2b_Context) {
_blake2.init(ctx)
}
update :: proc "contextless" (ctx: ^_blake2.Blake2b_Context, data: []byte) {
_blake2.update(ctx, data)
}
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(_blake2.Blake2b_Context); ok {
_blake2.init_odin(&c)
_blake2.update_odin(&c, data)
_blake2.blake2b_final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(_blake2.Blake2b_Context); ok {
_blake2.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_blake2.update_odin(&c, buf[:read])
}
}
_blake2.blake2b_final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_create_blake2b_ctx :: #force_inline proc() {
ctx: _blake2.Blake2b_Context
cfg: _blake2.Blake2_Config
cfg.size = _blake2.BLAKE2B_SIZE
ctx.cfg = cfg
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._64
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_blake2b_ctx()
if c, ok := ctx.internal_ctx.(_blake2.Blake2b_Context); ok {
_blake2.init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_blake2.Blake2b_Context); ok {
_blake2.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_blake2.Blake2b_Context); ok {
_blake2.blake2b_final_odin(&c, hash)
}
}
final :: proc "contextless" (ctx: ^_blake2.Blake2b_Context, hash: []byte) {
_blake2.final(ctx, hash)
}
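With the vtable indirection removed, callers invoke the package procedures directly. A minimal usage sketch (the import path is an assumption; note also that unlike `hash_bytes`, the low-level `init` wrapper does not populate `ctx.cfg.size`, so a streaming caller would set it first):

```odin
package example

import "core:fmt"
import "crypto/blake2b" // assumed import path into this library

main :: proc() {
	// One-shot hashing of an in-memory buffer.
	digest := blake2b.hash_bytes(transmute([]byte)string("hello"))
	fmt.printf("%x\n", digest) // 64-byte BLAKE2b digest
}
```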


@@ -6,7 +6,6 @@ package blake2s
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the BLAKE2S hashing algorithm.
BLAKE2B and BLAKE2S share the implementation in the _blake2 package.
@@ -15,49 +14,8 @@ package blake2s
import "core:os"
import "core:io"
import "../_ctx"
import "../_blake2"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_32 = hash_bytes_odin
ctx.hash_file_32 = hash_file_odin
ctx.hash_stream_32 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan does nothing, since Blake2s is not available in Botan
@(warning="Blake2s is not provided by the Botan API. Odin implementation will be used")
use_botan :: #force_inline proc() {
use_odin()
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -71,22 +29,50 @@ hash_string :: proc(data: string) -> [32]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [32]byte {
hash: [32]byte
ctx: _blake2.Blake2s_Context
cfg: _blake2.Blake2_Config
cfg.size = _blake2.BLAKE2S_SIZE
ctx.cfg = cfg
_blake2.init(&ctx)
_blake2.update(&ctx, data)
_blake2.final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
ctx: _blake2.Blake2s_Context
cfg: _blake2.Blake2_Config
cfg.size = _blake2.BLAKE2S_SIZE
ctx.cfg = cfg
_blake2.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_blake2.update(&ctx, buf[:read])
}
}
_blake2.final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [32]byte{}, false
}
hash :: proc {
@@ -100,87 +86,16 @@ hash :: proc {
Low level API
*/
Blake2s_Context :: _blake2.Blake2s_Context
init :: proc(ctx: ^_blake2.Blake2s_Context) {
_blake2.init(ctx)
}
update :: proc "contextless" (ctx: ^_blake2.Blake2s_Context, data: []byte) {
_blake2.update(ctx, data)
}
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_blake2.Blake2s_Context); ok {
_blake2.init_odin(&c)
_blake2.update_odin(&c, data)
_blake2.blake2s_final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_blake2.Blake2s_Context); ok {
_blake2.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_blake2.update_odin(&c, buf[:read])
}
}
_blake2.blake2s_final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
@(private)
_create_blake2s_ctx :: #force_inline proc() {
ctx: _blake2.Blake2s_Context
cfg: _blake2.Blake2_Config
cfg.size = _blake2.BLAKE2S_SIZE
ctx.cfg = cfg
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._32
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_blake2s_ctx()
if c, ok := ctx.internal_ctx.(_blake2.Blake2s_Context); ok {
_blake2.init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_blake2.Blake2s_Context); ok {
_blake2.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_blake2.Blake2s_Context); ok {
_blake2.blake2s_final_odin(&c, hash)
}
}
final :: proc "contextless" (ctx: ^_blake2.Blake2s_Context, hash: []byte) {
_blake2.final(ctx, hash)
}
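The same direct-call pattern applies here. A sketch of the file-hashing path (the import path and file name are assumptions):

```odin
package example

import "core:fmt"
import "core:os"
import "crypto/blake2s" // assumed import path into this library

main :: proc() {
	// Stream the file through the hash in 512-byte chunks
	// (the default load_at_once = false path).
	hd, err := os.open("input.bin")
	if err != os.ERROR_NONE {
		return
	}
	defer os.close(hd)
	if digest, ok := blake2s.hash_file(hd); ok {
		fmt.printf("%x\n", digest) // 32-byte BLAKE2s digest
	}
}
```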

Binary file not shown.


@@ -1,480 +0,0 @@
package botan
/*
Copyright 2021 zhibog
Made available under the BSD-3 license.
List of contributors:
zhibog: Initial creation and testing of the bindings.
Bindings for the Botan crypto library.
Created for version 2.18.1, using the provided FFI header within Botan.
The "botan_" prefix has been stripped from the identifiers to remove redundancy,
since the package is already named botan.
*/
import "core:c"
FFI_ERROR :: #type c.int
FFI_SUCCESS :: FFI_ERROR(0)
FFI_INVALID_VERIFIER :: FFI_ERROR(1)
FFI_ERROR_INVALID_INPUT :: FFI_ERROR(-1)
FFI_ERROR_BAD_MAC :: FFI_ERROR(-2)
FFI_ERROR_INSUFFICIENT_BUFFER_SPACE :: FFI_ERROR(-10)
FFI_ERROR_EXCEPTION_THROWN :: FFI_ERROR(-20)
FFI_ERROR_OUT_OF_MEMORY :: FFI_ERROR(-21)
FFI_ERROR_BAD_FLAG :: FFI_ERROR(-30)
FFI_ERROR_NULL_POINTER :: FFI_ERROR(-31)
FFI_ERROR_BAD_PARAMETER :: FFI_ERROR(-32)
FFI_ERROR_KEY_NOT_SET :: FFI_ERROR(-33)
FFI_ERROR_INVALID_KEY_LENGTH :: FFI_ERROR(-34)
FFI_ERROR_NOT_IMPLEMENTED :: FFI_ERROR(-40)
FFI_ERROR_INVALID_OBJECT :: FFI_ERROR(-50)
FFI_ERROR_UNKNOWN_ERROR :: FFI_ERROR(-100)
FFI_HEX_LOWER_CASE :: 1
CIPHER_INIT_FLAG_MASK_DIRECTION :: 1
CIPHER_INIT_FLAG_ENCRYPT :: 0
CIPHER_INIT_FLAG_DECRYPT :: 1
CIPHER_UPDATE_FLAG_FINAL :: 1 << 0
CHECK_KEY_EXPENSIVE_TESTS :: 1
PRIVKEY_EXPORT_FLAG_DER :: 0
PRIVKEY_EXPORT_FLAG_PEM :: 1
PUBKEY_DER_FORMAT_SIGNATURE :: 1
FPE_FLAG_FE1_COMPAT_MODE :: 1
x509_cert_key_constraints :: #type c.int
NO_CONSTRAINTS :: x509_cert_key_constraints(0)
DIGITAL_SIGNATURE :: x509_cert_key_constraints(32768)
NON_REPUDIATION :: x509_cert_key_constraints(16384)
KEY_ENCIPHERMENT :: x509_cert_key_constraints(8192)
DATA_ENCIPHERMENT :: x509_cert_key_constraints(4096)
KEY_AGREEMENT :: x509_cert_key_constraints(2048)
KEY_CERT_SIGN :: x509_cert_key_constraints(1024)
CRL_SIGN :: x509_cert_key_constraints(512)
ENCIPHER_ONLY :: x509_cert_key_constraints(256)
DECIPHER_ONLY :: x509_cert_key_constraints(128)
HASH_SHA1 :: "SHA1"
HASH_SHA_224 :: "SHA-224"
HASH_SHA_256 :: "SHA-256"
HASH_SHA_384 :: "SHA-384"
HASH_SHA_512 :: "SHA-512"
HASH_SHA3_224 :: "SHA-3(224)"
HASH_SHA3_256 :: "SHA-3(256)"
HASH_SHA3_384 :: "SHA-3(384)"
HASH_SHA3_512 :: "SHA-3(512)"
HASH_SHAKE_128 :: "SHAKE-128"
HASH_SHAKE_256 :: "SHAKE-256"
HASH_KECCAK_224 :: "KECCAK(224)"
HASH_KECCAK_256 :: "KECCAK(256)"
HASH_KECCAK_384 :: "KECCAK(384)"
HASH_KECCAK_512 :: "KECCAK(512)"
HASH_RIPEMD_160 :: "RIPEMD-160"
HASH_WHIRLPOOL :: "Whirlpool"
HASH_BLAKE2B :: "BLAKE2b"
HASH_MD4 :: "MD4"
HASH_MD5 :: "MD5"
HASH_TIGER_128 :: "Tiger(16,3)"
HASH_TIGER_160 :: "Tiger(20,3)"
HASH_TIGER_192 :: "Tiger(24,3)"
HASH_GOST :: "GOST-34.11"
HASH_STREEBOG_256 :: "Streebog-256"
HASH_STREEBOG_512 :: "Streebog-512"
HASH_SM3 :: "SM3"
HASH_SKEIN_512_256 :: "Skein-512(256)"
HASH_SKEIN_512_512 :: "Skein-512(512)"
HASH_SKEIN_512_1024 :: "Skein-512(1024)"
// Not real values from Botan, only used for context setup within the crypto lib
HASH_SHA2 :: "SHA2"
HASH_SHA3 :: "SHA3"
HASH_SHAKE :: "SHAKE"
HASH_KECCAK :: "KECCAK"
HASH_STREEBOG :: "STREEBOG"
HASH_TIGER :: "TIGER"
HASH_SKEIN_512 :: "SKEIN_512"
MAC_HMAC_SHA1 :: "HMAC(SHA1)"
MAC_HMAC_SHA_224 :: "HMAC(SHA-224)"
MAC_HMAC_SHA_256 :: "HMAC(SHA-256)"
MAC_HMAC_SHA_384 :: "HMAC(SHA-384)"
MAC_HMAC_SHA_512 :: "HMAC(SHA-512)"
MAC_HMAC_MD5 :: "HMAC(MD5)"
hash_struct :: struct{}
hash_t :: ^hash_struct
rng_struct :: struct{}
rng_t :: ^rng_struct
mac_struct :: struct{}
mac_t :: ^mac_struct
cipher_struct :: struct{}
cipher_t :: ^cipher_struct
block_cipher_struct :: struct{}
block_cipher_t :: ^block_cipher_struct
mp_struct :: struct{}
mp_t :: ^mp_struct
privkey_struct :: struct{}
privkey_t :: ^privkey_struct
pubkey_struct :: struct{}
pubkey_t :: ^pubkey_struct
pk_op_encrypt_struct :: struct{}
pk_op_encrypt_t :: ^pk_op_encrypt_struct
pk_op_decrypt_struct :: struct{}
pk_op_decrypt_t :: ^pk_op_decrypt_struct
pk_op_sign_struct :: struct{}
pk_op_sign_t :: ^pk_op_sign_struct
pk_op_verify_struct :: struct{}
pk_op_verify_t :: ^pk_op_verify_struct
pk_op_ka_struct :: struct{}
pk_op_ka_t :: ^pk_op_ka_struct
x509_cert_struct :: struct{}
x509_cert_t :: ^x509_cert_struct
x509_crl_struct :: struct{}
x509_crl_t :: ^x509_crl_struct
hotp_struct :: struct{}
hotp_t :: ^hotp_struct
totp_struct :: struct{}
totp_t :: ^totp_struct
fpe_struct :: struct{}
fpe_t :: ^fpe_struct
when ODIN_OS == "windows" {
foreign import botan_lib "botan.lib"
} else when ODIN_OS == "linux" {
foreign import botan_lib "system:botan-2"
} else when ODIN_OS == "darwin" {
foreign import botan_lib "system:botan-2"
}
@(default_calling_convention="c")
@(link_prefix="botan_")
foreign botan_lib {
error_description :: proc(err: c.int) -> cstring ---
ffi_api_version :: proc() -> c.int ---
ffi_supports_api :: proc(api_version: c.int) -> c.int ---
version_string :: proc() -> cstring ---
version_major :: proc() -> c.int ---
version_minor :: proc() -> c.int ---
version_patch :: proc() -> c.int ---
version_datestamp :: proc() -> c.int ---
constant_time_compare :: proc(x, y: ^c.char, length: c.size_t) -> c.int ---
same_mem :: proc(x, y: ^c.char, length: c.size_t) -> c.int ---
scrub_mem :: proc(mem: rawptr, bytes: c.size_t) -> c.int ---
hex_encode :: proc(x: ^c.char, length: c.size_t, out: ^c.char, flags: c.uint) -> c.int ---
hex_decode :: proc(hex_str: cstring, in_len: c.size_t, out: ^c.char, out_len: c.size_t) -> c.int ---
base64_encode :: proc(x: ^c.char, length: c.size_t, out: ^c.char, out_len: c.size_t) -> c.int ---
base64_decode :: proc(base64_str: cstring, in_len: c.size_t, out: ^c.char, out_len: c.size_t) -> c.int ---
rng_init :: proc(rng: ^rng_t, rng_type: cstring) -> c.int ---
rng_init_custom :: proc(rng_out: ^rng_t, rng_name: cstring, ctx: rawptr,
get_cb: proc(ctx: rawptr, out: ^c.char, out_len: c.size_t) -> ^c.int,
add_entropy_cb: proc(ctx: rawptr, input: ^c.char, length: c.size_t) -> ^c.int,
destroy_cb: proc(ctx: rawptr) -> rawptr) -> c.int ---
rng_get :: proc(rng: rng_t, out: ^c.char, out_len: c.size_t) -> c.int ---
rng_reseed :: proc(rng: rng_t, bits: c.size_t) -> c.int ---
rng_reseed_from_rng :: proc(rng, source_rng: rng_t, bits: c.size_t) -> c.int ---
rng_add_entropy :: proc(rng: rng_t, entropy: ^c.char, entropy_len: c.size_t) -> c.int ---
rng_destroy :: proc(rng: rng_t) -> c.int ---
hash_init :: proc(hash: ^hash_t, hash_name: cstring, flags: c.uint) -> c.int ---
hash_copy_state :: proc(dest: ^hash_t, source: hash_t) -> c.int ---
hash_output_length :: proc(hash: hash_t, output_length: ^c.size_t) -> c.int ---
hash_block_size :: proc(hash: hash_t, block_size: ^c.size_t) -> c.int ---
hash_update :: proc(hash: hash_t, input: ^c.char, input_len: c.size_t) -> c.int ---
hash_final :: proc(hash: hash_t, out: ^c.char) -> c.int ---
hash_clear :: proc(hash: hash_t) -> c.int ---
hash_destroy :: proc(hash: hash_t) -> c.int ---
hash_name :: proc(hash: hash_t, name: ^c.char, name_len: ^c.size_t) -> c.int ---
mac_init :: proc(mac: ^mac_t, hash_name: cstring, flags: c.uint) -> c.int ---
mac_output_length :: proc(mac: mac_t, output_length: ^c.size_t) -> c.int ---
mac_set_key :: proc(mac: mac_t, key: ^c.char, key_len: c.size_t) -> c.int ---
mac_update :: proc(mac: mac_t, buf: ^c.char, length: c.size_t) -> c.int ---
mac_final :: proc(mac: mac_t, out: ^c.char) -> c.int ---
mac_clear :: proc(mac: mac_t) -> c.int ---
mac_name :: proc(mac: mac_t, name: ^c.char, name_len: ^c.size_t) -> c.int ---
mac_get_keyspec :: proc(mac: mac_t, out_minimum_keylength, out_maximum_keylength, out_keylength_modulo: ^c.size_t) -> c.int ---
mac_destroy :: proc(mac: mac_t) -> c.int ---
cipher_init :: proc(cipher: ^cipher_t, name: cstring, flags: c.uint) -> c.int ---
cipher_name :: proc(cipher: cipher_t, name: ^c.char, name_len: ^c.size_t) -> c.int ---
cipher_output_length :: proc(cipher: cipher_t, output_length: ^c.size_t) -> c.int ---
cipher_valid_nonce_length :: proc(cipher: cipher_t, nl: c.size_t) -> c.int ---
cipher_get_tag_length :: proc(cipher: cipher_t, tag_size: ^c.size_t) -> c.int ---
cipher_get_default_nonce_length :: proc(cipher: cipher_t, nl: ^c.size_t) -> c.int ---
cipher_get_update_granularity :: proc(cipher: cipher_t, ug: ^c.size_t) -> c.int ---
cipher_query_keylen :: proc(cipher: cipher_t, out_minimum_keylength, out_maximum_keylength: ^c.size_t) -> c.int ---
cipher_get_keyspec :: proc(cipher: cipher_t, min_keylen, max_keylen, mod_keylen: ^c.size_t) -> c.int ---
cipher_set_key :: proc(cipher: cipher_t, key: ^c.char, key_len: c.size_t) -> c.int ---
cipher_reset :: proc(cipher: cipher_t) -> c.int ---
cipher_set_associated_data :: proc(cipher: cipher_t, ad: ^c.char, ad_len: c.size_t) -> c.int ---
cipher_start :: proc(cipher: cipher_t, nonce: ^c.char, nonce_len: c.size_t) -> c.int ---
cipher_update :: proc(cipher: cipher_t, flags: c.uint, output: ^c.char, output_size: c.size_t, output_written: ^c.size_t,
input_bytes: ^c.char, input_size: c.size_t, input_consumed: ^c.size_t) -> c.int ---
cipher_clear :: proc(hash: cipher_t) -> c.int ---
cipher_destroy :: proc(cipher: cipher_t) -> c.int ---
@(deprecated="Use botan.pwdhash")
pbkdf :: proc(pbkdf_algo: cstring, out: ^c.char, out_len: c.size_t, passphrase: cstring, salt: ^c.char,
salt_len, iterations: c.size_t) -> c.int ---
@(deprecated="Use botan.pwdhash_timed")
pbkdf_timed :: proc(pbkdf_algo: cstring, out: ^c.char, out_len: c.size_t, passphrase: cstring, salt: ^c.char,
salt_len, milliseconds_to_run: c.size_t, out_iterations_used: ^c.size_t) -> c.int ---
pwdhash :: proc(algo: cstring, param1, param2, param3: c.size_t, out: ^c.char, out_len: c.size_t, passphrase: cstring,
passphrase_len: c.size_t, salt: ^c.char, salt_len: c.size_t) -> c.int ---
pwdhash_timed :: proc(algo: cstring, msec: c.uint, param1, param2, param3: c.size_t, out: ^c.char, out_len: c.size_t,
passphrase: cstring, passphrase_len: c.size_t, salt: ^c.char, salt_len: c.size_t) -> c.int ---
@(deprecated="Use botan.pwdhash")
scrypt :: proc(out: ^c.char, out_len: c.size_t, passphrase: cstring, salt: ^c.char, salt_len, N, r, p: c.size_t) -> c.int ---
kdf :: proc(kdf_algo: cstring, out: ^c.char, out_len: c.size_t, secret: ^c.char, secret_lent: c.size_t, salt: ^c.char,
salt_len: c.size_t, label: ^c.char, label_len: c.size_t) -> c.int ---
block_cipher_init :: proc(bc: ^block_cipher_t, name: cstring) -> c.int ---
block_cipher_destroy :: proc(bc: block_cipher_t) -> c.int ---
block_cipher_clear :: proc(bc: block_cipher_t) -> c.int ---
block_cipher_set_key :: proc(bc: block_cipher_t, key: ^c.char, key_len: c.size_t) -> c.int ---
block_cipher_block_size :: proc(bc: block_cipher_t) -> c.int ---
block_cipher_encrypt_blocks :: proc(bc: block_cipher_t, input, out: ^c.char, blocks: c.size_t) -> c.int ---
block_cipher_decrypt_blocks :: proc(bc: block_cipher_t, input, out: ^c.char, blocks: c.size_t) -> c.int ---
block_cipher_name :: proc(bc: block_cipher_t, name: ^c.char, name_len: ^c.size_t) -> c.int ---
block_cipher_get_keyspec :: proc(bc: block_cipher_t, out_minimum_keylength, out_maximum_keylength, out_keylength_modulo: ^c.size_t) -> c.int ---
mp_init :: proc(mp: ^mp_t) -> c.int ---
mp_destroy :: proc(mp: mp_t) -> c.int ---
mp_to_hex :: proc(mp: mp_t, out: ^c.char) -> c.int ---
mp_to_str :: proc(mp: mp_t, base: c.char, out: ^c.char, out_len: ^c.size_t) -> c.int ---
mp_clear :: proc(mp: mp_t) -> c.int ---
mp_set_from_int :: proc(mp: mp_t, initial_value: c.int) -> c.int ---
mp_set_from_mp :: proc(dest, source: mp_t) -> c.int ---
mp_set_from_str :: proc(dest: mp_t, str: cstring) -> c.int ---
mp_set_from_radix_str :: proc(mp: mp_t, str: cstring, radix: c.size_t) -> c.int ---
mp_num_bits :: proc(n: mp_t, bits: ^c.size_t) -> c.int ---
mp_num_bytes :: proc(n: mp_t, bytes: ^c.size_t) -> c.int ---
mp_to_bin :: proc(mp: mp_t, vec: ^c.char) -> c.int ---
mp_from_bin :: proc(mp: mp_t, vec: ^c.char, vec_len: c.size_t) -> c.int ---
mp_to_uint32 :: proc(mp: mp_t, val: ^c.uint) -> c.int ---
mp_is_positive :: proc(mp: mp_t) -> c.int ---
mp_is_negative :: proc(mp: mp_t) -> c.int ---
mp_flip_sign :: proc(mp: mp_t) -> c.int ---
mp_is_zero :: proc(mp: mp_t) -> c.int ---
@(deprecated="Use botan.mp_get_bit(0)")
mp_is_odd :: proc(mp: mp_t) -> c.int ---
@(deprecated="Use botan.mp_get_bit(0)")
mp_is_even :: proc(mp: mp_t) -> c.int ---
mp_add_u32 :: proc(result, x: mp_t, y: c.uint) -> c.int ---
mp_sub_u32 :: proc(result, x: mp_t, y: c.uint) -> c.int ---
mp_add :: proc(result, x, y: mp_t) -> c.int ---
mp_sub :: proc(result, x, y: mp_t) -> c.int ---
mp_mul :: proc(result, x, y: mp_t) -> c.int ---
mp_div :: proc(quotient, remainder, x, y: mp_t) -> c.int ---
mp_mod_mul :: proc(result, x, y, mod: mp_t) -> c.int ---
mp_equal :: proc(x, y: mp_t) -> c.int ---
mp_cmp :: proc(result: ^c.int, x, y: mp_t) -> c.int ---
mp_swap :: proc(x, y: mp_t) -> c.int ---
mp_powmod :: proc(out, base, exponent, modulus: mp_t) -> c.int ---
mp_lshift :: proc(out, input: mp_t, shift: c.size_t) -> c.int ---
mp_rshift :: proc(out, input: mp_t, shift: c.size_t) -> c.int ---
mp_mod_inverse :: proc(out, input, modulus: mp_t) -> c.int ---
mp_rand_bits :: proc(rand_out: mp_t, rng: rng_t, bits: c.size_t) -> c.int ---
mp_rand_range :: proc(rand_out: mp_t, rng: rng_t, lower_bound, upper_bound: mp_t) -> c.int ---
mp_gcd :: proc(out, x, y: mp_t) -> c.int ---
mp_is_prime :: proc(n: mp_t, rng: rng_t, test_prob: c.size_t) -> c.int ---
mp_get_bit :: proc(n: mp_t, bit: c.size_t) -> c.int ---
mp_set_bit :: proc(n: mp_t, bit: c.size_t) -> c.int ---
mp_clear_bit :: proc(n: mp_t, bit: c.size_t) -> c.int ---
bcrypt_generate :: proc(out: ^c.char, out_len: ^c.size_t, password: cstring, rng: rng_t, work_factor: c.size_t, flags: c.uint) -> c.int ---
bcrypt_is_valid :: proc(pass, hash: cstring) -> c.int ---
privkey_create :: proc(key: ^privkey_t, algo_name, algo_params: cstring, rng: rng_t) -> c.int ---
privkey_check_key :: proc(key: privkey_t, rng: rng_t, flags: c.uint) -> c.int ---
@(deprecated="Use botan.privkey_create")
privkey_create_rsa :: proc(key: ^privkey_t, rng: rng_t, bits: c.size_t) -> c.int ---
@(deprecated="Use botan.privkey_create")
privkey_create_ecdsa :: proc(key: ^privkey_t, rng: rng_t, params: cstring) -> c.int ---
@(deprecated="Use botan.privkey_create")
privkey_create_ecdh :: proc(key: ^privkey_t, rng: rng_t, params: cstring) -> c.int ---
@(deprecated="Use botan.privkey_create")
privkey_create_mceliece :: proc(key: ^privkey_t, rng: rng_t, n, t: c.size_t) -> c.int ---
@(deprecated="Use botan.privkey_create")
privkey_create_dh :: proc(key: ^privkey_t, rng: rng_t, param: cstring) -> c.int ---
privkey_create_dsa :: proc(key: ^privkey_t, rng: rng_t, pbits, qbits: c.size_t) -> c.int ---
privkey_create_elgamal :: proc(key: ^privkey_t, rng: rng_t, pbits, qbits: c.size_t) -> c.int ---
privkey_load :: proc(key: ^privkey_t, rng: rng_t, bits: ^c.char, length: c.size_t, password: cstring) -> c.int ---
privkey_destroy :: proc(key: privkey_t) -> c.int ---
privkey_export :: proc(key: privkey_t, out: ^c.char, out_len: ^c.size_t, flags: c.uint) -> c.int ---
privkey_algo_name :: proc(key: privkey_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
@(deprecated="Use botan.privkey_export_encrypted_pbkdf_{msec,iter}")
privkey_export_encrypted :: proc(key: privkey_t, out: ^c.char, out_len: ^c.size_t, rng: rng_t, passphrase, encryption_algo: cstring, flags: c.uint) -> c.int ---
privkey_export_encrypted_pbkdf_msec :: proc(key: privkey_t, out: ^c.char, out_len: ^c.size_t, rng: rng_t, passphrase: cstring, pbkdf_msec_runtime: c.uint,
pbkdf_iterations_out: ^c.size_t, cipher_algo, pbkdf_algo: cstring, flags: c.uint) -> c.int ---
privkey_export_encrypted_pbkdf_iter :: proc(key: privkey_t, out: ^c.char, out_len: ^c.size_t, rng: rng_t, passphrase: cstring, pbkdf_iterations: c.size_t,
cipher_algo, pbkdf_algo: cstring, flags: c.uint) -> c.int ---
pubkey_load :: proc(key: ^pubkey_t, bits: ^c.char, length: c.size_t) -> c.int ---
privkey_export_pubkey :: proc(out: ^pubkey_t, input: privkey_t) -> c.int ---
pubkey_export :: proc(key: pubkey_t, out: ^c.char, out_len: ^c.size_t, flags: c.uint) -> c.int ---
pubkey_algo_name :: proc(key: pubkey_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
pubkey_check_key :: proc(key: pubkey_t, rng: rng_t, flags: c.uint) -> c.int ---
pubkey_estimated_strength :: proc(key: pubkey_t, estimate: ^c.size_t) -> c.int ---
pubkey_fingerprint :: proc(key: pubkey_t, hash: cstring, out: ^c.char, out_len: ^c.size_t) -> c.int ---
pubkey_destroy :: proc(key: pubkey_t) -> c.int ---
pubkey_get_field :: proc(output: mp_t, key: pubkey_t, field_name: cstring) -> c.int ---
privkey_get_field :: proc(output: mp_t, key: privkey_t, field_name: cstring) -> c.int ---
privkey_load_rsa :: proc(key: ^privkey_t, p, q, e: mp_t) -> c.int ---
privkey_load_rsa_pkcs1 :: proc(key: ^privkey_t, bits: ^c.char, length: c.size_t) -> c.int ---
@(deprecated="Use botan.privkey_get_field")
privkey_rsa_get_p :: proc(p: mp_t, rsa_key: privkey_t) -> c.int ---
@(deprecated="Use botan.privkey_get_field")
privkey_rsa_get_q :: proc(q: mp_t, rsa_key: privkey_t) -> c.int ---
@(deprecated="Use botan.privkey_get_field")
privkey_rsa_get_d :: proc(d: mp_t, rsa_key: privkey_t) -> c.int ---
@(deprecated="Use botan.privkey_get_field")
privkey_rsa_get_n :: proc(n: mp_t, rsa_key: privkey_t) -> c.int ---
@(deprecated="Use botan.privkey_get_field")
privkey_rsa_get_e :: proc(e: mp_t, rsa_key: privkey_t) -> c.int ---
privkey_rsa_get_privkey :: proc(rsa_key: privkey_t, out: ^c.char, out_len: ^c.size_t, flags: c.uint) -> c.int ---
pubkey_load_rsa :: proc(key: ^pubkey_t, n, e: mp_t) -> c.int ---
@(deprecated="Use botan.pubkey_get_field")
pubkey_rsa_get_e :: proc(e: mp_t, rsa_key: pubkey_t) -> c.int ---
@(deprecated="Use botan.pubkey_get_field")
pubkey_rsa_get_n :: proc(n: mp_t, rsa_key: pubkey_t) -> c.int ---
privkey_load_dsa :: proc(key: ^privkey_t, p, q, g, x: mp_t) -> c.int ---
pubkey_load_dsa :: proc(key: ^pubkey_t, p, q, g, y: mp_t) -> c.int ---
@(deprecated="Use botan.privkey_get_field")
privkey_dsa_get_x :: proc(x: mp_t, key: privkey_t) -> c.int ---
@(deprecated="Use botan.pubkey_get_field")
pubkey_dsa_get_p :: proc(p: mp_t, key: pubkey_t) -> c.int ---
@(deprecated="Use botan.pubkey_get_field")
pubkey_dsa_get_q :: proc(q: mp_t, key: pubkey_t) -> c.int ---
@(deprecated="Use botan.pubkey_get_field")
pubkey_dsa_get_g :: proc(g: mp_t, key: pubkey_t) -> c.int ---
@(deprecated="Use botan.pubkey_get_field")
pubkey_dsa_get_y :: proc(y: mp_t, key: pubkey_t) -> c.int ---
privkey_load_dh :: proc(key: ^privkey_t, p, g, x: mp_t) -> c.int ---
pubkey_load_dh :: proc(key: ^pubkey_t, p, g, y: mp_t) -> c.int ---
privkey_load_elgamal :: proc(key: ^privkey_t, p, g, x: mp_t) -> c.int ---
pubkey_load_elgamal :: proc(key: ^pubkey_t, p, g, y: mp_t) -> c.int ---
privkey_load_ed25519 :: proc(key: ^privkey_t, privkey: [32]c.char) -> c.int ---
pubkey_load_ed25519 :: proc(key: ^pubkey_t, pubkey: [32]c.char) -> c.int ---
privkey_ed25519_get_privkey :: proc(key: privkey_t, output: [64]c.char) -> c.int ---
pubkey_ed25519_get_pubkey :: proc(key: pubkey_t, pubkey: [32]c.char) -> c.int ---
privkey_load_x25519 :: proc(key: ^privkey_t, privkey: [32]c.char) -> c.int ---
pubkey_load_x25519 :: proc(key: ^pubkey_t, pubkey: [32]c.char) -> c.int ---
privkey_x25519_get_privkey :: proc(key: privkey_t, output: [32]c.char) -> c.int ---
pubkey_x25519_get_pubkey :: proc(key: pubkey_t, pubkey: [32]c.char) -> c.int ---
privkey_load_ecdsa :: proc(key: ^privkey_t, scalar: mp_t, curve_name: cstring) -> c.int ---
pubkey_load_ecdsa :: proc(key: ^pubkey_t, public_x, public_y: mp_t, curve_name: cstring) -> c.int ---
pubkey_load_ecdh :: proc(key: ^pubkey_t, public_x, public_y: mp_t, curve_name: cstring) -> c.int ---
privkey_load_ecdh :: proc(key: ^privkey_t, scalar: mp_t, curve_name: cstring) -> c.int ---
pubkey_load_sm2 :: proc(key: ^pubkey_t, public_x, public_y: mp_t, curve_name: cstring) -> c.int ---
privkey_load_sm2 :: proc(key: ^privkey_t, scalar: mp_t, curve_name: cstring) -> c.int ---
@(deprecated="Use botan.pubkey_load_sm2")
pubkey_load_sm2_enc :: proc(key: ^pubkey_t, public_x, public_y: mp_t, curve_name: cstring) -> c.int ---
@(deprecated="Use botan.privkey_load_sm2")
privkey_load_sm2_enc :: proc(key: ^privkey_t, scalar: mp_t, curve_name: cstring) -> c.int ---
pubkey_sm2_compute_za :: proc(out: ^c.char, out_len: ^c.size_t, ident, hash_algo: cstring, key: pubkey_t) -> c.int ---
pk_op_encrypt_create :: proc(op: ^pk_op_encrypt_t, key: pubkey_t, padding: cstring, flags: c.uint) -> c.int ---
pk_op_encrypt_destroy :: proc(op: pk_op_encrypt_t) -> c.int ---
pk_op_encrypt_output_length :: proc(op: pk_op_encrypt_t, ptext_len: c.size_t, ctext_len: ^c.size_t) -> c.int ---
pk_op_encrypt :: proc(op: pk_op_encrypt_t, rng: rng_t, out: ^c.char, out_len: ^c.size_t, plaintext: ^c.char, plaintext_len: c.size_t) -> c.int ---
pk_op_decrypt_create :: proc(op: ^pk_op_decrypt_t, key: privkey_t, padding: cstring, flags: c.uint) -> c.int ---
pk_op_decrypt_destroy :: proc(op: pk_op_decrypt_t) -> c.int ---
pk_op_decrypt_output_length :: proc(op: pk_op_decrypt_t, ctext_len: c.size_t, ptext_len: ^c.size_t) -> c.int ---
pk_op_decrypt :: proc(op: pk_op_decrypt_t, out: ^c.char, out_len: ^c.size_t, ciphertext: ^c.char, ciphertext_len: c.size_t) -> c.int ---
pk_op_sign_create :: proc(op: ^pk_op_sign_t, key: privkey_t, hash_and_padding: cstring, flags: c.uint) -> c.int ---
pk_op_sign_destroy :: proc(op: pk_op_sign_t) -> c.int ---
pk_op_sign_output_length :: proc(op: pk_op_sign_t, olen: ^c.size_t) -> c.int ---
pk_op_sign_update :: proc(op: pk_op_sign_t, input: ^c.char, input_len: c.size_t) -> c.int ---
pk_op_sign_finish :: proc(op: pk_op_sign_t, rng: rng_t, sig: ^c.char, sig_len: ^c.size_t) -> c.int ---
pk_op_verify_create :: proc(op: ^pk_op_verify_t, key: pubkey_t, hash_and_padding: cstring, flags: c.uint) -> c.int ---
pk_op_verify_destroy :: proc(op: pk_op_verify_t) -> c.int ---
pk_op_verify_update :: proc(op: pk_op_verify_t, input: ^c.char, input_len: c.size_t) -> c.int ---
pk_op_verify_finish :: proc(op: pk_op_verify_t, sig: ^c.char, sig_len: c.size_t) -> c.int ---
pk_op_key_agreement_create :: proc(op: ^pk_op_ka_t, key: privkey_t, kdf: cstring, flags: c.uint) -> c.int ---
pk_op_key_agreement_destroy :: proc(op: pk_op_ka_t) -> c.int ---
pk_op_key_agreement_export_public :: proc(key: privkey_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
pk_op_key_agreement_size :: proc(op: pk_op_ka_t, out_len: ^c.size_t) -> c.int ---
pk_op_key_agreement :: proc(op: pk_op_ka_t, out: ^c.char, out_len: ^c.size_t, other_key: ^c.char, other_key_len: c.size_t, salt: ^c.char,
salt_len: c.size_t) -> c.int ---
pkcs_hash_id :: proc(hash_name: cstring, pkcs_id: ^c.char, pkcs_id_len: ^c.size_t) -> c.int ---
@(deprecated="Poorly specified, avoid in new code")
mceies_encrypt :: proc(mce_key: pubkey_t, rng: rng_t, aead: cstring, pt: ^c.char, pt_len: c.size_t, ad: ^c.char, ad_len: c.size_t,
ct: ^c.char, ct_len: ^c.size_t) -> c.int ---
@(deprecated="Poorly specified, avoid in new code")
mceies_decrypt :: proc(mce_key: privkey_t, aead: cstring, ct: ^c.char, ct_len: c.size_t, ad: ^c.char, ad_len: c.size_t, pt: ^c.char,
pt_len: ^c.size_t) -> c.int ---
x509_cert_load :: proc(cert_obj: ^x509_cert_t, cert: ^c.char, cert_len: c.size_t) -> c.int ---
x509_cert_load_file :: proc(cert_obj: ^x509_cert_t, filename: cstring) -> c.int ---
x509_cert_destroy :: proc(cert: x509_cert_t) -> c.int ---
x509_cert_dup :: proc(new_cert: ^x509_cert_t, cert: x509_cert_t) -> c.int ---
x509_cert_get_time_starts :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_time_expires :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_not_before :: proc(cert: x509_cert_t, time_since_epoch: ^c.ulonglong) -> c.int ---
x509_cert_not_after :: proc(cert: x509_cert_t, time_since_epoch: ^c.ulonglong) -> c.int ---
x509_cert_get_fingerprint :: proc(cert: x509_cert_t, hash: cstring, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_serial_number :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_authority_key_id :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_subject_key_id :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_public_key_bits :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_public_key :: proc(cert: x509_cert_t, key: ^pubkey_t) -> c.int ---
x509_cert_get_issuer_dn :: proc(cert: x509_cert_t, key: ^c.char, index: c.size_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_get_subject_dn :: proc(cert: x509_cert_t, key: ^c.char, index: c.size_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_to_string :: proc(cert: x509_cert_t, out: ^c.char, out_len: ^c.size_t) -> c.int ---
x509_cert_allowed_usage :: proc(cert: x509_cert_t, key_usage: c.uint) -> c.int ---
x509_cert_hostname_match :: proc(cert: x509_cert_t, hostname: cstring) -> c.int ---
x509_cert_verify :: proc(validation_result: ^c.int, cert: x509_cert_t, intermediates: ^x509_cert_t, intermediates_len: c.size_t, trusted: ^x509_cert_t,
trusted_len: c.size_t, trusted_path: cstring, required_strength: c.size_t, hostname: cstring, reference_time: c.ulonglong) -> c.int ---
x509_cert_validation_status :: proc(code: c.int) -> cstring ---
x509_crl_load_file :: proc(crl_obj: ^x509_crl_t, crl_path: cstring) -> c.int ---
x509_crl_load :: proc(crl_obj: ^x509_crl_t, crl_bits: ^c.char, crl_bits_len: c.size_t) -> c.int ---
x509_crl_destroy :: proc(crl: x509_crl_t) -> c.int ---
x509_is_revoked :: proc(crl: x509_crl_t, cert: x509_cert_t) -> c.int ---
x509_cert_verify_with_crl :: proc(validation_result: ^c.int, cert: x509_cert_t, intermediates: ^x509_cert_t, intermediates_len: c.size_t, trusted: ^x509_cert_t,
trusted_len: c.size_t, crls: ^x509_crl_t, crls_len: c.size_t, trusted_path: cstring, required_strength: c.size_t,
hostname: cstring, reference_time: c.ulonglong) -> c.int ---
key_wrap3394 :: proc(key: ^c.char, key_len: c.size_t, kek: ^c.char, kek_len: c.size_t, wrapped_key: ^c.char, wrapped_key_len: ^c.size_t) -> c.int ---
key_unwrap3394 :: proc(wrapped_key: ^c.char, wrapped_key_len: c.size_t, kek: ^c.char, kek_len: c.size_t, key: ^c.char, key_len: ^c.size_t) -> c.int ---
hotp_init :: proc(hotp: ^hotp_t, key: ^c.char, key_len: c.size_t, hash_algo: cstring, digits: c.size_t) -> c.int ---
hotp_destroy :: proc(hotp: hotp_t) -> c.int ---
hotp_generate :: proc(hotp: hotp_t, hotp_code: ^c.uint, hotp_counter: c.ulonglong) -> c.int ---
hotp_check :: proc(hotp: hotp_t, next_hotp_counter: ^c.ulonglong, hotp_code: c.uint, hotp_counter: c.ulonglong, resync_range: c.size_t) -> c.int ---
totp_init :: proc(totp: ^totp_t, key: ^c.char, key_len: c.size_t, hash_algo: cstring, digits, time_step: c.size_t) -> c.int ---
totp_destroy :: proc(totp: totp_t) -> c.int ---
totp_generate :: proc(totp: totp_t, totp_code: ^c.uint, timestamp: c.ulonglong) -> c.int ---
totp_check :: proc(totp: totp_t, totp_code: ^c.uint, timestamp: c.ulonglong, acceptable_clock_drift: c.size_t) -> c.int ---
fpe_fe1_init :: proc(fpe: ^fpe_t, n: mp_t, key: ^c.char, key_len, rounds: c.size_t, flags: c.uint) -> c.int ---
fpe_destroy :: proc(fpe: fpe_t) -> c.int ---
fpe_encrypt :: proc(fpe: fpe_t, x: mp_t, tweak: ^c.char, tweak_len: c.size_t) -> c.int ---
fpe_decrypt :: proc(fpe: fpe_t, x: mp_t, tweak: ^c.char, tweak_len: c.size_t) -> c.int ---
}
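As a quick orientation for the `mp_*` bindings above, the following sketch shows the usual call pattern (create, operate, query, destroy). It is illustrative only: it assumes it lives inside this `botan` package, that `mp_init`/`mp_destroy` are bound earlier in the file, and it omits the return-code checks that real code should perform on every call.

```odin
import "core:c"
import "core:fmt"

// Illustrative sketch: multiply two big integers and report the bit length
// of the product. Every mp_* call returns a c.int status that production
// code must check; it is ignored here for brevity.
mp_example :: proc() {
	x, y, r: mp_t
	mp_init(&x); mp_init(&y); mp_init(&r)
	defer { mp_destroy(x); mp_destroy(y); mp_destroy(r) }

	mp_set_from_str(x, "123456789")
	mp_set_from_str(y, "987654321")
	mp_mul(r, x, y)

	bits: c.size_t
	mp_num_bits(r, &bits)
	fmt.println("product uses", bits, "bits")
}
```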


@@ -1,498 +0,0 @@
package botan
/*
Copyright 2021 zhibog
Made available under the BSD-3 license.
List of contributors:
zhibog: Initial creation and testing of the bindings.
Implementation of the context for the Botan side.
*/
import "core:os"
import "core:io"
import "core:fmt"
import "core:strings"
import "../_ctx"
hash_bytes_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._16, 16), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [20]byte {
hash: [20]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._20, 20), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [24]byte {
hash: [24]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._24, 24), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
hash: [28]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._28, 28), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._32, 32), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
hash: [48]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._48, 48), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._64, 64), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_128 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [128]byte {
hash: [128]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._128, 128), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash
}
hash_bytes_slice :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
hash := make([]byte, bit_size, allocator)
c: hash_t
hash_init(&c, _check_ctx(ctx, nil, bit_size), 0)
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
hash_final(c, &hash[0])
hash_destroy(c)
return hash[:]
}
hash_file_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_16(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_16(ctx, buf[:]), ok
}
}
return [16]byte{}, false
}
hash_file_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
if !load_at_once {
return hash_stream_20(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_20(ctx, buf[:]), ok
}
}
return [20]byte{}, false
}
hash_file_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([24]byte, bool) {
if !load_at_once {
return hash_stream_24(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_24(ctx, buf[:]), ok
}
}
return [24]byte{}, false
}
hash_file_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
if !load_at_once {
return hash_stream_28(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_28(ctx, buf[:]), ok
}
}
return [28]byte{}, false
}
hash_file_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
hash_file_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
if !load_at_once {
return hash_stream_48(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_48(ctx, buf[:]), ok
}
}
return [48]byte{}, false
}
hash_file_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
hash_file_128 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([128]byte, bool) {
if !load_at_once {
return hash_stream_128(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_128(ctx, buf[:]), ok
}
}
return [128]byte{}, false
}
hash_file_slice :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
if !load_at_once {
return hash_stream_slice(ctx, os.stream_from_handle(hd), bit_size, allocator)
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_slice(ctx, buf[:], bit_size, allocator), ok
}
}
return nil, false
}
hash_stream_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._16, 16), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([20]byte, bool) {
hash: [20]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._20, 20), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([24]byte, bool) {
hash: [24]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._24, 24), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([28]byte, bool) {
hash: [28]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._28, 28), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._32, 32), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([48]byte, bool) {
hash: [48]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._48, 48), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._64, 64), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_128 :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream) -> ([128]byte, bool) {
hash: [128]byte
c: hash_t
hash_init(&c, _check_ctx(ctx, _ctx.Hash_Size._128, 128), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash, true
}
hash_stream_slice :: #force_inline proc(ctx: ^_ctx.Hash_Context, s: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
hash := make([]byte, bit_size, allocator)
c: hash_t
hash_init(&c, _check_ctx(ctx, nil, bit_size), 0)
buf := make([]byte, 512)
defer delete(buf)
i := 1
for i > 0 {
i, _ = s->impl_read(buf)
if i > 0 {
hash_update(c, len(buf) == 0 ? nil : &buf[0], uint(i))
}
}
hash_final(c, &hash[0])
hash_destroy(c)
return hash[:], true
}
init :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
c: hash_t
hash_init(&c, ctx.botan_hash_algo, 0)
ctx.external_ctx = c
}
update :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.external_ctx.(hash_t); ok {
hash_update(c, len(data) == 0 ? nil : &data[0], uint(len(data)))
}
}
final :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.external_ctx.(hash_t); ok {
hash_final(c, &hash[0])
hash_destroy(c)
}
}
assign_hash_vtable :: proc(ctx: ^_ctx.Hash_Context, hash_algo: cstring) {
ctx.init = init
ctx.update = update
ctx.final = final
ctx.botan_hash_algo = hash_algo
switch hash_algo {
case HASH_MD4, HASH_MD5:
ctx.hash_bytes_16 = hash_bytes_16
ctx.hash_file_16 = hash_file_16
ctx.hash_stream_16 = hash_stream_16
case HASH_SHA1, HASH_RIPEMD_160:
ctx.hash_bytes_20 = hash_bytes_20
ctx.hash_file_20 = hash_file_20
ctx.hash_stream_20 = hash_stream_20
case HASH_SHA2, HASH_SHA3, HASH_KECCAK:
ctx.hash_bytes_28 = hash_bytes_28
ctx.hash_file_28 = hash_file_28
ctx.hash_stream_28 = hash_stream_28
ctx.hash_bytes_32 = hash_bytes_32
ctx.hash_file_32 = hash_file_32
ctx.hash_stream_32 = hash_stream_32
ctx.hash_bytes_48 = hash_bytes_48
ctx.hash_file_48 = hash_file_48
ctx.hash_stream_48 = hash_stream_48
ctx.hash_bytes_64 = hash_bytes_64
ctx.hash_file_64 = hash_file_64
ctx.hash_stream_64 = hash_stream_64
case HASH_GOST, HASH_WHIRLPOOL, HASH_SM3:
ctx.hash_bytes_32 = hash_bytes_32
ctx.hash_file_32 = hash_file_32
ctx.hash_stream_32 = hash_stream_32
case HASH_STREEBOG:
ctx.hash_bytes_32 = hash_bytes_32
ctx.hash_file_32 = hash_file_32
ctx.hash_stream_32 = hash_stream_32
ctx.hash_bytes_64 = hash_bytes_64
ctx.hash_file_64 = hash_file_64
ctx.hash_stream_64 = hash_stream_64
case HASH_BLAKE2B:
ctx.hash_bytes_64 = hash_bytes_64
ctx.hash_file_64 = hash_file_64
ctx.hash_stream_64 = hash_stream_64
case HASH_TIGER:
ctx.hash_bytes_16 = hash_bytes_16
ctx.hash_file_16 = hash_file_16
ctx.hash_stream_16 = hash_stream_16
ctx.hash_bytes_20 = hash_bytes_20
ctx.hash_file_20 = hash_file_20
ctx.hash_stream_20 = hash_stream_20
ctx.hash_bytes_24 = hash_bytes_24
ctx.hash_file_24 = hash_file_24
ctx.hash_stream_24 = hash_stream_24
case HASH_SKEIN_512:
ctx.hash_bytes_slice = hash_bytes_slice
ctx.hash_file_slice = hash_file_slice
ctx.hash_stream_slice = hash_stream_slice
}
}
_check_ctx :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash_size: _ctx.Hash_Size, hash_size_val: int) -> cstring {
ctx.hash_size = hash_size
ctx.hash_size_val = hash_size_val
switch ctx.botan_hash_algo {
case HASH_SHA2:
#partial switch hash_size {
case ._28: return HASH_SHA_224
case ._32: return HASH_SHA_256
case ._48: return HASH_SHA_384
case ._64: return HASH_SHA_512
}
case HASH_SHA3:
#partial switch hash_size {
case ._28: return HASH_SHA3_224
case ._32: return HASH_SHA3_256
case ._48: return HASH_SHA3_384
case ._64: return HASH_SHA3_512
}
case HASH_KECCAK:
#partial switch hash_size {
case ._28: return HASH_KECCAK_224
case ._32: return HASH_KECCAK_256
case ._48: return HASH_KECCAK_384
case ._64: return HASH_KECCAK_512
}
case HASH_STREEBOG:
#partial switch hash_size {
case ._32: return HASH_STREEBOG_256
case ._64: return HASH_STREEBOG_512
}
case HASH_TIGER:
#partial switch hash_size {
case ._16: return HASH_TIGER_128
case ._20: return HASH_TIGER_160
case ._24: return HASH_TIGER_192
}
case HASH_SKEIN_512:
return strings.unsafe_string_to_cstring(fmt.tprintf("Skein-512(%d)", hash_size_val * 8))
case: return ctx.botan_hash_algo
}
return nil
}


@@ -6,7 +6,6 @@ package gost
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the GOST hashing algorithm, as defined in RFC 5831 <https://datatracker.ietf.org/doc/html/rfc5831>
*/
@@ -15,42 +14,6 @@ import "core:mem"
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
_assign_hash_vtable(ctx)
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_32 = hash_bytes_odin
ctx.hash_file_32 = hash_file_odin
ctx.hash_stream_32 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_GOST)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -64,22 +27,44 @@ hash_string :: proc(data: string) -> [32]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [32]byte {
_create_gost_ctx()
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Gost_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_gost_ctx()
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Gost_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_gost_ctx()
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [32]byte{}, false
}
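After this commit the high-level GOST API is driven directly, with no vtable indirection. A minimal usage sketch (the import path is an assumption, and `hash_string` is the string wrapper declared alongside `hash_bytes` in this package):

```odin
import "crypto/gost" // import path is an assumption

main :: proc() {
	// Hash a string in one call; returns the 32-byte GOST digest.
	digest := gost.hash_string("The quick brown fox jumps over the lazy dog")
	_ = digest

	// Equivalent call on a byte slice.
	digest2 := gost.hash_bytes([]byte{1, 2, 3})
	_ = digest2
}
```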
hash :: proc {
@@ -93,85 +78,77 @@ hash :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Gost_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
init :: proc "contextless" (ctx: ^Gost_Context) {
	sbox: [8][16]u32 = {
		{ 10, 4, 5, 6, 8, 1, 3, 7, 13, 12, 14, 0, 9, 2, 11, 15 },
		{ 5, 15, 4, 0, 2, 13, 11, 9, 1, 7, 6, 3, 12, 14, 10, 8 },
		{ 7, 15, 12, 14, 9, 4, 1, 0, 3, 11, 5, 2, 6, 10, 8, 13 },
		{ 4, 10, 7, 12, 0, 15, 2, 8, 14, 1, 6, 5, 13, 11, 9, 3 },
		{ 7, 6, 4, 11, 9, 12, 2, 10, 1, 8, 0, 14, 15, 13, 3, 5 },
		{ 7, 6, 2, 4, 13, 9, 15, 0, 10, 1, 5, 11, 8, 14, 12, 3 },
		{ 13, 14, 4, 1, 7, 0, 5, 10, 3, 12, 8, 15, 6, 2, 9, 11 },
		{ 1, 3, 10, 9, 5, 11, 4, 15, 8, 6, 7, 14, 13, 0, 2, 12 },
	}
	i := 0
	for a := 0; a < 16; a += 1 {
		ax := sbox[1][a] << 15
		bx := sbox[3][a] << 23
		cx := sbox[5][a]
		cx = (cx >> 1) | (cx << 31)
		dx := sbox[7][a] << 7
		for b := 0; b < 16; b, i = b + 1, i + 1 {
			SBOX_1[i] = ax | (sbox[0][b] << 11)
			SBOX_2[i] = bx | (sbox[2][b] << 19)
			SBOX_3[i] = cx | (sbox[4][b] << 27)
			SBOX_4[i] = dx | (sbox[6][b] << 3)
		}
	}
}
update :: proc(ctx: ^Gost_Context, data: []byte) {
	length := byte(len(data))
	j: byte
	i := ctx.partial_bytes
	for i < 32 && j < length {
		ctx.partial[i] = data[j]
		i, j = i + 1, j + 1
	}
	if i < 32 {
		ctx.partial_bytes = i
		return
	}
	bytes(ctx, ctx.partial[:], 256)
	for (j + 32) < length {
		bytes(ctx, data[j:], 256)
		j += 32
	}
	i = 0
	for j < length {
		ctx.partial[i] = data[j]
		i, j = i + 1, j + 1
	}
	ctx.partial_bytes = i
}
final :: proc(ctx: ^Gost_Context, hash: []byte) {
	if ctx.partial_bytes > 0 {
		mem.set(&ctx.partial[ctx.partial_bytes], 0, 32 - int(ctx.partial_bytes))
		bytes(ctx, ctx.partial[:], u32(ctx.partial_bytes) << 3)
	}
	compress(ctx.hash[:], ctx.len[:])
	compress(ctx.hash[:], ctx.sum[:])
	for i, j := 0, 0; i < 8; i, j = i + 1, j + 4 {
		hash[j] = byte(ctx.hash[i])
		hash[j + 1] = byte(ctx.hash[i] >> 8)
		hash[j + 2] = byte(ctx.hash[i] >> 16)
		hash[j + 3] = byte(ctx.hash[i] >> 24)
	}
}
	return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Gost_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
	if !load_at_once {
		return hash_stream_odin(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin(ctx, buf[:]), ok
		}
	}
	return [32]byte{}, false
}
@(private)
_create_gost_ctx :: #force_inline proc() {
	ctx: Gost_Context
	_hash_impl.internal_ctx = ctx
	_hash_impl.hash_size = ._32
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
	_create_gost_ctx()
	if c, ok := ctx.internal_ctx.(Gost_Context); ok {
		init_odin(&c)
	}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
	if c, ok := ctx.internal_ctx.(Gost_Context); ok {
		update_odin(&c, data)
	}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
	if c, ok := ctx.internal_ctx.(Gost_Context); ok {
		final_odin(&c, hash)
	}
}
@@ -187,12 +164,12 @@ Gost_Context :: struct {
partial_bytes: byte,
}
SBOX_1 : [256]u32
SBOX_2 : [256]u32
SBOX_3 : [256]u32
SBOX_4 : [256]u32
SBOX_1: [256]u32
SBOX_2: [256]u32
SBOX_3: [256]u32
SBOX_4: [256]u32
GOST_ENCRYPT_ROUND :: #force_inline proc "contextless"(l, r, t, k1, k2: u32) -> (u32, u32, u32) {
ENCRYPT_ROUND :: #force_inline proc "contextless" (l, r, t, k1, k2: u32) -> (u32, u32, u32) {
l, r, t := l, r, t
t = (k1) + r
l ~= SBOX_1[t & 0xff] ~ SBOX_2[(t >> 8) & 0xff] ~ SBOX_3[(t >> 16) & 0xff] ~ SBOX_4[t >> 24]
@@ -201,30 +178,30 @@ GOST_ENCRYPT_ROUND :: #force_inline proc "contextless"(l, r, t, k1, k2: u32) ->
return l, r, t
}
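The four `SBOX_n` tables built by `init` fold the round's eight 4-bit s-box substitutions and the 11-bit left rotation into four byte-wide lookups, which is what lets `ENCRYPT_ROUND` compute the whole nonlinear step with four indexed reads and three XORs. A Python sketch of that equivalence (illustrative only; `make_tables`, `f_tables`, and `f_direct` are names invented here, the s-box values are the ones from `init` above):

```python
# GOST R 34.11-94's eight 4-bit s-boxes, as listed in init above.
SBOX = [
    [10, 4, 5, 6, 8, 1, 3, 7, 13, 12, 14, 0, 9, 2, 11, 15],
    [5, 15, 4, 0, 2, 13, 11, 9, 1, 7, 6, 3, 12, 14, 10, 8],
    [7, 15, 12, 14, 9, 4, 1, 0, 3, 11, 5, 2, 6, 10, 8, 13],
    [4, 10, 7, 12, 0, 15, 2, 8, 14, 1, 6, 5, 13, 11, 9, 3],
    [7, 6, 4, 11, 9, 12, 2, 10, 1, 8, 0, 14, 15, 13, 3, 5],
    [7, 6, 2, 4, 13, 9, 15, 0, 10, 1, 5, 11, 8, 14, 12, 3],
    [13, 14, 4, 1, 7, 0, 5, 10, 3, 12, 8, 15, 6, 2, 9, 11],
    [1, 3, 10, 9, 5, 11, 4, 15, 8, 6, 7, 14, 13, 0, 2, 12],
]

def make_tables():
    # Same construction as init: each table entry combines the s-box
    # for the low nibble and the high nibble of one input byte, with
    # the round's 11-bit left rotation pre-applied via the shifts.
    t = [[0] * 256 for _ in range(4)]
    for a in range(16):
        ax = SBOX[1][a] << 15
        bx = SBOX[3][a] << 23
        cx = SBOX[5][a]
        cx = ((cx >> 1) | (cx << 31)) & 0xFFFFFFFF  # u32 rotate
        dx = SBOX[7][a] << 7
        for b in range(16):
            i = a * 16 + b
            t[0][i] = ax | (SBOX[0][b] << 11)
            t[1][i] = bx | (SBOX[2][b] << 19)
            t[2][i] = cx | (SBOX[4][b] << 27)
            t[3][i] = dx | (SBOX[6][b] << 3)
    return t

def f_tables(x, t):
    # The per-round lookup done by ENCRYPT_ROUND.
    return t[0][x & 0xFF] ^ t[1][(x >> 8) & 0xFF] ^ t[2][(x >> 16) & 0xFF] ^ t[3][x >> 24]

def f_direct(x):
    # Reference definition: substitute each nibble, then rotate left 11.
    s = 0
    for n in range(8):
        s |= SBOX[n][(x >> (4 * n)) & 0xF] << (4 * n)
    return ((s << 11) | (s >> 21)) & 0xFFFFFFFF

t = make_tables()
for x in (0, 0x01234567, 0xDEADBEEF, 0xFFFFFFFF):
    assert f_tables(x, t) == f_direct(x)
```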
GOST_ENCRYPT :: #force_inline proc "contextless"(a, b, c: u32, key: []u32) -> (l, r, t: u32) {
l, r, t = GOST_ENCRYPT_ROUND(a, b, c, key[0], key[1])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[2], key[3])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[4], key[5])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[6], key[7])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[0], key[1])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[2], key[3])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[4], key[5])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[6], key[7])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[0], key[1])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[2], key[3])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[4], key[5])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[6], key[7])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[7], key[6])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[5], key[4])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[3], key[2])
l, r, t = GOST_ENCRYPT_ROUND(l, r, t, key[1], key[0])
ENCRYPT :: #force_inline proc "contextless" (a, b, c: u32, key: []u32) -> (l, r, t: u32) {
l, r, t = ENCRYPT_ROUND(a, b, c, key[0], key[1])
l, r, t = ENCRYPT_ROUND(l, r, t, key[2], key[3])
l, r, t = ENCRYPT_ROUND(l, r, t, key[4], key[5])
l, r, t = ENCRYPT_ROUND(l, r, t, key[6], key[7])
l, r, t = ENCRYPT_ROUND(l, r, t, key[0], key[1])
l, r, t = ENCRYPT_ROUND(l, r, t, key[2], key[3])
l, r, t = ENCRYPT_ROUND(l, r, t, key[4], key[5])
l, r, t = ENCRYPT_ROUND(l, r, t, key[6], key[7])
l, r, t = ENCRYPT_ROUND(l, r, t, key[0], key[1])
l, r, t = ENCRYPT_ROUND(l, r, t, key[2], key[3])
l, r, t = ENCRYPT_ROUND(l, r, t, key[4], key[5])
l, r, t = ENCRYPT_ROUND(l, r, t, key[6], key[7])
l, r, t = ENCRYPT_ROUND(l, r, t, key[7], key[6])
l, r, t = ENCRYPT_ROUND(l, r, t, key[5], key[4])
l, r, t = ENCRYPT_ROUND(l, r, t, key[3], key[2])
l, r, t = ENCRYPT_ROUND(l, r, t, key[1], key[0])
t = r
r = l
l = t
return
}
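`ENCRYPT` applies the eight 32-bit subkeys in the standard GOST order: three forward passes followed by one reversed pass, 32 rounds in total, which is visible in the `key[7], key[6] ... key[1], key[0]` tail above. A quick sketch of the index sequence (`gost_key_order` is a name made up here for illustration):

```python
def gost_key_order():
    # Rounds 1-24 use subkeys k0..k7 forward, three times over;
    # rounds 25-32 use them once in reverse order.
    order = []
    for _ in range(3):
        order += list(range(8))
    order += list(range(7, -1, -1))
    return order

assert len(gost_key_order()) == 32
assert gost_key_order()[:24] == list(range(8)) * 3
assert gost_key_order()[24:] == [7, 6, 5, 4, 3, 2, 1, 0]
```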
gost_bytes :: proc(ctx: ^Gost_Context, buf: []byte, bits: u32) {
bytes :: proc(ctx: ^Gost_Context, buf: []byte, bits: u32) {
a, c: u32
m: [8]u32
@@ -237,14 +214,14 @@ gost_bytes :: proc(ctx: ^Gost_Context, buf: []byte, bits: u32) {
c = c < a ? 1 : 0
}
gost_compress(ctx.hash[:], m[:])
compress(ctx.hash[:], m[:])
ctx.len[0] += bits
if ctx.len[0] < bits {
ctx.len[1] += 1
}
}
gost_compress :: proc(h, m: []u32) {
compress :: proc(h, m: []u32) {
key, u, v, w, s: [8]u32
copy(u[:], h)
@@ -272,7 +249,7 @@ gost_compress :: proc(h, m: []u32) {
r := h[i]
l := h[i + 1]
t: u32
l, r, t = GOST_ENCRYPT(l, r, 0, key[:])
l, r, t = ENCRYPT(l, r, 0, key[:])
s[i] = r
s[i + 1] = l
@@ -380,78 +357,4 @@ gost_compress :: proc(h, m: []u32) {
h[7] = v[0] ~ (v[0] >> 16) ~ (v[1] << 16) ~ (v[1] >> 16) ~ (v[2] << 16) ~
(v[3] >> 16) ~ v[3] ~ (v[4] << 16) ~ v[4] ~ (v[5] >> 16) ~ v[5] ~
(v[6] << 16) ~ (v[6] >> 16) ~ (v[7] << 16) ~ v[7]
}
init_odin :: proc(ctx: ^Gost_Context) {
sbox: [8][16]u32 = {
{ 10, 4, 5, 6, 8, 1, 3, 7, 13, 12, 14, 0, 9, 2, 11, 15 },
{ 5, 15, 4, 0, 2, 13, 11, 9, 1, 7, 6, 3, 12, 14, 10, 8 },
{ 7, 15, 12, 14, 9, 4, 1, 0, 3, 11, 5, 2, 6, 10, 8, 13 },
{ 4, 10, 7, 12, 0, 15, 2, 8, 14, 1, 6, 5, 13, 11, 9, 3 },
{ 7, 6, 4, 11, 9, 12, 2, 10, 1, 8, 0, 14, 15, 13, 3, 5 },
{ 7, 6, 2, 4, 13, 9, 15, 0, 10, 1, 5, 11, 8, 14, 12, 3 },
{ 13, 14, 4, 1, 7, 0, 5, 10, 3, 12, 8, 15, 6, 2, 9, 11 },
{ 1, 3, 10, 9, 5, 11, 4, 15, 8, 6, 7, 14, 13, 0, 2, 12 },
}
i := 0
for a := 0; a < 16; a += 1 {
ax := sbox[1][a] << 15
bx := sbox[3][a] << 23
cx := sbox[5][a]
cx = (cx >> 1) | (cx << 31)
dx := sbox[7][a] << 7
for b := 0; b < 16; b, i = b + 1, i + 1 {
SBOX_1[i] = ax | (sbox[0][b] << 11)
SBOX_2[i] = bx | (sbox[2][b] << 19)
SBOX_3[i] = cx | (sbox[4][b] << 27)
SBOX_4[i] = dx | (sbox[6][b] << 3)
}
}
}
update_odin :: proc(ctx: ^Gost_Context, data: []byte) {
length := byte(len(data))
j: byte
i := ctx.partial_bytes
for i < 32 && j < length {
ctx.partial[i] = data[j]
i, j = i + 1, j + 1
}
if i < 32 {
ctx.partial_bytes = i
return
}
gost_bytes(ctx, ctx.partial[:], 256)
for (j + 32) < length {
gost_bytes(ctx, data[j:], 256)
j += 32
}
i = 0
for j < length {
ctx.partial[i] = data[j]
i, j = i + 1, j + 1
}
ctx.partial_bytes = i
}
final_odin :: proc(ctx: ^Gost_Context, hash: []byte) {
if ctx.partial_bytes > 0 {
mem.set(&ctx.partial[ctx.partial_bytes], 0, 32 - int(ctx.partial_bytes))
gost_bytes(ctx, ctx.partial[:], u32(ctx.partial_bytes) << 3)
}
gost_compress(ctx.hash[:], ctx.len[:])
gost_compress(ctx.hash[:], ctx.sum[:])
for i, j := 0, 0; i < 8; i, j = i + 1, j + 4 {
hash[j] = byte(ctx.hash[i])
hash[j + 1] = byte(ctx.hash[i] >> 8)
hash[j + 2] = byte(ctx.hash[i] >> 16)
hash[j + 3] = byte(ctx.hash[i] >> 24)
}
}
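The closing loop of `final_odin` serializes each 32-bit state word least-significant byte first. An equivalent Python sketch using `struct`'s little-endian `<I` format (the helper name is invented here):

```python
import struct

def words_to_digest(words):
    # Emit each u32 word little-endian, matching the
    # byte(ctx.hash[i]), >> 8, >> 16, >> 24 loop above.
    return b"".join(struct.pack("<I", w) for w in words)

assert words_to_digest([0x04030201]) == bytes([0x01, 0x02, 0x03, 0x04])
assert len(words_to_digest([0] * 8)) == 32  # eight words -> 32-byte digest
```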


@@ -6,7 +6,6 @@ package groestl
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the GROESTL hashing algorithm, as defined in <http://www.groestl.info/Groestl.zip>
*/
@@ -14,70 +13,6 @@ package groestl
import "core:os"
import "core:io"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_28 = hash_bytes_odin_28
ctx.hash_file_28 = hash_file_odin_28
ctx.hash_stream_28 = hash_stream_odin_28
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_48 = hash_bytes_odin_48
ctx.hash_file_48 = hash_file_odin_48
ctx.hash_stream_48 = hash_stream_odin_48
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan does nothing, since GROESTL is not available in Botan
@(warning="GROESTL is not provided by the Botan API. Odin implementation will be used")
use_botan :: #force_inline proc() {
use_odin()
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
@(private)
_create_groestl_ctx :: #force_inline proc(size: _ctx.Hash_Size) {
ctx: Groestl_Context
#partial switch size {
case ._28: ctx.hashbitlen = 224
case ._32: ctx.hashbitlen = 256
case ._48: ctx.hashbitlen = 384
case ._64: ctx.hashbitlen = 512
}
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = size
}
/*
High level API
*/
@@ -91,22 +26,46 @@ hash_string_224 :: proc(data: string) -> [28]byte {
// hash_bytes_224 will hash the given input and return the
// computed hash
hash_bytes_224 :: proc(data: []byte) -> [28]byte {
_create_groestl_ctx(._28)
return _hash_impl->hash_bytes_28(data)
hash: [28]byte
ctx: Groestl_Context
ctx.hashbitlen = 224
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_224 will read the stream in chunks and compute a
// hash from its contents
hash_stream_224 :: proc(s: io.Stream) -> ([28]byte, bool) {
_create_groestl_ctx(._28)
return _hash_impl->hash_stream_28(s)
hash: [28]byte
ctx: Groestl_Context
ctx.hashbitlen = 224
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
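`hash_stream_224` and its siblings all follow the same shape: read fixed 512-byte chunks until the stream returns zero bytes, feeding each chunk to the incremental update. The same pattern in Python, with `hashlib.sha256` standing in for Grøstl (which is not in the standard library):

```python
import hashlib
import io

def hash_stream(stream, chunk_size=512):
    # Read until the stream is exhausted, updating incrementally -
    # the same loop as hash_stream_224 above.
    h = hashlib.sha256()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        h.update(chunk)
    return h.digest()

data = b"a" * 2000  # spans several chunks
assert hash_stream(io.BytesIO(data)) == hashlib.sha256(data).digest()
```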
// hash_file_224 will read the file provided by the given handle
// and compute a hash
hash_file_224 :: proc(hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
_create_groestl_ctx(._28)
return _hash_impl->hash_file_28(hd, load_at_once)
if !load_at_once {
return hash_stream_224(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_224(buf[:]), ok
}
}
return [28]byte{}, false
}
hash_224 :: proc {
@@ -125,22 +84,46 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_groestl_ctx(._32)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Groestl_Context
ctx.hashbitlen = 256
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_groestl_ctx(._32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Groestl_Context
ctx.hashbitlen = 256
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_groestl_ctx(._32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -159,22 +142,46 @@ hash_string_384 :: proc(data: string) -> [48]byte {
// hash_bytes_384 will hash the given input and return the
// computed hash
hash_bytes_384 :: proc(data: []byte) -> [48]byte {
_create_groestl_ctx(._48)
return _hash_impl->hash_bytes_48(data)
hash: [48]byte
ctx: Groestl_Context
ctx.hashbitlen = 384
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_384 will read the stream in chunks and compute a
// hash from its contents
hash_stream_384 :: proc(s: io.Stream) -> ([48]byte, bool) {
_create_groestl_ctx(._48)
return _hash_impl->hash_stream_48(s)
hash: [48]byte
ctx: Groestl_Context
ctx.hashbitlen = 384
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_384 will read the file provided by the given handle
// and compute a hash
hash_file_384 :: proc(hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
_create_groestl_ctx(._48)
return _hash_impl->hash_file_48(hd, load_at_once)
if !load_at_once {
return hash_stream_384(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_384(buf[:]), ok
}
}
return [48]byte{}, false
}
hash_384 :: proc {
@@ -193,22 +200,46 @@ hash_string_512 :: proc(data: string) -> [64]byte {
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_groestl_ctx(._64)
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: Groestl_Context
ctx.hashbitlen = 512
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_groestl_ctx(._64)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: Groestl_Context
ctx.hashbitlen = 512
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_groestl_ctx(._64)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
hash_512 :: proc {
@@ -222,201 +253,101 @@ hash_512 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
hash: [28]byte
if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([28]byte, bool) {
	hash: [28]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
	if !load_at_once {
		return hash_stream_odin_28(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin_28(ctx, buf[:]), ok
		}
	}
	return [28]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		update_odin(&c, data)
		final_odin(&c, hash[:])
	}
	return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
	if !load_at_once {
		return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin_32(ctx, buf[:]), ok
		}
	}
	return [32]byte{}, false
}
hash_bytes_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
	hash: [48]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		update_odin(&c, data)
		final_odin(&c, hash[:])
	}
	return hash
}
hash_stream_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([48]byte, bool) {
	hash: [48]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
	if !load_at_once {
		return hash_stream_odin_48(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin_48(ctx, buf[:]), ok
		}
	}
	return [48]byte{}, false
}
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
	hash: [64]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		update_odin(&c, data)
		final_odin(&c, hash[:])
	}
	return hash
}
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
	hash: [64]byte
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
	if !load_at_once {
		return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin_64(ctx, buf[:]), ok
		}
	}
	return [64]byte{}, false
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
	_create_groestl_ctx(ctx.hash_size)
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		init_odin(&c)
	}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		update_odin(&c, data)
	}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
	if c, ok := ctx.internal_ctx.(Groestl_Context); ok {
		final_odin(&c, hash)
	}
}
init :: proc(ctx: ^Groestl_Context) {
	assert(ctx.hashbitlen == 224 || ctx.hashbitlen == 256 || ctx.hashbitlen == 384 || ctx.hashbitlen == 512, "hashbitlen must be set to 224, 256, 384 or 512")
	if ctx.hashbitlen <= 256 {
		ctx.rounds = 10
		ctx.columns = 8
		ctx.statesize = 64
	} else {
		ctx.rounds = 14
		ctx.columns = 16
		ctx.statesize = 128
	}
	for i := 8 - size_of(i32); i < 8; i += 1 {
		ctx.chaining[i][ctx.columns - 1] = byte(ctx.hashbitlen >> (8 * (7 - uint(i))))
	}
}
update :: proc(ctx: ^Groestl_Context, data: []byte) {
	databitlen := len(data) * 8
	msglen := databitlen / 8
	rem := databitlen % 8
	i: int
	assert(ctx.bits_in_last_byte == 0)
	if ctx.buf_ptr != 0 {
		for i = 0; ctx.buf_ptr < ctx.statesize && i < msglen; i, ctx.buf_ptr = i + 1, ctx.buf_ptr + 1 {
			ctx.buffer[ctx.buf_ptr] = data[i]
		}
		if ctx.buf_ptr < ctx.statesize {
			if rem != 0 {
				ctx.bits_in_last_byte = rem
				ctx.buffer[ctx.buf_ptr] = data[i]
				ctx.buf_ptr += 1
			}
			return
		}
		ctx.buf_ptr = 0
		transform(ctx, ctx.buffer[:], u32(ctx.statesize))
	}
	transform(ctx, data[i:], u32(msglen - i))
	i += ((msglen - i) / ctx.statesize) * ctx.statesize
	for i < msglen {
		ctx.buffer[ctx.buf_ptr] = data[i]
		i, ctx.buf_ptr = i + 1, ctx.buf_ptr + 1
	}
	if rem != 0 {
		ctx.bits_in_last_byte = rem
		ctx.buffer[ctx.buf_ptr] = data[i]
		ctx.buf_ptr += 1
	}
}
final :: proc(ctx: ^Groestl_Context, hash: []byte) {
	hashbytelen := ctx.hashbitlen / 8
	if ctx.bits_in_last_byte != 0 {
		ctx.buffer[ctx.buf_ptr - 1] &= ((1 << uint(ctx.bits_in_last_byte)) - 1) << (8 - uint(ctx.bits_in_last_byte))
		ctx.buffer[ctx.buf_ptr - 1] ~= 0x1 << (7 - uint(ctx.bits_in_last_byte))
	} else {
		ctx.buffer[ctx.buf_ptr] = 0x80
		ctx.buf_ptr += 1
	}
	if ctx.buf_ptr > ctx.statesize - 8 {
		for ctx.buf_ptr < ctx.statesize {
			ctx.buffer[ctx.buf_ptr] = 0
			ctx.buf_ptr += 1
		}
		transform(ctx, ctx.buffer[:], u32(ctx.statesize))
		ctx.buf_ptr = 0
	}
	for ctx.buf_ptr < ctx.statesize - 8 {
		ctx.buffer[ctx.buf_ptr] = 0
		ctx.buf_ptr += 1
	}
	ctx.block_counter += 1
	ctx.buf_ptr = ctx.statesize
	for ctx.buf_ptr > ctx.statesize - 8 {
		ctx.buf_ptr -= 1
		ctx.buffer[ctx.buf_ptr] = byte(ctx.block_counter)
		ctx.block_counter >>= 8
	}
	transform(ctx, ctx.buffer[:], u32(ctx.statesize))
	output_transformation(ctx)
	for i, j := ctx.statesize - hashbytelen, 0; i < ctx.statesize; i, j = i + 1, j + 1 {
		hash[j] = ctx.chaining[i % 8][i / 8]
	}
}
@@ -631,100 +562,3 @@ add_roundconstant :: proc(x: [][16]byte, columns: int, round: byte, v: Groestl_V
}
}
}
init_odin :: proc(ctx: ^Groestl_Context) {
if ctx.hashbitlen <= 256 {
ctx.rounds = 10
ctx.columns = 8
ctx.statesize = 64
} else {
ctx.rounds = 14
ctx.columns = 16
ctx.statesize = 128
}
for i := 8 - size_of(i32); i < 8; i += 1 {
ctx.chaining[i][ctx.columns - 1] = byte(ctx.hashbitlen >> (8 * (7 - uint(i))))
}
}
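`init_odin` seeds an all-zero state except for the digest bit-length, written big-endian into the bottom rows of the last chaining column. The indexing can be checked with a small sketch (`groestl_iv_column` is a name invented here):

```python
def groestl_iv_column(hashbitlen):
    # Mirrors the loop above: for i in 8 - size_of(i32) .. 7, write
    # byte (hashbitlen >> 8*(7-i)) into row i of the last column.
    col = [0] * 8
    for i in range(8 - 4, 8):
        col[i] = (hashbitlen >> (8 * (7 - i))) & 0xFF
    return col

assert groestl_iv_column(256) == [0, 0, 0, 0, 0, 0, 1, 0]
assert groestl_iv_column(512) == [0, 0, 0, 0, 0, 0, 2, 0]
assert groestl_iv_column(224) == [0, 0, 0, 0, 0, 0, 0, 0xE0]
```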
update_odin :: proc(ctx: ^Groestl_Context, data: []byte) {
databitlen := len(data) * 8
msglen := databitlen / 8
rem := databitlen % 8
i: int
assert(ctx.bits_in_last_byte == 0)
if ctx.buf_ptr != 0 {
for i = 0; ctx.buf_ptr < ctx.statesize && i < msglen; i, ctx.buf_ptr = i + 1, ctx.buf_ptr + 1 {
ctx.buffer[ctx.buf_ptr] = data[i]
}
if ctx.buf_ptr < ctx.statesize {
if rem != 0 {
ctx.bits_in_last_byte = rem
ctx.buffer[ctx.buf_ptr] = data[i]
ctx.buf_ptr += 1
}
return
}
ctx.buf_ptr = 0
transform(ctx, ctx.buffer[:], u32(ctx.statesize))
}
transform(ctx, data[i:], u32(msglen - i))
i += ((msglen - i) / ctx.statesize) * ctx.statesize
for i < msglen {
ctx.buffer[ctx.buf_ptr] = data[i]
i, ctx.buf_ptr = i + 1, ctx.buf_ptr + 1
}
if rem != 0 {
ctx.bits_in_last_byte = rem
ctx.buffer[ctx.buf_ptr] = data[i]
ctx.buf_ptr += 1
}
}
final_odin :: proc(ctx: ^Groestl_Context, hash: []byte) {
hashbytelen := ctx.hashbitlen / 8
if ctx.bits_in_last_byte != 0 {
ctx.buffer[ctx.buf_ptr - 1] &= ((1 << uint(ctx.bits_in_last_byte)) - 1) << (8 - uint(ctx.bits_in_last_byte))
ctx.buffer[ctx.buf_ptr - 1] ~= 0x1 << (7 - uint(ctx.bits_in_last_byte))
} else {
ctx.buffer[ctx.buf_ptr] = 0x80
ctx.buf_ptr += 1
}
if ctx.buf_ptr > ctx.statesize - 8 {
for ctx.buf_ptr < ctx.statesize {
ctx.buffer[ctx.buf_ptr] = 0
ctx.buf_ptr += 1
}
transform(ctx, ctx.buffer[:], u32(ctx.statesize))
ctx.buf_ptr = 0
}
for ctx.buf_ptr < ctx.statesize - 8 {
ctx.buffer[ctx.buf_ptr] = 0
ctx.buf_ptr += 1
}
ctx.block_counter += 1
ctx.buf_ptr = ctx.statesize
for ctx.buf_ptr > ctx.statesize - 8 {
ctx.buf_ptr -= 1
ctx.buffer[ctx.buf_ptr] = byte(ctx.block_counter)
ctx.block_counter >>= 8
}
transform(ctx, ctx.buffer[:], u32(ctx.statesize))
output_transformation(ctx)
for i, j := ctx.statesize - hashbytelen , 0; i < ctx.statesize; i, j = i + 1, j + 1 {
hash[j] = ctx.chaining[i % 8][i / 8]
}
}
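For byte-aligned input, the padding laid out by `final_odin` is: append `0x80`, zero-fill, and end the last block with a 64-bit big-endian count of all blocks processed, including the padding block(s); if the 9 trailing bytes do not fit, a second padding block is used. A sketch of just that layout (assuming, as the code suggests, that `transform` bumps `block_counter` once per block; the helper name is made up):

```python
def groestl_pad(msg_len, block_size=64):
    # Byte-aligned case of the padding above: 0x80, zeros, then the
    # total block count big-endian in the final 8 bytes.
    rem = msg_len % block_size
    full = msg_len // block_size
    if rem + 1 + 8 <= block_size:
        pad = bytes([0x80]) + bytes(block_size - 8 - rem - 1)
        total = full + 1
    else:
        # Not enough room: finish this block, then a fresh one.
        pad = bytes([0x80]) + bytes(block_size - rem - 1) + bytes(block_size - 8)
        total = full + 2
    return pad + total.to_bytes(8, "big")

p = groestl_pad(0)
assert len(p) == 64 and p[0] == 0x80
assert p[-8:] == (1).to_bytes(8, "big")
assert (60 + len(groestl_pad(60))) % 64 == 0  # overflow case uses 2 blocks
```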

File diff suppressed because it is too large

@@ -6,7 +6,6 @@ package jh
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the JH hashing algorithm, as defined in <https://www3.ntu.edu.sg/home/wuhj/research/jh/index.html>
*/
@@ -14,70 +13,6 @@ package jh
import "core:os"
import "core:io"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_28 = hash_bytes_odin_28
ctx.hash_file_28 = hash_file_odin_28
ctx.hash_stream_28 = hash_stream_odin_28
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_48 = hash_bytes_odin_48
ctx.hash_file_48 = hash_file_odin_48
ctx.hash_stream_48 = hash_stream_odin_48
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan does nothing, since JH is not available in Botan
@(warning="JH is not provided by the Botan API. Odin implementation will be used")
use_botan :: #force_inline proc() {
use_odin()
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
@(private)
_create_jh_ctx :: #force_inline proc(size: _ctx.Hash_Size) {
ctx: Jh_Context
#partial switch size {
case ._28: ctx.hashbitlen = 224
case ._32: ctx.hashbitlen = 256
case ._48: ctx.hashbitlen = 384
case ._64: ctx.hashbitlen = 512
}
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = size
}
/*
High level API
*/
@@ -91,22 +26,46 @@ hash_string_224 :: proc(data: string) -> [28]byte {
// hash_bytes_224 will hash the given input and return the
// computed hash
hash_bytes_224 :: proc(data: []byte) -> [28]byte {
_create_jh_ctx(._28)
return _hash_impl->hash_bytes_28(data)
hash: [28]byte
ctx: Jh_Context
ctx.hashbitlen = 224
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_224 will read the stream in chunks and compute a
// hash from its contents
hash_stream_224 :: proc(s: io.Stream) -> ([28]byte, bool) {
_create_jh_ctx(._28)
return _hash_impl->hash_stream_28(s)
hash: [28]byte
ctx: Jh_Context
ctx.hashbitlen = 224
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_224 will read the file provided by the given handle
// and compute a hash
hash_file_224 :: proc(hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
_create_jh_ctx(._28)
return _hash_impl->hash_file_28(hd, load_at_once)
if !load_at_once {
return hash_stream_224(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_224(buf[:]), ok
}
}
return [28]byte{}, false
}
hash_224 :: proc {
@@ -125,22 +84,46 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_jh_ctx(._32)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Jh_Context
ctx.hashbitlen = 256
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_jh_ctx(._32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Jh_Context
ctx.hashbitlen = 256
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_jh_ctx(._32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -159,22 +142,46 @@ hash_string_384 :: proc(data: string) -> [48]byte {
// hash_bytes_384 will hash the given input and return the
// computed hash
hash_bytes_384 :: proc(data: []byte) -> [48]byte {
_create_jh_ctx(._48)
return _hash_impl->hash_bytes_48(data)
hash: [48]byte
ctx: Jh_Context
ctx.hashbitlen = 384
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_384 will read the stream in chunks and compute a
// hash from its contents
hash_stream_384 :: proc(s: io.Stream) -> ([48]byte, bool) {
_create_jh_ctx(._48)
return _hash_impl->hash_stream_48(s)
hash: [48]byte
ctx: Jh_Context
ctx.hashbitlen = 384
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_384 will read the file provided by the given handle
// and compute a hash
hash_file_384 :: proc(hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
_create_jh_ctx(._48)
return _hash_impl->hash_file_48(hd, load_at_once)
if !load_at_once {
return hash_stream_384(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_384(buf[:]), ok
}
}
return [48]byte{}, false
}
hash_384 :: proc {
@@ -193,22 +200,46 @@ hash_string_512 :: proc(data: string) -> [64]byte {
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_jh_ctx(._64)
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: Jh_Context
ctx.hashbitlen = 512
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_jh_ctx(._64)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: Jh_Context
ctx.hashbitlen = 512
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_jh_ctx(._64)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
hash_512 :: proc {
@@ -222,201 +253,98 @@ hash_512 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
	_hash_impl->init()
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
	_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
	_hash_impl->final(hash)
}
hash_bytes_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
	hash: [28]byte
	if c, ok := ctx.internal_ctx.(Jh_Context); ok {
		init_odin(&c)
		update_odin(&c, data)
		final_odin(&c, hash[:])
	}
	return hash
}
hash_stream_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([28]byte, bool) {
	hash: [28]byte
	if c, ok := ctx.internal_ctx.(Jh_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
	if !load_at_once {
		return hash_stream_odin_28(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin_28(ctx, buf[:]), ok
		}
	}
	return [28]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Jh_Context); ok {
		init_odin(&c)
		update_odin(&c, data)
		final_odin(&c, hash[:])
	}
	return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Jh_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
	if !load_at_once {
		return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin_32(ctx, buf[:]), ok
		}
	}
	return [32]byte{}, false
}
init :: proc(ctx: ^Jh_Context) {
	assert(ctx.hashbitlen == 224 || ctx.hashbitlen == 256 || ctx.hashbitlen == 384 || ctx.hashbitlen == 512, "hashbitlen must be set to 224, 256, 384 or 512")
	ctx.H[1] = byte(ctx.hashbitlen) & 0xff
	ctx.H[0] = byte(ctx.hashbitlen >> 8) & 0xff
	F8(ctx)
}
update :: proc(ctx: ^Jh_Context, data: []byte) {
	databitlen := u64(len(data)) * 8
	ctx.databitlen += databitlen
	i := u64(0)
	if (ctx.buffer_size > 0) && ((ctx.buffer_size + databitlen) < 512) {
		if (databitlen & 7) == 0 {
			copy(ctx.buffer[ctx.buffer_size >> 3:], data[:64 - (ctx.buffer_size >> 3)])
		} else {
			copy(ctx.buffer[ctx.buffer_size >> 3:], data[:64 - (ctx.buffer_size >> 3) + 1])
		}
		ctx.buffer_size += databitlen
		databitlen = 0
	}
	if (ctx.buffer_size > 0) && ((ctx.buffer_size + databitlen) >= 512) {
		copy(ctx.buffer[ctx.buffer_size >> 3:], data[:64 - (ctx.buffer_size >> 3)])
		i = 64 - (ctx.buffer_size >> 3)
		databitlen = databitlen - (512 - ctx.buffer_size)
		F8(ctx)
		ctx.buffer_size = 0
	}
	for databitlen >= 512 {
		copy(ctx.buffer[:], data[i:i + 64])
		F8(ctx)
		i += 64
		databitlen -= 512
	}
	if databitlen > 0 {
		if (databitlen & 7) == 0 {
			copy(ctx.buffer[:], data[i:i + ((databitlen & 0x1ff) >> 3)])
		} else {
			copy(ctx.buffer[:], data[i:i + ((databitlen & 0x1ff) >> 3) + 1])
		}
		ctx.buffer_size = databitlen
	}
}
final :: proc(ctx: ^Jh_Context, hash: []byte) {
	if ctx.databitlen & 0x1ff == 0 {
		for i := 0; i < 64; i += 1 {
			ctx.buffer[i] = 0
		}
		ctx.buffer[0] = 0x80
		ctx.buffer[63] = byte(ctx.databitlen) & 0xff
		ctx.buffer[62] = byte(ctx.databitlen >> 8) & 0xff
		ctx.buffer[61] = byte(ctx.databitlen >> 16) & 0xff
		ctx.buffer[60] = byte(ctx.databitlen >> 24) & 0xff
		ctx.buffer[59] = byte(ctx.databitlen >> 32) & 0xff
		ctx.buffer[58] = byte(ctx.databitlen >> 40) & 0xff
		ctx.buffer[57] = byte(ctx.databitlen >> 48) & 0xff
		ctx.buffer[56] = byte(ctx.databitlen >> 56) & 0xff
		F8(ctx)
	} else {
		if ctx.buffer_size & 7 == 0 {
			for i := (ctx.databitlen & 0x1ff) >> 3; i < 64; i += 1 {
				ctx.buffer[i] = 0
}
} else {
for i := ((ctx.databitlen & 0x1ff) >> 3) + 1; i < 64; i += 1 {
ctx.buffer[i] = 0
}
}
}
return [32]byte{}, false
}
hash_bytes_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
hash: [48]byte
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([48]byte, bool) {
hash: [48]byte
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
ctx.buffer[(ctx.databitlen & 0x1ff) >> 3] |= 1 << (7 - (ctx.databitlen & 7))
F8(ctx)
for i := 0; i < 64; i += 1 {
ctx.buffer[i] = 0
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
ctx.buffer[63] = byte(ctx.databitlen) & 0xff
ctx.buffer[62] = byte(ctx.databitlen >> 8) & 0xff
ctx.buffer[61] = byte(ctx.databitlen >> 16) & 0xff
ctx.buffer[60] = byte(ctx.databitlen >> 24) & 0xff
ctx.buffer[59] = byte(ctx.databitlen >> 32) & 0xff
ctx.buffer[58] = byte(ctx.databitlen >> 40) & 0xff
ctx.buffer[57] = byte(ctx.databitlen >> 48) & 0xff
ctx.buffer[56] = byte(ctx.databitlen >> 56) & 0xff
F8(ctx)
}
}
hash_file_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
if !load_at_once {
return hash_stream_odin_48(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_48(ctx, buf[:]), ok
}
}
return [48]byte{}, false
}
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_jh_ctx(ctx.hash_size)
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Jh_Context); ok {
final_odin(&c, hash)
switch ctx.hashbitlen {
case 224: copy(hash[:], ctx.H[100:128])
case 256: copy(hash[:], ctx.H[96:128])
case 384: copy(hash[:], ctx.H[80:128])
case 512: copy(hash[:], ctx.H[64:128])
}
}
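With the vtable indirection gone, the JH context is driven directly. A minimal usage sketch; the `crypto/jh` import path is an assumption and may need adjusting to the repository layout:

```odin
package main

import "core:fmt"
import "crypto/jh" // assumed import path

main :: proc() {
	ctx: jh.Jh_Context
	ctx.hashbitlen = 256 // must be 224, 256, 384 or 512, per the assert in init
	jh.init(&ctx)
	jh.update(&ctx, transmute([]byte)string("odin"))
	hash: [32]byte // 256 bits of output
	jh.final(&ctx, hash[:])
	fmt.printf("%x\n", hash)
}
```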
@@ -424,7 +352,7 @@ _final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
JH implementation
*/
ROUNDCONSTANT_ZERO := [64]byte {
0x6, 0xa, 0x0, 0x9, 0xe, 0x6, 0x6, 0x7,
0xf, 0x3, 0xb, 0xc, 0xc, 0x9, 0x0, 0x8,
0xb, 0x2, 0xf, 0xb, 0x1, 0x3, 0x6, 0x6,
@@ -435,7 +363,7 @@ JH_ROUNDCONSTANT_ZERO := [64]byte {
0x0, 0x6, 0x6, 0x7, 0x3, 0x2, 0x2, 0xa,
}
SBOX := [2][16]byte {
{9, 0, 4, 11, 13, 12, 3, 15, 1, 10, 2, 6, 7, 5, 8, 14},
{3, 12, 6, 13, 5, 7, 1, 9, 15, 2, 0, 4, 11, 10, 14, 8},
}
@@ -450,7 +378,7 @@ Jh_Context :: struct {
buffer: [64]byte,
}
E8_finaldegroup :: proc(ctx: ^Jh_Context) {
t0,t1,t2,t3: byte
tem: [256]byte
for i := 0; i < 128; i += 1 {
@@ -473,11 +401,11 @@ JH_E8_finaldegroup :: proc(ctx: ^Jh_Context) {
}
}
update_roundconstant :: proc(ctx: ^Jh_Context) {
tem: [64]byte
t: byte
for i := 0; i < 64; i += 1 {
tem[i] = SBOX[0][ctx.roundconstant[i]]
}
for i := 0; i < 64; i += 2 {
tem[i + 1] ~= ((tem[i] << 1) ~ (tem[i] >> 3) ~ ((tem[i] >> 2) & 2)) & 0xf
@@ -499,14 +427,14 @@ jh_update_roundconstant :: proc(ctx: ^Jh_Context) {
}
}
R8 :: proc(ctx: ^Jh_Context) {
t: byte
tem, roundconstant_expanded: [256]byte
for i := u32(0); i < 256; i += 1 {
roundconstant_expanded[i] = (ctx.roundconstant[i >> 2] >> (3 - (i & 3)) ) & 1
}
for i := 0; i < 256; i += 1 {
tem[i] = SBOX[roundconstant_expanded[i]][ctx.A[i]]
}
for i := 0; i < 256; i += 2 {
tem[i+1] ~= ((tem[i] << 1) ~ (tem[i] >> 3) ~ ((tem[i] >> 2) & 2)) & 0xf
@@ -528,7 +456,7 @@ JH_R8 :: proc(ctx: ^Jh_Context) {
}
}
E8_initialgroup :: proc(ctx: ^Jh_Context) {
t0, t1, t2, t3: byte
tem: [256]byte
for i := u32(0); i < 256; i += 1 {
@@ -544,118 +472,24 @@ JH_E8_initialgroup :: proc(ctx: ^Jh_Context) {
}
}
E8 :: proc(ctx: ^Jh_Context) {
for i := 0; i < 64; i += 1 {
ctx.roundconstant[i] = ROUNDCONSTANT_ZERO[i]
}
E8_initialgroup(ctx)
for i := 0; i < 42; i += 1 {
R8(ctx)
update_roundconstant(ctx)
}
E8_finaldegroup(ctx)
}
F8 :: proc(ctx: ^Jh_Context) {
for i := 0; i < 64; i += 1 {
ctx.H[i] ~= ctx.buffer[i]
}
E8(ctx)
for i := 0; i < 64; i += 1 {
ctx.H[i + 64] ~= ctx.buffer[i]
}
}


@@ -6,7 +6,6 @@ package keccak
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the Keccak hashing algorithm.
This is done because the padding in the SHA3 standard was changed by the NIST, resulting in a different output.
@@ -15,57 +14,8 @@ package keccak
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
import "../_sha3"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_28 = hash_bytes_odin_28
ctx.hash_file_28 = hash_file_odin_28
ctx.hash_stream_28 = hash_stream_odin_28
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_48 = hash_bytes_odin_48
ctx.hash_file_48 = hash_file_odin_48
ctx.hash_stream_48 = hash_stream_odin_48
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_KECCAK)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
@@ -80,22 +30,48 @@ hash_string_224 :: proc(data: string) -> [28]byte {
// hash_bytes_224 will hash the given input and return the
// computed hash
hash_bytes_224 :: proc(data: []byte) -> [28]byte {
_create_keccak_ctx(28)
return _hash_impl->hash_bytes_28(data)
hash: [28]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 28
ctx.is_keccak = true
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_224 will read the stream in chunks and compute a
// hash from its contents
hash_stream_224 :: proc(s: io.Stream) -> ([28]byte, bool) {
_create_keccak_ctx(28)
return _hash_impl->hash_stream_28(s)
hash: [28]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 28
ctx.is_keccak = true
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_224 will read the file provided by the given handle
// and compute a hash
hash_file_224 :: proc(hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
_create_keccak_ctx(28)
return _hash_impl->hash_file_28(hd, load_at_once)
if !load_at_once {
return hash_stream_224(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_224(buf[:]), ok
}
}
return [28]byte{}, false
}
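After this change the high-level helpers need no context setup at all. A minimal usage sketch; the `crypto/keccak` import path is an assumption and may need adjusting to the repository layout:

```odin
package main

import "core:fmt"
import "crypto/keccak" // assumed import path

main :: proc() {
	// hash_string_224 wraps hash_bytes_224, which now builds the
	// _sha3.Sha3_Context internally with mdlen = 28 and is_keccak = true.
	digest := keccak.hash_string_224("The quick brown fox jumps over the lazy dog")
	fmt.printf("%x\n", digest)
}
```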
hash_224 :: proc {
@@ -114,22 +90,48 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_keccak_ctx(32)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 32
ctx.is_keccak = true
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_keccak_ctx(32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 32
ctx.is_keccak = true
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_keccak_ctx(32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -148,22 +150,48 @@ hash_string_384 :: proc(data: string) -> [48]byte {
// hash_bytes_384 will hash the given input and return the
// computed hash
hash_bytes_384 :: proc(data: []byte) -> [48]byte {
_create_keccak_ctx(48)
return _hash_impl->hash_bytes_48(data)
hash: [48]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 48
ctx.is_keccak = true
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_384 will read the stream in chunks and compute a
// hash from its contents
hash_stream_384 :: proc(s: io.Stream) -> ([48]byte, bool) {
_create_keccak_ctx(48)
return _hash_impl->hash_stream_48(s)
hash: [48]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 48
ctx.is_keccak = true
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_384 will read the file provided by the given handle
// and compute a hash
hash_file_384 :: proc(hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
_create_keccak_ctx(48)
return _hash_impl->hash_file_48(hd, load_at_once)
if !load_at_once {
return hash_stream_384(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_384(buf[:]), ok
}
}
return [48]byte{}, false
}
hash_384 :: proc {
@@ -182,22 +210,48 @@ hash_string_512 :: proc(data: string) -> [64]byte {
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_keccak_ctx(64)
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 64
ctx.is_keccak = true
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_keccak_ctx(64)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 64
ctx.is_keccak = true
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_keccak_ctx(64)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
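File hashing follows the same pattern for every digest size: either stream the handle in 512-byte chunks, or read the whole file at once with `load_at_once = true`. A sketch (the `crypto/keccak` import path is an assumption):

```odin
package main

import "core:fmt"
import "core:os"
import "crypto/keccak" // assumed import path

main :: proc() {
	hd, err := os.open("input.txt")
	if err != os.ERROR_NONE {
		return
	}
	defer os.close(hd)
	// The default load_at_once = false streams the file in 512-byte chunks
	if digest, ok := keccak.hash_file_512(hd); ok {
		fmt.printf("%x\n", digest)
	}
}
```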
hash_512 :: proc {
@@ -211,219 +265,17 @@ hash_512 :: proc {
Low level API
*/
Sha3_Context :: _sha3.Sha3_Context

init :: proc(ctx: ^_sha3.Sha3_Context) {
	ctx.is_keccak = true
	_sha3.init(ctx)
}
update :: proc "contextless" (ctx: ^_sha3.Sha3_Context, data: []byte) {
	_sha3.update(ctx, data)
}
hash_bytes_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
hash: [28]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([28]byte, bool) {
hash: [28]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
if !load_at_once {
return hash_stream_odin_28(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_28(ctx, buf[:]), ok
}
}
return [28]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
hash_bytes_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
hash: [48]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([48]byte, bool) {
hash: [48]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
if !load_at_once {
return hash_stream_odin_48(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_48(ctx, buf[:]), ok
}
}
return [48]byte{}, false
}
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_create_keccak_ctx :: #force_inline proc(mdlen: int) {
ctx: _sha3.Sha3_Context
ctx.mdlen = mdlen
ctx.is_keccak = true
_hash_impl.internal_ctx = ctx
switch mdlen {
case 28: _hash_impl.hash_size = ._28
case 32: _hash_impl.hash_size = ._32
case 48: _hash_impl.hash_size = ._48
case 64: _hash_impl.hash_size = ._64
}
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
#partial switch ctx.hash_size {
case ._28: _create_keccak_ctx(28)
case ._32: _create_keccak_ctx(32)
case ._48: _create_keccak_ctx(48)
case ._64: _create_keccak_ctx(64)
}
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.final_odin(&c, hash)
}
final :: proc "contextless" (ctx: ^_sha3.Sha3_Context, hash: []byte) {
_sha3.final(ctx, hash)
}


@@ -6,7 +6,6 @@ package md2
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the MD2 hashing algorithm, as defined in RFC 1319 <https://datatracker.ietf.org/doc/html/rfc1319>
*/
@@ -14,48 +13,6 @@ package md2
import "core:os"
import "core:io"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin
ctx.hash_file_16 = hash_file_odin
ctx.hash_stream_16 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan does nothing, since MD2 is not available in Botan
@(warning="MD2 is not provided by the Botan API. Odin implementation will be used")
use_botan :: #force_inline proc() {
use_odin()
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -69,22 +26,44 @@ hash_string :: proc(data: string) -> [16]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [16]byte {
_create_md2_ctx()
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: Md2_Context
// init(&ctx) No-op
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_md2_ctx()
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: Md2_Context
// init(&ctx) No-op
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_md2_ctx()
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [16]byte{}, false
}
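The simplified MD2 API works the same way; a minimal one-shot sketch, with the `crypto/md2` import path assumed:

```odin
package main

import "core:fmt"
import "crypto/md2" // assumed import path

main :: proc() {
	digest := md2.hash_bytes(transmute([]byte)string("abc"))
	fmt.printf("%x\n", digest)
}
```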
hash :: proc {
@@ -98,85 +77,32 @@ hash :: proc {
Low level API
*/
@(warning="Init is a no-op for MD2")
init :: proc(ctx: ^Md2_Context) {
	// No action needed here
}

update :: proc(ctx: ^Md2_Context, data: []byte) {
	for i := 0; i < len(data); i += 1 {
		ctx.data[ctx.datalen] = data[i]
		ctx.datalen += 1
		if (ctx.datalen == 16) {
			transform(ctx, ctx.data[:])
			ctx.datalen = 0
		}
	}
}

final :: proc(ctx: ^Md2_Context, hash: []byte) {
	to_pad := byte(16 - ctx.datalen)
	for ctx.datalen < 16 {
		ctx.data[ctx.datalen] = to_pad
		ctx.datalen += 1
	}
	transform(ctx, ctx.data[:])
	transform(ctx, ctx.checksum[:])
	for i := 0; i < 16; i += 1 {
		hash[i] = ctx.state[i]
	}
}
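The low-level procs can also be driven incrementally; note that init is a no-op for MD2 (all state starts zeroed), so only update and final do work. A sketch, with the `crypto/md2` import path assumed:

```odin
package main

import "core:fmt"
import "crypto/md2" // assumed import path

main :: proc() {
	ctx: md2.Md2_Context
	md2.update(&ctx, transmute([]byte)string("ab"))
	md2.update(&ctx, transmute([]byte)string("c")) // data may arrive in pieces
	digest: [16]byte
	md2.final(&ctx, digest[:])
	fmt.printf("%x\n", digest)
}
```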
@@ -232,31 +158,3 @@ transform :: proc(ctx: ^Md2_Context, data: []byte) {
t = ctx.checksum[j]
}
}


@@ -16,47 +16,6 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin
ctx.hash_file_16 = hash_file_odin
ctx.hash_stream_16 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_MD4)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
@@ -71,22 +30,44 @@ hash_string :: proc(data: string) -> [16]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [16]byte {
_create_md4_ctx()
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: Md4_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_md4_ctx()
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: Md4_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_md4_ctx()
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [16]byte{}, false
}
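MD4 exposes the same simplified surface; a minimal one-shot sketch, with the `crypto/md4` import path assumed:

```odin
package main

import "core:fmt"
import "crypto/md4" // assumed import path

main :: proc() {
	digest := md4.hash_string("abc")
	fmt.printf("%x\n", digest)
}
```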
hash :: proc {
@@ -100,85 +81,61 @@ hash :: proc {
Low level API
*/
init :: proc(ctx: ^Md4_Context) {
	ctx.state[0] = 0x67452301
	ctx.state[1] = 0xefcdab89
	ctx.state[2] = 0x98badcfe
	ctx.state[3] = 0x10325476
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
if c, ok := ctx.internal_ctx.(Md4_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
if c, ok := ctx.internal_ctx.(Md4_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_odin(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin(ctx, buf[:]), ok
update :: proc(ctx: ^Md4_Context, data: []byte) {
for i := 0; i < len(data); i += 1 {
ctx.data[ctx.datalen] = data[i]
ctx.datalen += 1
if ctx.datalen == BLOCK_SIZE {
transform(ctx, ctx.data[:])
ctx.bitlen += 512
ctx.datalen = 0
}
}
return [16]byte{}, false
}
@(private)
_create_md4_ctx :: #force_inline proc() {
ctx: Md4_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._16
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_md4_ctx()
if c, ok := ctx.internal_ctx.(Md4_Context); ok {
init_odin(&c)
final :: proc(ctx: ^Md4_Context, hash: []byte) {
i := ctx.datalen
if ctx.datalen < 56 {
ctx.data[i] = 0x80
i += 1
for i < 56 {
ctx.data[i] = 0x00
i += 1
}
} else if ctx.datalen >= 56 {
ctx.data[i] = 0x80
i += 1
for i < BLOCK_SIZE {
ctx.data[i] = 0x00
i += 1
}
transform(ctx, ctx.data[:])
mem.set(&ctx.data, 0, 56)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Md4_Context); ok {
update_odin(&c, data)
}
}
ctx.bitlen += u64(ctx.datalen * 8)
ctx.data[56] = byte(ctx.bitlen)
ctx.data[57] = byte(ctx.bitlen >> 8)
ctx.data[58] = byte(ctx.bitlen >> 16)
ctx.data[59] = byte(ctx.bitlen >> 24)
ctx.data[60] = byte(ctx.bitlen >> 32)
ctx.data[61] = byte(ctx.bitlen >> 40)
ctx.data[62] = byte(ctx.bitlen >> 48)
ctx.data[63] = byte(ctx.bitlen >> 56)
transform(ctx, ctx.data[:])
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Md4_Context); ok {
final_odin(&c, hash)
for i = 0; i < 4; i += 1 {
hash[i] = byte(ctx.state[0] >> (i * 8)) & 0x000000ff
hash[i + 4] = byte(ctx.state[1] >> (i * 8)) & 0x000000ff
hash[i + 8] = byte(ctx.state[2] >> (i * 8)) & 0x000000ff
hash[i + 12] = byte(ctx.state[3] >> (i * 8)) & 0x000000ff
}
}
@@ -282,61 +239,3 @@ transform :: proc(ctx: ^Md4_Context, data: []byte) {
ctx.state[2] += c
ctx.state[3] += d
}
init_odin :: proc(ctx: ^Md4_Context) {
ctx.state[0] = 0x67452301
ctx.state[1] = 0xefcdab89
ctx.state[2] = 0x98badcfe
ctx.state[3] = 0x10325476
}
update_odin :: proc(ctx: ^Md4_Context, data: []byte) {
for i := 0; i < len(data); i += 1 {
ctx.data[ctx.datalen] = data[i]
ctx.datalen += 1
if(ctx.datalen == BLOCK_SIZE) {
transform(ctx, ctx.data[:])
ctx.bitlen += 512
ctx.datalen = 0
}
}
}
final_odin :: proc(ctx: ^Md4_Context, hash: []byte) {
i := ctx.datalen
if ctx.datalen < 56 {
ctx.data[i] = 0x80
i += 1
for i < 56 {
ctx.data[i] = 0x00
i += 1
}
} else if ctx.datalen >= 56 {
ctx.data[i] = 0x80
i += 1
for i < BLOCK_SIZE {
ctx.data[i] = 0x00
i += 1
}
transform(ctx, ctx.data[:])
mem.set(&ctx.data, 0, 56)
}
ctx.bitlen += u64(ctx.datalen * 8)
ctx.data[56] = byte(ctx.bitlen)
ctx.data[57] = byte(ctx.bitlen >> 8)
ctx.data[58] = byte(ctx.bitlen >> 16)
ctx.data[59] = byte(ctx.bitlen >> 24)
ctx.data[60] = byte(ctx.bitlen >> 32)
ctx.data[61] = byte(ctx.bitlen >> 40)
ctx.data[62] = byte(ctx.bitlen >> 48)
ctx.data[63] = byte(ctx.bitlen >> 56)
transform(ctx, ctx.data[:])
for i = 0; i < 4; i += 1 {
hash[i] = byte(ctx.state[0] >> (i * 8)) & 0x000000ff
hash[i + 4] = byte(ctx.state[1] >> (i * 8)) & 0x000000ff
hash[i + 8] = byte(ctx.state[2] >> (i * 8)) & 0x000000ff
hash[i + 12] = byte(ctx.state[3] >> (i * 8)) & 0x000000ff
}
}
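The `final` procedure in the new MD4 code implements the classic MD4/MD5 padding: a single 0x80 byte, zero fill up to byte 56 of the final block, then the message length in bits as a 64-bit little-endian integer. A minimal Python sketch of the same scheme (illustrative only, `md_pad` is not part of the library):

```python
import struct

def md_pad(message: bytes) -> bytes:
    """Pad a message the way MD4/MD5 do: append 0x80, zero-fill until
    the length is 56 mod 64, then append the original length in bits
    as a 64-bit little-endian integer (mirrors `final` above)."""
    bitlen = len(message) * 8
    padded = message + b"\x80"
    while len(padded) % 64 != 56:
        padded += b"\x00"
    return padded + struct.pack("<Q", bitlen)

padded = md_pad(b"abc")
assert len(padded) % 64 == 0                       # whole blocks only
assert struct.unpack("<Q", padded[-8:])[0] == 24   # 3 bytes * 8 bits
```

Note the two branches in the Odin code correspond to whether the 0x80 byte and the 8-byte length still fit in the current block; when they do not, an extra all-padding block is processed first.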


@@ -6,7 +6,6 @@ package md5
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the MD5 hashing algorithm, as defined in RFC 1321 <https://datatracker.ietf.org/doc/html/rfc1321>
*/
@@ -16,47 +15,6 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin
ctx.hash_file_16 = hash_file_odin
ctx.hash_stream_16 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_MD5)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
@@ -71,22 +29,44 @@ hash_string :: proc(data: string) -> [16]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [16]byte {
_create_md5_ctx()
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: Md5_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_md5_ctx()
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: Md5_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_md5_ctx()
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [16]byte{}, false
}
hash :: proc {
@@ -100,85 +80,63 @@ hash :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
init :: proc(ctx: ^Md5_Context) {
ctx.state[0] = 0x67452301
ctx.state[1] = 0xefcdab89
ctx.state[2] = 0x98badcfe
ctx.state[3] = 0x10325476
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
if c, ok := ctx.internal_ctx.(Md5_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
if c, ok := ctx.internal_ctx.(Md5_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_odin(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin(ctx, buf[:]), ok
update :: proc(ctx: ^Md5_Context, data: []byte) {
for i := 0; i < len(data); i += 1 {
ctx.data[ctx.datalen] = data[i]
ctx.datalen += 1
if ctx.datalen == BLOCK_SIZE {
transform(ctx, ctx.data[:])
ctx.bitlen += 512
ctx.datalen = 0
}
}
return [16]byte{}, false
}
@(private)
_create_md5_ctx :: #force_inline proc() {
ctx: Md5_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._16
}
final :: proc(ctx: ^Md5_Context, hash: []byte) {
i := ctx.datalen
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_md5_ctx()
if c, ok := ctx.internal_ctx.(Md5_Context); ok {
init_odin(&c)
if ctx.datalen < 56 {
ctx.data[i] = 0x80
i += 1
for i < 56 {
ctx.data[i] = 0x00
i += 1
}
} else if ctx.datalen >= 56 {
ctx.data[i] = 0x80
i += 1
for i < BLOCK_SIZE {
ctx.data[i] = 0x00
i += 1
}
transform(ctx, ctx.data[:])
mem.set(&ctx.data, 0, 56)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Md5_Context); ok {
update_odin(&c, data)
}
}
ctx.bitlen += u64(ctx.datalen * 8)
ctx.data[56] = byte(ctx.bitlen)
ctx.data[57] = byte(ctx.bitlen >> 8)
ctx.data[58] = byte(ctx.bitlen >> 16)
ctx.data[59] = byte(ctx.bitlen >> 24)
ctx.data[60] = byte(ctx.bitlen >> 32)
ctx.data[61] = byte(ctx.bitlen >> 40)
ctx.data[62] = byte(ctx.bitlen >> 48)
ctx.data[63] = byte(ctx.bitlen >> 56)
transform(ctx, ctx.data[:])
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Md5_Context); ok {
final_odin(&c, hash)
for i = 0; i < 4; i += 1 {
hash[i] = byte(ctx.state[0] >> (i * 8)) & 0x000000ff
hash[i + 4] = byte(ctx.state[1] >> (i * 8)) & 0x000000ff
hash[i + 8] = byte(ctx.state[2] >> (i * 8)) & 0x000000ff
hash[i + 12] = byte(ctx.state[3] >> (i * 8)) & 0x000000ff
}
}
@@ -303,63 +261,3 @@ transform :: proc(ctx: ^Md5_Context, data: []byte) {
ctx.state[2] += c
ctx.state[3] += d
}
init_odin :: proc(ctx: ^Md5_Context) {
ctx.state[0] = 0x67452301
ctx.state[1] = 0xefcdab89
ctx.state[2] = 0x98badcfe
ctx.state[3] = 0x10325476
}
update_odin :: proc(ctx: ^Md5_Context, data: []byte) {
for i := 0; i < len(data); i += 1 {
ctx.data[ctx.datalen] = data[i]
ctx.datalen += 1
if(ctx.datalen == BLOCK_SIZE) {
transform(ctx, ctx.data[:])
ctx.bitlen += 512
ctx.datalen = 0
}
}
}
final_odin :: proc(ctx: ^Md5_Context, hash: []byte){
i : u32
i = ctx.datalen
if ctx.datalen < 56 {
ctx.data[i] = 0x80
i += 1
for i < 56 {
ctx.data[i] = 0x00
i += 1
}
} else if ctx.datalen >= 56 {
ctx.data[i] = 0x80
i += 1
for i < BLOCK_SIZE {
ctx.data[i] = 0x00
i += 1
}
transform(ctx, ctx.data[:])
mem.set(&ctx.data, 0, 56)
}
ctx.bitlen += u64(ctx.datalen * 8)
ctx.data[56] = byte(ctx.bitlen)
ctx.data[57] = byte(ctx.bitlen >> 8)
ctx.data[58] = byte(ctx.bitlen >> 16)
ctx.data[59] = byte(ctx.bitlen >> 24)
ctx.data[60] = byte(ctx.bitlen >> 32)
ctx.data[61] = byte(ctx.bitlen >> 40)
ctx.data[62] = byte(ctx.bitlen >> 48)
ctx.data[63] = byte(ctx.bitlen >> 56)
transform(ctx, ctx.data[:])
for i = 0; i < 4; i += 1 {
hash[i] = byte(ctx.state[0] >> (i * 8)) & 0x000000ff
hash[i + 4] = byte(ctx.state[1] >> (i * 8)) & 0x000000ff
hash[i + 8] = byte(ctx.state[2] >> (i * 8)) & 0x000000ff
hash[i + 12] = byte(ctx.state[3] >> (i * 8)) & 0x000000ff
}
}
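The new `hash_stream` and `hash_bytes` paths must agree: feeding the input in 512-byte chunks (as the stream loop does) has to produce the same digest as hashing everything in one call. Python's `hashlib` can be used to sanity-check that property for MD5:

```python
import hashlib

data = bytes(range(256)) * 10  # arbitrary test input

# One-shot hash, as in hash_bytes.
whole = hashlib.md5(data).hexdigest()

# Incremental hash over fixed-size chunks, as in hash_stream
# (the Odin code reads the stream into a 512-byte buffer).
ctx = hashlib.md5()
for off in range(0, len(data), 512):
    ctx.update(data[off:off + 512])
chunked = ctx.hexdigest()

assert whole == chunked
```

The same equivalence is what lets `hash_file` choose freely between the streaming path and `read_entire_file` plus `hash_bytes`.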


@@ -6,7 +6,6 @@ package ripemd
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation for the RIPEMD hashing algorithm as defined in <https://homes.esat.kuleuven.be/~bosselae/ripemd160.html>
*/
@@ -15,56 +14,6 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin_16
ctx.hash_file_16 = hash_file_odin_16
ctx.hash_stream_16 = hash_stream_odin_16
ctx.hash_bytes_20 = hash_bytes_odin_20
ctx.hash_file_20 = hash_file_odin_20
ctx.hash_stream_20 = hash_stream_odin_20
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_40 = hash_bytes_odin_40
ctx.hash_file_40 = hash_file_odin_40
ctx.hash_stream_40 = hash_stream_odin_40
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_RIPEMD_160)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
@@ -79,22 +28,44 @@ hash_string_128 :: proc(data: string) -> [16]byte {
// hash_bytes_128 will hash the given input and return the
// computed hash
hash_bytes_128 :: proc(data: []byte) -> [16]byte {
_create_ripemd_ctx(16)
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: Ripemd128_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_128 will read the stream in chunks and compute a
// hash from its contents
hash_stream_128 :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_ripemd_ctx(16)
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: Ripemd128_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_128 will read the file provided by the given handle
// and compute a hash
hash_file_128 :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_ripemd_ctx(16)
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream_128(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_128(buf[:]), ok
}
}
return [16]byte{}, false
}
hash_128 :: proc {
@@ -113,22 +84,44 @@ hash_string_160 :: proc(data: string) -> [20]byte {
// hash_bytes_160 will hash the given input and return the
// computed hash
hash_bytes_160 :: proc(data: []byte) -> [20]byte {
_create_ripemd_ctx(20)
return _hash_impl->hash_bytes_20(data)
hash: [20]byte
ctx: Ripemd160_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_160 will read the stream in chunks and compute a
// hash from its contents
hash_stream_160 :: proc(s: io.Stream) -> ([20]byte, bool) {
_create_ripemd_ctx(20)
return _hash_impl->hash_stream_20(s)
hash: [20]byte
ctx: Ripemd160_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_160 will read the file provided by the given handle
// and compute a hash
hash_file_160 :: proc(hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
_create_ripemd_ctx(20)
return _hash_impl->hash_file_20(hd, load_at_once)
if !load_at_once {
return hash_stream_160(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_160(buf[:]), ok
}
}
return [20]byte{}, false
}
hash_160 :: proc {
@@ -147,22 +140,44 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_ripemd_ctx(32)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Ripemd256_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_ripemd_ctx(32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Ripemd256_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_ripemd_ctx(32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -181,22 +196,44 @@ hash_string_320 :: proc(data: string) -> [40]byte {
// hash_bytes_320 will hash the given input and return the
// computed hash
hash_bytes_320 :: proc(data: []byte) -> [40]byte {
_create_ripemd_ctx(40)
return _hash_impl->hash_bytes_40(data)
hash: [40]byte
ctx: Ripemd320_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_320 will read the stream in chunks and compute a
// hash from its contents
hash_stream_320 :: proc(s: io.Stream) -> ([40]byte, bool) {
_create_ripemd_ctx(40)
return _hash_impl->hash_stream_40(s)
hash: [40]byte
ctx: Ripemd320_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_320 will read the file provided by the given handle
// and compute a hash
hash_file_320 :: proc(hd: os.Handle, load_at_once := false) -> ([40]byte, bool) {
_create_ripemd_ctx(40)
return _hash_impl->hash_file_40(hd, load_at_once)
if !load_at_once {
return hash_stream_320(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_320(buf[:]), ok
}
}
return [40]byte{}, false
}
hash_320 :: proc {
@@ -206,261 +243,122 @@ hash_320 :: proc {
hash_string_320,
}
hash_bytes_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
if c, ok := ctx.internal_ctx.(Ripemd128_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
/*
Low level API
*/
init :: proc(ctx: ^$T) {
when T == Ripemd128_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3] = S0, S1, S2, S3
} else when T == Ripemd160_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3], ctx.s[4] = S0, S1, S2, S3, S4
} else when T == Ripemd256_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3] = S0, S1, S2, S3
ctx.s[4], ctx.s[5], ctx.s[6], ctx.s[7] = S5, S6, S7, S8
} else when T == Ripemd320_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3], ctx.s[4] = S0, S1, S2, S3, S4
ctx.s[5], ctx.s[6], ctx.s[7], ctx.s[8], ctx.s[9] = S5, S6, S7, S8, S9
}
return hash
}
hash_stream_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
if c, ok := ctx.internal_ctx.(Ripemd128_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
update :: proc(ctx: ^$T, data: []byte) {
ctx.tc += u64(len(data))
data := data
if ctx.nx > 0 {
n := len(data)
when T == Ripemd128_Context {
if n > RIPEMD_128_BLOCK_SIZE - ctx.nx {
n = RIPEMD_128_BLOCK_SIZE - ctx.nx
}
} else when T == Ripemd160_Context {
if n > RIPEMD_160_BLOCK_SIZE - ctx.nx {
n = RIPEMD_160_BLOCK_SIZE - ctx.nx
}
} else when T == Ripemd256_Context{
if n > RIPEMD_256_BLOCK_SIZE - ctx.nx {
n = RIPEMD_256_BLOCK_SIZE - ctx.nx
}
} else when T == Ripemd320_Context{
if n > RIPEMD_320_BLOCK_SIZE - ctx.nx {
n = RIPEMD_320_BLOCK_SIZE - ctx.nx
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_odin_16(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_16(ctx, buf[:]), ok
for i := 0; i < n; i += 1 {
ctx.x[ctx.nx + i] = data[i]
}
}
return [16]byte{}, false
}
hash_bytes_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [20]byte {
hash: [20]byte
if c, ok := ctx.internal_ctx.(Ripemd160_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([20]byte, bool) {
hash: [20]byte
if c, ok := ctx.internal_ctx.(Ripemd160_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
ctx.nx += n
when T == Ripemd128_Context {
if ctx.nx == RIPEMD_128_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
} else when T == Ripemd160_Context {
if ctx.nx == RIPEMD_160_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
} else when T == Ripemd256_Context{
if ctx.nx == RIPEMD_256_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
} else when T == Ripemd320_Context{
if ctx.nx == RIPEMD_320_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
}
final_odin(&c, hash[:])
return hash, true
data = data[n:]
}
n := block(ctx, data)
data = data[n:]
if len(data) > 0 {
ctx.nx = copy(ctx.x[:], data)
}
}
final :: proc(ctx: ^$T, hash: []byte) {
d := ctx
tc := d.tc
tmp: [64]byte
tmp[0] = 0x80
if tc % 64 < 56 {
update(d, tmp[0:56 - tc % 64])
} else {
return hash, false
update(d, tmp[0:64 + 56 - tc % 64])
}
tc <<= 3
for i : u32 = 0; i < 8; i += 1 {
tmp[i] = byte(tc >> (8 * i))
}
update(d, tmp[0:8])
when T == Ripemd128_Context {
size :: RIPEMD_128_SIZE
} else when T == Ripemd160_Context {
size :: RIPEMD_160_SIZE
} else when T == Ripemd256_Context{
size :: RIPEMD_256_SIZE
} else when T == Ripemd320_Context{
size :: RIPEMD_320_SIZE
}
digest: [size]byte
for s, i in d.s {
digest[i * 4] = byte(s)
digest[i * 4 + 1] = byte(s >> 8)
digest[i * 4 + 2] = byte(s >> 16)
digest[i * 4 + 3] = byte(s >> 24)
}
copy(hash[:], digest[:])
}
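The digest loop at the end of the generic `final` writes each 32-bit state word least-significant byte first. In Python terms, `struct` expresses the same serialization in one call (the state values here are the standard MD-family initial constants, used purely for illustration):

```python
import struct

# Each 32-bit state word is emitted little-endian, matching the
# byte(s), byte(s >> 8), byte(s >> 16), byte(s >> 24) pattern above.
state = [0x67452301, 0xefcdab89, 0x98badcfe, 0x10325476]
digest = struct.pack("<4I", *state)

assert digest[:4] == b"\x01\x23\x45\x67"  # low byte of 0x67452301 first
```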
hash_file_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
if !load_at_once {
return hash_stream_odin_20(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_20(ctx, buf[:]), ok
}
}
return [20]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Ripemd256_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Ripemd256_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
hash_bytes_odin_40 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [40]byte {
hash: [40]byte
if c, ok := ctx.internal_ctx.(Ripemd320_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_40 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([40]byte, bool) {
hash: [40]byte
if c, ok := ctx.internal_ctx.(Ripemd320_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_40 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([40]byte, bool) {
if !load_at_once {
return hash_stream_odin_40(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_40(ctx, buf[:]), ok
}
}
return [40]byte{}, false
}
@(private)
_create_ripemd_ctx :: #force_inline proc(hash_size: int) {
switch hash_size {
case 16:
ctx: Ripemd128_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._16
case 20:
ctx: Ripemd160_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._20
case 32:
ctx: Ripemd256_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._32
case 40:
ctx: Ripemd320_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._40
}
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
#partial switch ctx.hash_size {
case ._16:
_create_ripemd_ctx(16)
if c, ok := ctx.internal_ctx.(Ripemd128_Context); ok {
init_odin(&c)
}
case ._20:
_create_ripemd_ctx(20)
if c, ok := ctx.internal_ctx.(Ripemd160_Context); ok {
init_odin(&c)
}
case ._32:
_create_ripemd_ctx(32)
if c, ok := ctx.internal_ctx.(Ripemd256_Context); ok {
init_odin(&c)
}
case ._40:
_create_ripemd_ctx(40)
if c, ok := ctx.internal_ctx.(Ripemd320_Context); ok {
init_odin(&c)
}
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
#partial switch ctx.hash_size {
case ._16:
if c, ok := ctx.internal_ctx.(Ripemd128_Context); ok {
update_odin(&c, data)
}
case ._20:
if c, ok := ctx.internal_ctx.(Ripemd160_Context); ok {
update_odin(&c, data)
}
case ._32:
if c, ok := ctx.internal_ctx.(Ripemd256_Context); ok {
update_odin(&c, data)
}
case ._40:
if c, ok := ctx.internal_ctx.(Ripemd320_Context); ok {
update_odin(&c, data)
}
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
#partial switch ctx.hash_size {
case ._16:
if c, ok := ctx.internal_ctx.(Ripemd128_Context); ok {
final_odin(&c, hash)
}
case ._20:
if c, ok := ctx.internal_ctx.(Ripemd160_Context); ok {
final_odin(&c, hash)
}
case ._32:
if c, ok := ctx.internal_ctx.(Ripemd256_Context); ok {
final_odin(&c, hash)
}
case ._40:
if c, ok := ctx.internal_ctx.(Ripemd320_Context); ok {
final_odin(&c, hash)
}
}
}
/*
RIPEMD implementation
@@ -574,20 +472,6 @@ RIPEMD_160_R1 := [80]uint {
8, 5, 12, 9, 12, 5, 14, 6, 8, 13, 6, 5, 15, 13, 11, 11,
}
init_odin :: proc(ctx: ^$T) {
when T == Ripemd128_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3] = S0, S1, S2, S3
} else when T == Ripemd160_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3], ctx.s[4] = S0, S1, S2, S3, S4
} else when T == Ripemd256_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3] = S0, S1, S2, S3
ctx.s[4], ctx.s[5], ctx.s[6], ctx.s[7] = S5, S6, S7, S8
} else when T == Ripemd320_Context {
ctx.s[0], ctx.s[1], ctx.s[2], ctx.s[3], ctx.s[4] = S0, S1, S2, S3, S4
ctx.s[5], ctx.s[6], ctx.s[7], ctx.s[8], ctx.s[9] = S5, S6, S7, S8, S9
}
}
block :: #force_inline proc (ctx: ^$T, p: []byte) -> int {
when T == Ripemd128_Context {
return ripemd_128_block(ctx, p)
@@ -948,101 +832,3 @@ ripemd_320_block :: proc(ctx: ^$T, p: []byte) -> int {
}
return n
}
update_odin :: proc(ctx: ^$T, p: []byte) {
ctx.tc += u64(len(p))
p := p
if ctx.nx > 0 {
n := len(p)
when T == Ripemd128_Context {
if n > RIPEMD_128_BLOCK_SIZE - ctx.nx {
n = RIPEMD_128_BLOCK_SIZE - ctx.nx
}
} else when T == Ripemd160_Context {
if n > RIPEMD_160_BLOCK_SIZE - ctx.nx {
n = RIPEMD_160_BLOCK_SIZE - ctx.nx
}
} else when T == Ripemd256_Context{
if n > RIPEMD_256_BLOCK_SIZE - ctx.nx {
n = RIPEMD_256_BLOCK_SIZE - ctx.nx
}
} else when T == Ripemd320_Context{
if n > RIPEMD_320_BLOCK_SIZE - ctx.nx {
n = RIPEMD_320_BLOCK_SIZE - ctx.nx
}
}
for i := 0; i < n; i += 1 {
ctx.x[ctx.nx + i] = p[i]
}
ctx.nx += n
when T == Ripemd128_Context {
if ctx.nx == RIPEMD_128_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
} else when T == Ripemd160_Context {
if ctx.nx == RIPEMD_160_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
} else when T == Ripemd256_Context{
if ctx.nx == RIPEMD_256_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
} else when T == Ripemd320_Context{
if ctx.nx == RIPEMD_320_BLOCK_SIZE {
block(ctx, ctx.x[0:])
ctx.nx = 0
}
}
p = p[n:]
}
n := block(ctx, p)
p = p[n:]
if len(p) > 0 {
ctx.nx = copy(ctx.x[:], p)
}
}
final_odin :: proc(ctx: ^$T, hash: []byte) {
d := ctx
tc := d.tc
tmp: [64]byte
tmp[0] = 0x80
if tc % 64 < 56 {
update_odin(d, tmp[0:56 - tc % 64])
} else {
update_odin(d, tmp[0:64 + 56 - tc % 64])
}
tc <<= 3
for i : u32 = 0; i < 8; i += 1 {
tmp[i] = byte(tc >> (8 * i))
}
update_odin(d, tmp[0:8])
when T == Ripemd128_Context {
size :: RIPEMD_128_SIZE
} else when T == Ripemd160_Context {
size :: RIPEMD_160_SIZE
} else when T == Ripemd256_Context{
size :: RIPEMD_256_SIZE
} else when T == Ripemd320_Context{
size :: RIPEMD_320_SIZE
}
digest: [size]byte
for s, i in d.s {
digest[i * 4] = byte(s)
digest[i * 4 + 1] = byte(s >> 8)
digest[i * 4 + 2] = byte(s >> 16)
digest[i * 4 + 3] = byte(s >> 24)
}
copy(hash[:], digest[:])
}
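The RIPEMD `update` logic above carries partial input in `ctx.x`/`ctx.nx` until a full block accumulates, then hands complete blocks to `block`. A simplified Python sketch of that buffering (`BlockBuffer` and its `blocks` list are hypothetical stand-ins; the real code also processes multiple whole blocks in a single `block` call rather than one at a time):

```python
class BlockBuffer:
    """Sketch of the ctx.x / ctx.nx carry-over in `update` above:
    bytes that do not fill a whole block are buffered until enough
    data arrives to complete one."""
    BLOCK_SIZE = 64  # RIPEMD block size in bytes

    def __init__(self):
        self.x = bytearray()   # partial-block carry (ctx.x up to ctx.nx)
        self.blocks = []       # stand-in for calls to block()

    def update(self, data: bytes):
        self.x += data
        while len(self.x) >= self.BLOCK_SIZE:
            self.blocks.append(bytes(self.x[:self.BLOCK_SIZE]))
            del self.x[:self.BLOCK_SIZE]

b = BlockBuffer()
b.update(b"a" * 70)   # one full block processed, 6 bytes carried over
assert len(b.blocks) == 1 and len(b.x) == 6
b.update(b"b" * 58)   # carry + new data completes a second block
assert len(b.blocks) == 2 and len(b.x) == 0
```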


@@ -6,7 +6,6 @@ package sha1
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the SHA1 hashing algorithm, as defined in RFC 3174 <https://datatracker.ietf.org/doc/html/rfc3174>
*/
@@ -16,52 +15,10 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_20 = hash_bytes_odin
ctx.hash_file_20 = hash_file_odin
ctx.hash_stream_20 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_SHA1)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
// hash_string will hash the given input and return the
// computed hash
hash_string :: proc(data: string) -> [20]byte {
@@ -71,22 +28,44 @@ hash_string :: proc(data: string) -> [20]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [20]byte {
_create_sha1_ctx()
return _hash_impl->hash_bytes_20(data)
hash: [20]byte
ctx: Sha1_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([20]byte, bool) {
_create_sha1_ctx()
return _hash_impl->hash_stream_20(s)
hash: [20]byte
ctx: Sha1_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
_create_sha1_ctx()
return _hash_impl->hash_file_20(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [20]byte{}, false
}
hash :: proc {
@@ -100,86 +79,70 @@ hash :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
init :: proc(ctx: ^Sha1_Context) {
ctx.state[0] = 0x67452301
ctx.state[1] = 0xefcdab89
ctx.state[2] = 0x98badcfe
ctx.state[3] = 0x10325476
ctx.state[4] = 0xc3d2e1f0
ctx.k[0] = 0x5a827999
ctx.k[1] = 0x6ed9eba1
ctx.k[2] = 0x8f1bbcdc
ctx.k[3] = 0xca62c1d6
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
update :: proc(ctx: ^Sha1_Context, data: []byte) {
for i := 0; i < len(data); i += 1 {
ctx.data[ctx.datalen] = data[i]
ctx.datalen += 1
if (ctx.datalen == BLOCK_SIZE) {
transform(ctx, ctx.data[:])
ctx.bitlen += 512
ctx.datalen = 0
}
}
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
final :: proc(ctx: ^Sha1_Context, hash: []byte) {
i := ctx.datalen
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [20]byte {
hash: [20]byte
if c, ok := ctx.internal_ctx.(Sha1_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([20]byte, bool) {
hash: [20]byte
if c, ok := ctx.internal_ctx.(Sha1_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
if !load_at_once {
return hash_stream_odin(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin(ctx, buf[:]), ok
if ctx.datalen < 56 {
ctx.data[i] = 0x80
i += 1
for i < 56 {
ctx.data[i] = 0x00
i += 1
}
}
return [20]byte{}, false
}
}
else {
ctx.data[i] = 0x80
i += 1
for i < BLOCK_SIZE {
ctx.data[i] = 0x00
i += 1
}
transform(ctx, ctx.data[:])
mem.set(&ctx.data, 0, 56)
}
@(private)
_create_sha1_ctx :: #force_inline proc() {
ctx: Sha1_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._20
}
ctx.bitlen += u64(ctx.datalen * 8)
ctx.data[63] = u8(ctx.bitlen)
ctx.data[62] = u8(ctx.bitlen >> 8)
ctx.data[61] = u8(ctx.bitlen >> 16)
ctx.data[60] = u8(ctx.bitlen >> 24)
ctx.data[59] = u8(ctx.bitlen >> 32)
ctx.data[58] = u8(ctx.bitlen >> 40)
ctx.data[57] = u8(ctx.bitlen >> 48)
ctx.data[56] = u8(ctx.bitlen >> 56)
transform(ctx, ctx.data[:])
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_sha1_ctx()
if c, ok := ctx.internal_ctx.(Sha1_Context); ok {
init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Sha1_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Sha1_Context); ok {
final_odin(&c, hash)
}
for j: u32 = 0; j < 4; j += 1 {
hash[j] = u8(ctx.state[0] >> (24 - j * 8)) & 0x000000ff
hash[j + 4] = u8(ctx.state[1] >> (24 - j * 8)) & 0x000000ff
hash[j + 8] = u8(ctx.state[2] >> (24 - j * 8)) & 0x000000ff
hash[j + 12] = u8(ctx.state[3] >> (24 - j * 8)) & 0x000000ff
hash[j + 16] = u8(ctx.state[4] >> (24 - j * 8)) & 0x000000ff
}
}
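The padding in `final` follows the usual Merkle–Damgård scheme: append `0x80`, zero-fill to byte 56, then write the 64-bit message bit length big-endian into bytes 56..63. A small Python sketch of the same final-block layout (hypothetical `sha1_pad` helper, valid only for the first branch where fewer than 56 residual bytes remain):

```python
def sha1_pad(residual, total_len):
    # Build the final 64-byte block, mirroring the ctx.data[63] = u8(bitlen)
    # ... ctx.data[56] = u8(bitlen >> 56) writes in the Odin code.
    assert len(residual) < 56
    block = bytearray(64)
    block[:len(residual)] = residual
    block[len(residual)] = 0x80                       # mandatory 1-bit terminator
    block[56:64] = (total_len * 8).to_bytes(8, "big") # big-endian bit length
    return bytes(block)

blk = sha1_pad(b"abc", 3)  # e.g. after hashing b"abc"
assert blk[3] == 0x80
assert blk[56:64] == (24).to_bytes(8, "big")
```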
/*
@@ -258,69 +221,3 @@ transform :: proc(ctx: ^Sha1_Context, data: []byte) {
ctx.state[3] += d
ctx.state[4] += e
}
init_odin :: proc(ctx: ^Sha1_Context) {
ctx.state[0] = 0x67452301
ctx.state[1] = 0xefcdab89
ctx.state[2] = 0x98badcfe
ctx.state[3] = 0x10325476
ctx.state[4] = 0xc3d2e1f0
ctx.k[0] = 0x5a827999
ctx.k[1] = 0x6ed9eba1
ctx.k[2] = 0x8f1bbcdc
ctx.k[3] = 0xca62c1d6
}
update_odin :: proc(ctx: ^Sha1_Context, data: []byte) {
for i := 0; i < len(data); i += 1 {
ctx.data[ctx.datalen] = data[i]
ctx.datalen += 1
if (ctx.datalen == BLOCK_SIZE) {
transform(ctx, ctx.data[:])
ctx.bitlen += 512
ctx.datalen = 0
}
}
}
final_odin :: proc(ctx: ^Sha1_Context, hash: []byte) {
i := ctx.datalen
if ctx.datalen < 56 {
ctx.data[i] = 0x80
i += 1
for i < 56 {
ctx.data[i] = 0x00
i += 1
}
}
else {
ctx.data[i] = 0x80
i += 1
for i < BLOCK_SIZE {
ctx.data[i] = 0x00
i += 1
}
transform(ctx, ctx.data[:])
mem.set(&ctx.data, 0, 56)
}
ctx.bitlen += u64(ctx.datalen * 8)
ctx.data[63] = u8(ctx.bitlen)
ctx.data[62] = u8(ctx.bitlen >> 8)
ctx.data[61] = u8(ctx.bitlen >> 16)
ctx.data[60] = u8(ctx.bitlen >> 24)
ctx.data[59] = u8(ctx.bitlen >> 32)
ctx.data[58] = u8(ctx.bitlen >> 40)
ctx.data[57] = u8(ctx.bitlen >> 48)
ctx.data[56] = u8(ctx.bitlen >> 56)
transform(ctx, ctx.data[:])
for j: u32 = 0; j < 4; j += 1 {
hash[j] = u8(ctx.state[0] >> (24 - j * 8)) & 0x000000ff
hash[j + 4] = u8(ctx.state[1] >> (24 - j * 8)) & 0x000000ff
hash[j + 8] = u8(ctx.state[2] >> (24 - j * 8)) & 0x000000ff
hash[j + 12] = u8(ctx.state[3] >> (24 - j * 8)) & 0x000000ff
hash[j + 16] = u8(ctx.state[4] >> (24 - j * 8)) & 0x000000ff
}
}


@@ -6,7 +6,6 @@ package sha2
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the SHA2 hashing algorithm, as defined in <https://csrc.nist.gov/csrc/media/publications/fips/180/2/archive/2002-08-01/documents/fips180-2.pdf>
and in RFC 3874 <https://datatracker.ietf.org/doc/html/rfc3874>
@@ -17,72 +16,6 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_28 = hash_bytes_odin_28
ctx.hash_file_28 = hash_file_odin_28
ctx.hash_stream_28 = hash_stream_odin_28
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_48 = hash_bytes_odin_48
ctx.hash_file_48 = hash_file_odin_48
ctx.hash_stream_48 = hash_stream_odin_48
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_SHA2)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
@(private)
_create_sha256_ctx :: #force_inline proc(is224: bool) {
ctx: Sha256_Context
ctx.is224 = is224
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = is224 ? ._28 : ._32
}
@(private)
_create_sha512_ctx :: #force_inline proc(is384: bool) {
ctx: Sha512_Context
ctx.is384 = is384
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = is384 ? ._48 : ._64
}
/*
High level API
@@ -97,22 +30,46 @@ hash_string_224 :: proc(data: string) -> [28]byte {
// hash_bytes_224 will hash the given input and return the
// computed hash
hash_bytes_224 :: proc(data: []byte) -> [28]byte {
_create_sha256_ctx(true)
return _hash_impl->hash_bytes_28(data)
hash: [28]byte
ctx: Sha256_Context
ctx.is224 = true
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_224 will read the stream in chunks and compute a
// hash from its contents
hash_stream_224 :: proc(s: io.Stream) -> ([28]byte, bool) {
_create_sha256_ctx(true)
return _hash_impl->hash_stream_28(s)
hash: [28]byte
ctx: Sha256_Context
ctx.is224 = true
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_224 will read the file provided by the given handle
// and compute a hash
hash_file_224 :: proc(hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
_create_sha256_ctx(true)
return _hash_impl->hash_file_28(hd, load_at_once)
if !load_at_once {
return hash_stream_224(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_224(buf[:]), ok
}
}
return [28]byte{}, false
}
hash_224 :: proc {
@@ -131,22 +88,46 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_sha256_ctx(false)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Sha256_Context
ctx.is224 = false
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_sha256_ctx(false)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Sha256_Context
ctx.is224 = false
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_sha256_ctx(false)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -165,22 +146,46 @@ hash_string_384 :: proc(data: string) -> [48]byte {
// hash_bytes_384 will hash the given input and return the
// computed hash
hash_bytes_384 :: proc(data: []byte) -> [48]byte {
_create_sha512_ctx(true)
return _hash_impl->hash_bytes_48(data)
hash: [48]byte
ctx: Sha512_Context
ctx.is384 = true
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_384 will read the stream in chunks and compute a
// hash from its contents
hash_stream_384 :: proc(s: io.Stream) -> ([48]byte, bool) {
_create_sha512_ctx(true)
return _hash_impl->hash_stream_48(s)
hash: [48]byte
ctx: Sha512_Context
ctx.is384 = true
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_384 will read the file provided by the given handle
// and compute a hash
hash_file_384 :: proc(hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
_create_sha512_ctx(true)
return _hash_impl->hash_file_48(hd, load_at_once)
if !load_at_once {
return hash_stream_384(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_384(buf[:]), ok
}
}
return [48]byte{}, false
}
hash_384 :: proc {
@@ -199,22 +204,46 @@ hash_string_512 :: proc(data: string) -> [64]byte {
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_sha512_ctx(false)
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: Sha512_Context
ctx.is384 = false
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_sha512_ctx(false)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: Sha512_Context
ctx.is384 = false
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_sha512_ctx(false)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
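The separate `is224`/`is384` initial values matter: SHA-224 is not a truncation of SHA-256 (and SHA-384 is not a truncation of SHA-512), because each variant starts from different IVs even though the compression function is shared. Python's `hashlib` shows the divergence:

```python
import hashlib

msg = b"odin crypto"
d224 = hashlib.sha224(msg).digest()
d256 = hashlib.sha256(msg).digest()
# Same compression function, different initial h[0..7] -> unrelated digests.
assert d224 != d256[:28]

d384 = hashlib.sha384(msg).digest()
d512 = hashlib.sha512(msg).digest()
assert d384 != d512[:48]
```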
hash_512 :: proc {
@@ -228,225 +257,121 @@ hash_512 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
hash: [28]byte
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([28]byte, bool) {
hash: [28]byte
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
init :: proc(ctx: ^$T) {
when T == Sha256_Context {
if ctx.is224 {
ctx.h[0] = 0xc1059ed8
ctx.h[1] = 0x367cd507
ctx.h[2] = 0x3070dd17
ctx.h[3] = 0xf70e5939
ctx.h[4] = 0xffc00b31
ctx.h[5] = 0x68581511
ctx.h[6] = 0x64f98fa7
ctx.h[7] = 0xbefa4fa4
} else {
ctx.h[0] = 0x6a09e667
ctx.h[1] = 0xbb67ae85
ctx.h[2] = 0x3c6ef372
ctx.h[3] = 0xa54ff53a
ctx.h[4] = 0x510e527f
ctx.h[5] = 0x9b05688c
ctx.h[6] = 0x1f83d9ab
ctx.h[7] = 0x5be0cd19
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
if !load_at_once {
return hash_stream_odin_28(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_28(ctx, buf[:]), ok
} else when T == Sha512_Context {
if ctx.is384 {
ctx.h[0] = 0xcbbb9d5dc1059ed8
ctx.h[1] = 0x629a292a367cd507
ctx.h[2] = 0x9159015a3070dd17
ctx.h[3] = 0x152fecd8f70e5939
ctx.h[4] = 0x67332667ffc00b31
ctx.h[5] = 0x8eb44a8768581511
ctx.h[6] = 0xdb0c2e0d64f98fa7
ctx.h[7] = 0x47b5481dbefa4fa4
} else {
ctx.h[0] = 0x6a09e667f3bcc908
ctx.h[1] = 0xbb67ae8584caa73b
ctx.h[2] = 0x3c6ef372fe94f82b
ctx.h[3] = 0xa54ff53a5f1d36f1
ctx.h[4] = 0x510e527fade682d1
ctx.h[5] = 0x9b05688c2b3e6c1f
ctx.h[6] = 0x1f83d9abfb41bd6b
ctx.h[7] = 0x5be0cd19137e2179
}
}
return [28]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
update :: proc(ctx: ^$T, data: []byte) {
length := uint(len(data))
block_nb: uint
new_len, rem_len, tmp_len: uint
shifted_message := make([]byte, length)
when T == Sha256_Context {
CURR_BLOCK_SIZE :: SHA256_BLOCK_SIZE
} else when T == Sha512_Context {
CURR_BLOCK_SIZE :: SHA512_BLOCK_SIZE
}
return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
tmp_len = CURR_BLOCK_SIZE - ctx.length
rem_len = length < tmp_len ? length : tmp_len
copy(ctx.block[ctx.length:], data[:rem_len])
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
hash_bytes_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
hash: [48]byte
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([48]byte, bool) {
hash: [48]byte
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
if !load_at_once {
return hash_stream_odin_48(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_48(ctx, buf[:]), ok
}
}
return [48]byte{}, false
}
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
if ctx.hash_size == ._28 || ctx.hash_size == ._32 {
_create_sha256_ctx(ctx.hash_size == ._28)
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
init_odin(&c)
}
if ctx.length + length < CURR_BLOCK_SIZE {
ctx.length += length
return
}
if ctx.hash_size == ._48 || ctx.hash_size == ._64 {
_create_sha512_ctx(ctx.hash_size == ._48)
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
init_odin(&c)
}
}
new_len = length - rem_len
block_nb = new_len / CURR_BLOCK_SIZE
shifted_message = data[rem_len:]
sha2_transf(ctx, ctx.block[:], 1)
sha2_transf(ctx, shifted_message, block_nb)
rem_len = new_len % CURR_BLOCK_SIZE
when T == Sha256_Context {copy(ctx.block[:], shifted_message[block_nb << 6:rem_len])}
else when T == Sha512_Context {copy(ctx.block[:], shifted_message[block_nb << 7:rem_len])}
ctx.length = rem_len
when T == Sha256_Context {ctx.tot_len += (block_nb + 1) << 6}
else when T == Sha512_Context {ctx.tot_len += (block_nb + 1) << 7}
}
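The buffering arithmetic in `update` (fill the partial block, process whole blocks, stash the remainder) is easiest to validate by feeding data in awkward chunk sizes and comparing against one-shot hashing. A Python cross-check via `hashlib`, which performs the same internal partial-block buffering (`sha256_chunked` is an illustrative helper, not part of this library):

```python
import hashlib

def sha256_chunked(data, chunk):
    # Feed `data` in `chunk`-byte pieces; chunking must never change the digest.
    ctx = hashlib.sha256()
    for i in range(0, len(data), chunk):
        ctx.update(data[i:i + chunk])
    return ctx.digest()

data = bytes(200) + b"tail"
one_shot = hashlib.sha256(data).digest()
# Chunk sizes straddling the 64-byte SHA-256 block size all agree.
for chunk in (1, 7, 63, 64, 65, 100):
    assert sha256_chunked(data, chunk) == one_shot
```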
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
#partial switch ctx.hash_size {
case ._28, ._32:
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
update_odin(&c, data)
}
case ._48, ._64:
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
update_odin(&c, data)
}
}
}
final :: proc(ctx: ^$T, hash: []byte) {
block_nb, pm_len, len_b: u32
i: i32
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
#partial switch ctx.hash_size {
case ._28, ._32:
if c, ok := ctx.internal_ctx.(Sha256_Context); ok {
final_odin(&c, hash)
}
case ._48, ._64:
if c, ok := ctx.internal_ctx.(Sha512_Context); ok {
final_odin(&c, hash)
}
}
when T == Sha256_Context {CURR_BLOCK_SIZE :: SHA256_BLOCK_SIZE}
else when T == Sha512_Context {CURR_BLOCK_SIZE :: SHA512_BLOCK_SIZE}
when T == Sha256_Context {block_nb = 1 + ((CURR_BLOCK_SIZE - 9) < (ctx.length % CURR_BLOCK_SIZE) ? 1 : 0)}
else when T == Sha512_Context {block_nb = 1 + ((CURR_BLOCK_SIZE - 17) < (ctx.length % CURR_BLOCK_SIZE) ? 1 : 0)}
len_b = u32(ctx.tot_len + ctx.length) << 3
when T == Sha256_Context {pm_len = block_nb << 6}
else when T == Sha512_Context {pm_len = block_nb << 7}
mem.set(rawptr(&(ctx.block[ctx.length:])[0]), 0, int(uint(pm_len) - ctx.length))
ctx.block[ctx.length] = 0x80
util.PUT_U32_BE(ctx.block[pm_len - 4:], len_b)
sha2_transf(ctx, ctx.block[:], uint(block_nb))
when T == Sha256_Context {
if ctx.is224 {
for i = 0; i < 7; i += 1 {util.PUT_U32_BE(hash[i << 2:], ctx.h[i])}
} else {
for i = 0; i < 8; i += 1 {util.PUT_U32_BE(hash[i << 2:], ctx.h[i])}
}
} else when T == Sha512_Context {
if ctx.is384 {
for i = 0; i < 6; i += 1 {util.PUT_U64_BE(hash[i << 3:], ctx.h[i])}
} else {
for i = 0; i < 8; i += 1 {util.PUT_U64_BE(hash[i << 3:], ctx.h[i])}
}
}
}
/*
@@ -590,50 +515,6 @@ PACK64 :: #force_inline proc "contextless"(b: []byte, x: ^u64) {
x^ = u64(b[7]) | u64(b[6]) << 8 | u64(b[5]) << 16 | u64(b[4]) << 24 | u64(b[3]) << 32 | u64(b[2]) << 40 | u64(b[1]) << 48 | u64(b[0]) << 56
}
init_odin :: proc(ctx: ^$T) {
when T == Sha256_Context {
if ctx.is224 {
ctx.h[0] = 0xc1059ed8
ctx.h[1] = 0x367cd507
ctx.h[2] = 0x3070dd17
ctx.h[3] = 0xf70e5939
ctx.h[4] = 0xffc00b31
ctx.h[5] = 0x68581511
ctx.h[6] = 0x64f98fa7
ctx.h[7] = 0xbefa4fa4
} else {
ctx.h[0] = 0x6a09e667
ctx.h[1] = 0xbb67ae85
ctx.h[2] = 0x3c6ef372
ctx.h[3] = 0xa54ff53a
ctx.h[4] = 0x510e527f
ctx.h[5] = 0x9b05688c
ctx.h[6] = 0x1f83d9ab
ctx.h[7] = 0x5be0cd19
}
} else when T == Sha512_Context {
if ctx.is384 {
ctx.h[0] = 0xcbbb9d5dc1059ed8
ctx.h[1] = 0x629a292a367cd507
ctx.h[2] = 0x9159015a3070dd17
ctx.h[3] = 0x152fecd8f70e5939
ctx.h[4] = 0x67332667ffc00b31
ctx.h[5] = 0x8eb44a8768581511
ctx.h[6] = 0xdb0c2e0d64f98fa7
ctx.h[7] = 0x47b5481dbefa4fa4
} else {
ctx.h[0] = 0x6a09e667f3bcc908
ctx.h[1] = 0xbb67ae8584caa73b
ctx.h[2] = 0x3c6ef372fe94f82b
ctx.h[3] = 0xa54ff53a5f1d36f1
ctx.h[4] = 0x510e527fade682d1
ctx.h[5] = 0x9b05688c2b3e6c1f
ctx.h[6] = 0x1f83d9abfb41bd6b
ctx.h[7] = 0x5be0cd19137e2179
}
}
}
sha2_transf :: proc(ctx: ^$T, data: []byte, block_nb: uint) {
when T == Sha256_Context {
w: [64]u32
@@ -710,76 +591,3 @@ sha2_transf :: proc(ctx: ^$T, data: []byte, block_nb: uint) {
}
}
}
update_odin :: proc(ctx: ^$T, data: []byte) {
length := uint(len(data))
block_nb: uint
new_len, rem_len, tmp_len: uint
shifted_message := make([]byte, length)
when T == Sha256_Context {
CURR_BLOCK_SIZE :: SHA256_BLOCK_SIZE
} else when T == Sha512_Context {
CURR_BLOCK_SIZE :: SHA512_BLOCK_SIZE
}
tmp_len = CURR_BLOCK_SIZE - ctx.length
rem_len = length < tmp_len ? length : tmp_len
copy(ctx.block[ctx.length:], data[:rem_len])
if ctx.length + length < CURR_BLOCK_SIZE {
ctx.length += length
return
}
new_len = length - rem_len
block_nb = new_len / CURR_BLOCK_SIZE
shifted_message = data[rem_len:]
sha2_transf(ctx, ctx.block[:], 1)
sha2_transf(ctx, shifted_message, block_nb)
rem_len = new_len % CURR_BLOCK_SIZE
when T == Sha256_Context {copy(ctx.block[:], shifted_message[block_nb << 6:rem_len])}
else when T == Sha512_Context {copy(ctx.block[:], shifted_message[block_nb << 7:rem_len])}
ctx.length = rem_len
when T == Sha256_Context {ctx.tot_len += (block_nb + 1) << 6}
else when T == Sha512_Context {ctx.tot_len += (block_nb + 1) << 7}
}
final_odin :: proc(ctx: ^$T, hash: []byte) {
block_nb, pm_len, len_b: u32
i: i32
when T == Sha256_Context {CURR_BLOCK_SIZE :: SHA256_BLOCK_SIZE}
else when T == Sha512_Context {CURR_BLOCK_SIZE :: SHA512_BLOCK_SIZE}
when T == Sha256_Context {block_nb = 1 + ((CURR_BLOCK_SIZE - 9) < (ctx.length % CURR_BLOCK_SIZE) ? 1 : 0)}
else when T == Sha512_Context {block_nb = 1 + ((CURR_BLOCK_SIZE - 17) < (ctx.length % CURR_BLOCK_SIZE) ? 1 : 0)}
len_b = u32(ctx.tot_len + ctx.length) << 3
when T == Sha256_Context {pm_len = block_nb << 6}
else when T == Sha512_Context {pm_len = block_nb << 7}
mem.set(rawptr(&(ctx.block[ctx.length:])[0]), 0, int(uint(pm_len) - ctx.length))
ctx.block[ctx.length] = 0x80
util.PUT_U32_BE(ctx.block[pm_len - 4:], len_b)
sha2_transf(ctx, ctx.block[:], uint(block_nb))
when T == Sha256_Context {
if ctx.is224 {
for i = 0; i < 7; i += 1 {util.PUT_U32_BE(hash[i << 2:], ctx.h[i])}
} else {
for i = 0; i < 8; i += 1 {util.PUT_U32_BE(hash[i << 2:], ctx.h[i])}
}
} else when T == Sha512_Context {
if ctx.is384 {
for i = 0; i < 6; i += 1 {util.PUT_U64_BE(hash[i << 3:], ctx.h[i])}
} else {
for i = 0; i < 8; i += 1 {util.PUT_U64_BE(hash[i << 3:], ctx.h[i])}
}
}
}


@@ -6,7 +6,6 @@ package sha3
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the SHA3 hashing algorithm. The SHAKE functionality can be found in package shake.
If you wish to compute a Keccak hash, you can use the keccak package, it will use the original padding.
@@ -15,58 +14,8 @@ package sha3
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
import "../_sha3"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_28 = hash_bytes_odin_28
ctx.hash_file_28 = hash_file_odin_28
ctx.hash_stream_28 = hash_stream_odin_28
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_48 = hash_bytes_odin_48
ctx.hash_file_48 = hash_file_odin_48
ctx.hash_stream_48 = hash_stream_odin_48
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_SHA3)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -80,22 +29,46 @@ hash_string_224 :: proc(data: string) -> [28]byte {
// hash_bytes_224 will hash the given input and return the
// computed hash
hash_bytes_224 :: proc(data: []byte) -> [28]byte {
_create_sha3_ctx(28)
return _hash_impl->hash_bytes_28(data)
hash: [28]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 28
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_224 will read the stream in chunks and compute a
// hash from its contents
hash_stream_224 :: proc(s: io.Stream) -> ([28]byte, bool) {
_create_sha3_ctx(28)
return _hash_impl->hash_stream_28(s)
hash: [28]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 28
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_224 will read the file provided by the given handle
// and compute a hash
hash_file_224 :: proc(hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
_create_sha3_ctx(28)
return _hash_impl->hash_file_28(hd, load_at_once)
if !load_at_once {
return hash_stream_224(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_224(buf[:]), ok
}
}
return [28]byte{}, false
}
hash_224 :: proc {
@@ -114,22 +87,46 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_sha3_ctx(32)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 32
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_sha3_ctx(32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 32
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_sha3_ctx(32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -148,22 +145,46 @@ hash_string_384 :: proc(data: string) -> [48]byte {
// hash_bytes_384 will hash the given input and return the
// computed hash
hash_bytes_384 :: proc(data: []byte) -> [48]byte {
_create_sha3_ctx(48)
return _hash_impl->hash_bytes_48(data)
hash: [48]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 48
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_384 will read the stream in chunks and compute a
// hash from its contents
hash_stream_384 :: proc(s: io.Stream) -> ([48]byte, bool) {
_create_sha3_ctx(48)
return _hash_impl->hash_stream_48(s)
hash: [48]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 48
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_384 will read the file provided by the given handle
// and compute a hash
hash_file_384 :: proc(hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
_create_sha3_ctx(48)
return _hash_impl->hash_file_48(hd, load_at_once)
if !load_at_once {
return hash_stream_384(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_384(buf[:]), ok
}
}
return [48]byte{}, false
}
hash_384 :: proc {
@@ -182,22 +203,46 @@ hash_string_512 :: proc(data: string) -> [64]byte {
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_sha3_ctx(64)
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 64
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_sha3_ctx(64)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 64
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_sha3_ctx(64)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
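Here `mdlen` is the digest length in bytes (28/32/48/64 for SHA3-224 through SHA3-512). Python's `hashlib` confirms the sizes the high-level procs return; note that SHA-3 uses the `0x06` domain-separation padding, while the original Keccak padding (`0x01`, offered by the separate keccak package) produces different digests:

```python
import hashlib

msg = b""
assert len(hashlib.sha3_224(msg).digest()) == 28
assert len(hashlib.sha3_256(msg).digest()) == 32
assert len(hashlib.sha3_384(msg).digest()) == 48
assert len(hashlib.sha3_512(msg).digest()) == 64
```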
hash_512 :: proc {
@@ -211,218 +256,16 @@ hash_512 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
Sha3_Context :: _sha3.Sha3_Context
init :: proc(ctx: ^_sha3.Sha3_Context) {
_sha3.init(ctx)
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
update :: proc "contextless" (ctx: ^_sha3.Sha3_Context, data: []byte) {
_sha3.update(ctx, data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [28]byte {
hash: [28]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([28]byte, bool) {
hash: [28]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_28 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([28]byte, bool) {
if !load_at_once {
return hash_stream_odin_28(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_28(ctx, buf[:]), ok
}
}
return [28]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
hash_bytes_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [48]byte {
hash: [48]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([48]byte, bool) {
hash: [48]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_48 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([48]byte, bool) {
if !load_at_once {
return hash_stream_odin_48(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_48(ctx, buf[:]), ok
}
}
return [48]byte{}, false
}
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_create_sha3_ctx :: #force_inline proc(mdlen: int) {
ctx: _sha3.Sha3_Context
ctx.mdlen = mdlen
_hash_impl.internal_ctx = ctx
switch mdlen {
case 28: _hash_impl.hash_size = ._28
case 32: _hash_impl.hash_size = ._32
case 48: _hash_impl.hash_size = ._48
case 64: _hash_impl.hash_size = ._64
}
}
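`_create_sha3_ctx` keys everything off `mdlen`, the digest length in bytes; the four supported values correspond to SHA3-224, SHA3-256, SHA3-384, and SHA3-512. The mapping can be sanity-checked with Python's `hashlib` (used here only for illustration):

```python
import hashlib

# mdlen (bytes) -> standard SHA-3 variant with that digest size
variants = {
    28: hashlib.sha3_224,
    32: hashlib.sha3_256,
    48: hashlib.sha3_384,
    64: hashlib.sha3_512,
}
for mdlen, ctor in variants.items():
    assert ctor(b"abc").digest_size == mdlen
```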
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
#partial switch ctx.hash_size {
case ._28: _create_sha3_ctx(28)
case ._32: _create_sha3_ctx(32)
case ._48: _create_sha3_ctx(48)
case ._64: _create_sha3_ctx(64)
}
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.final_odin(&c, hash)
}
final :: proc "contextless" (ctx: ^_sha3.Sha3_Context, hash: []byte) {
_sha3.final(ctx, hash)
}


@@ -6,7 +6,6 @@ package shake
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the SHAKE hashing algorithm.
The SHA3 functionality can be found in package sha3.
@@ -15,52 +14,8 @@ package shake
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
import "../_sha3"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin_16
ctx.hash_file_16 = hash_file_odin_16
ctx.hash_stream_16 = hash_stream_odin_16
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_SHAKE)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -74,22 +29,48 @@ hash_string_128 :: proc(data: string) -> [16]byte {
// hash_bytes_128 will hash the given input and return the
// computed hash
hash_bytes_128 :: proc(data: []byte) -> [16]byte {
_create_shake_ctx(16)
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 16
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.shake_xof(&ctx)
_sha3.shake_out(&ctx, hash[:])
return hash
}
// hash_stream_128 will read the stream in chunks and compute a
// hash from its contents
hash_stream_128 :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_shake_ctx(16)
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 16
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.shake_xof(&ctx)
_sha3.shake_out(&ctx, hash[:])
return hash, true
}
// hash_file_128 will read the file provided by the given handle
// and compute a hash
hash_file_128 :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_shake_ctx(16)
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream_128(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_128(buf[:]), ok
}
}
return [16]byte{}, false
}
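Unlike SHA-3's fixed-size digests, SHAKE is an extendable-output function (XOF): after absorbing the input (`shake_xof`), any number of output bytes can be squeezed (`shake_out`), and a shorter output is a prefix of a longer one. Illustrated with Python's `hashlib`, not this library:

```python
import hashlib

msg = b"odin crypto"
# hash_bytes_128 fixes the output at 16 bytes, but SHAKE-128 itself
# can produce output of any requested length.
d16 = hashlib.shake_128(msg).digest(16)
d64 = hashlib.shake_128(msg).digest(64)
assert len(d16) == 16 and len(d64) == 64
assert d64[:16] == d16  # longer squeezes extend, not change, shorter ones
```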
hash_128 :: proc {
@@ -108,22 +89,48 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_shake_ctx(32)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 32
_sha3.init(&ctx)
_sha3.update(&ctx, data)
_sha3.shake_xof(&ctx)
_sha3.shake_out(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_shake_ctx(32)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: _sha3.Sha3_Context
ctx.mdlen = 32
_sha3.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_sha3.update(&ctx, buf[:read])
}
}
_sha3.shake_xof(&ctx)
_sha3.shake_out(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_shake_ctx(32)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
hash_256 :: proc {
@@ -137,137 +144,17 @@ hash_256 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
	_hash_impl->init()
}
Sha3_Context :: _sha3.Sha3_Context
init :: proc(ctx: ^_sha3.Sha3_Context) {
	_sha3.init(ctx)
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
	_hash_impl->update(data)
}
update :: proc "contextless" (ctx: ^_sha3.Sha3_Context, data: []byte) {
	_sha3.update(ctx, data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.shake_xof_odin(&c)
_sha3.shake_out_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.shake_xof_odin(&c)
_sha3.shake_out_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_odin_16(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_16(ctx, buf[:]), ok
}
}
return [16]byte{}, false
}
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
_sha3.update_odin(&c, data)
_sha3.shake_xof_odin(&c)
_sha3.shake_out_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_sha3.update_odin(&c, buf[:read])
}
}
_sha3.shake_xof_odin(&c)
_sha3.shake_out_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
@(private)
_create_shake_ctx :: #force_inline proc(mdlen: int) {
ctx: _sha3.Sha3_Context
ctx.mdlen = mdlen
_hash_impl.internal_ctx = ctx
switch mdlen {
case 16: _hash_impl.hash_size = ._16
case 32: _hash_impl.hash_size = ._32
}
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
#partial switch ctx.hash_size {
case ._16: _create_shake_ctx(16)
case ._32: _create_shake_ctx(32)
}
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_sha3.Sha3_Context); ok {
_sha3.shake_xof_odin(&c)
_sha3.shake_out_odin(&c, hash[:])
}
final :: proc "contextless" (ctx: ^_sha3.Sha3_Context, hash: []byte) {
_sha3.shake_xof(ctx)
_sha3.shake_out(ctx, hash[:])
}


@@ -1,487 +0,0 @@
package skein
/*
Copyright 2021 zhibog
Made available under the BSD-3 license.
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the SKEIN hashing algorithm, as defined in <https://www.schneier.com/academic/skein/>
This package offers the internal state sizes of 256, 512 and 1024 bits and arbitrary output size.
*/
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
ctx.is_using_odin = false
} else {
_assign_hash_vtable(ctx)
ctx.is_using_odin = true
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
// @note(zh): Default to SKEIN-512
ctx.hash_bytes_slice = hash_bytes_skein512_odin
ctx.hash_file_slice = hash_file_skein512_odin
ctx.hash_stream_slice = hash_stream_skein512_odin
ctx.init = _init_skein512_odin
ctx.update = _update_skein512_odin
ctx.final = _final_skein512_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
_hash_impl.is_using_odin = false
// @note(zh): Botan only supports SKEIN-512.
botan.assign_hash_vtable(_hash_impl, botan.HASH_SKEIN_512)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
@(warning="SKEIN is not yet implemented in Odin. Botan bindings will be used")
use_odin :: #force_inline proc() {
// _hash_impl.is_using_odin = true
// _assign_hash_vtable(_hash_impl)
use_botan()
}
@(private)
_create_skein256_ctx :: #force_inline proc(size: int) {
_hash_impl.hash_size_val = size
if _hash_impl.is_using_odin {
ctx: Skein256_Context
ctx.h.bit_length = u64(size)
_hash_impl.internal_ctx = ctx
_hash_impl.hash_bytes_slice = hash_bytes_skein256_odin
_hash_impl.hash_file_slice = hash_file_skein256_odin
_hash_impl.hash_stream_slice = hash_stream_skein256_odin
_hash_impl.init = _init_skein256_odin
_hash_impl.update = _update_skein256_odin
_hash_impl.final = _final_skein256_odin
}
}
@(private)
_create_skein512_ctx :: #force_inline proc(size: int) {
_hash_impl.hash_size_val = size
if _hash_impl.is_using_odin {
ctx: Skein512_Context
ctx.h.bit_length = u64(size)
_hash_impl.internal_ctx = ctx
_hash_impl.hash_bytes_slice = hash_bytes_skein512_odin
_hash_impl.hash_file_slice = hash_file_skein512_odin
_hash_impl.hash_stream_slice = hash_stream_skein512_odin
_hash_impl.init = _init_skein512_odin
_hash_impl.update = _update_skein512_odin
_hash_impl.final = _final_skein512_odin
}
}
@(private)
_create_skein1024_ctx :: #force_inline proc(size: int) {
_hash_impl.hash_size_val = size
if _hash_impl.is_using_odin {
ctx: Skein1024_Context
ctx.h.bit_length = u64(size)
_hash_impl.internal_ctx = ctx
_hash_impl.hash_bytes_slice = hash_bytes_skein1024_odin
_hash_impl.hash_file_slice = hash_file_skein1024_odin
_hash_impl.hash_stream_slice = hash_stream_skein1024_odin
_hash_impl.init = _init_skein1024_odin
_hash_impl.update = _update_skein1024_odin
_hash_impl.final = _final_skein1024_odin
}
}
/*
High level API
*/
// hash_skein256_string will hash the given input and return the
// computed hash
hash_skein256_string :: proc(data: string, bit_size: int, allocator := context.allocator) -> []byte {
return hash_skein256_bytes(transmute([]byte)(data), bit_size, allocator)
}
// hash_skein256_bytes will hash the given input and return the
// computed hash
hash_skein256_bytes :: proc(data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
_create_skein256_ctx(bit_size)
return _hash_impl->hash_bytes_slice(data, bit_size, allocator)
}
// hash_skein256_stream will read the stream in chunks and compute a
// hash from its contents
hash_skein256_stream :: proc(s: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
_create_skein256_ctx(bit_size)
return _hash_impl->hash_stream_slice(s, bit_size, allocator)
}
// hash_skein256_file will read the file provided by the given handle
// and compute a hash
hash_skein256_file :: proc(hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
_create_skein256_ctx(bit_size)
return _hash_impl->hash_file_slice(hd, bit_size, load_at_once, allocator)
}
hash_skein256 :: proc {
hash_skein256_stream,
hash_skein256_file,
hash_skein256_bytes,
hash_skein256_string,
}
// hash_skein512_string will hash the given input and return the
// computed hash
hash_skein512_string :: proc(data: string, bit_size: int, allocator := context.allocator) -> []byte {
return hash_skein512_bytes(transmute([]byte)(data), bit_size, allocator)
}
// hash_skein512_bytes will hash the given input and return the
// computed hash
hash_skein512_bytes :: proc(data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
_create_skein512_ctx(bit_size)
return _hash_impl->hash_bytes_slice(data, bit_size, allocator)
}
// hash_skein512_stream will read the stream in chunks and compute a
// hash from its contents
hash_skein512_stream :: proc(s: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
_create_skein512_ctx(bit_size)
return _hash_impl->hash_stream_slice(s, bit_size, allocator)
}
// hash_skein512_file will read the file provided by the given handle
// and compute a hash
hash_skein512_file :: proc(hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
_create_skein512_ctx(bit_size)
return _hash_impl->hash_file_slice(hd, bit_size, load_at_once, allocator)
}
hash_skein512 :: proc {
hash_skein512_stream,
hash_skein512_file,
hash_skein512_bytes,
hash_skein512_string,
}
// hash_skein1024_string will hash the given input and return the
// computed hash
hash_skein1024_string :: proc(data: string, bit_size: int, allocator := context.allocator) -> []byte {
return hash_skein1024_bytes(transmute([]byte)(data), bit_size, allocator)
}
// hash_skein1024_bytes will hash the given input and return the
// computed hash
hash_skein1024_bytes :: proc(data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
_create_skein1024_ctx(bit_size)
return _hash_impl->hash_bytes_slice(data, bit_size, allocator)
}
// hash_skein1024_stream will read the stream in chunks and compute a
// hash from its contents
hash_skein1024_stream :: proc(s: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
_create_skein1024_ctx(bit_size)
return _hash_impl->hash_stream_slice(s, bit_size, allocator)
}
// hash_skein1024_file will read the file provided by the given handle
// and compute a hash
hash_skein1024_file :: proc(hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
_create_skein1024_ctx(bit_size)
return _hash_impl->hash_file_slice(hd, bit_size, load_at_once, allocator)
}
hash_skein1024 :: proc {
hash_skein1024_stream,
hash_skein1024_file,
hash_skein1024_bytes,
hash_skein1024_string,
}
/*
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
hash_bytes_skein256_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
hash := make([]byte, bit_size, allocator)
if c, ok := ctx.internal_ctx.(Skein256_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
return hash
} else {
delete(hash)
return nil
}
}
hash_stream_skein256_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
hash := make([]byte, bit_size, allocator)
if c, ok := ctx.internal_ctx.(Skein256_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
delete(hash)
return nil, false
}
}
hash_file_skein256_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
if !load_at_once {
return hash_stream_skein256_odin(ctx, os.stream_from_handle(hd), bit_size, allocator)
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_skein256_odin(ctx, buf[:], bit_size, allocator), ok
}
}
return nil, false
}
hash_bytes_skein512_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
hash := make([]byte, bit_size, allocator)
if c, ok := ctx.internal_ctx.(Skein512_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
return hash
} else {
delete(hash)
return nil
}
}
hash_stream_skein512_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
hash := make([]byte, bit_size, allocator)
if c, ok := ctx.internal_ctx.(Skein512_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
delete(hash)
return nil, false
}
}
hash_file_skein512_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
if !load_at_once {
return hash_stream_skein512_odin(ctx, os.stream_from_handle(hd), bit_size, allocator)
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_skein512_odin(ctx, buf[:], bit_size, allocator), ok
}
}
return nil, false
}
hash_bytes_skein1024_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte, bit_size: int, allocator := context.allocator) -> []byte {
hash := make([]byte, bit_size, allocator)
if c, ok := ctx.internal_ctx.(Skein1024_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
return hash
} else {
delete(hash)
return nil
}
}
hash_stream_skein1024_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream, bit_size: int, allocator := context.allocator) -> ([]byte, bool) {
hash := make([]byte, bit_size, allocator)
if c, ok := ctx.internal_ctx.(Skein1024_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
delete(hash)
return nil, false
}
}
hash_file_skein1024_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, bit_size: int, load_at_once := false, allocator := context.allocator) -> ([]byte, bool) {
if !load_at_once {
		return hash_stream_skein1024_odin(ctx, os.stream_from_handle(hd), bit_size, allocator)
} else {
if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_skein1024_odin(ctx, buf[:], bit_size, allocator), ok
}
}
return nil, false
}
@(private)
_init_skein256_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_skein256_ctx(ctx.hash_size_val)
if c, ok := ctx.internal_ctx.(Skein256_Context); ok {
init_odin(&c)
}
}
@(private)
_update_skein256_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Skein256_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_skein256_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Skein256_Context); ok {
final_odin(&c, hash)
}
}
@(private)
_init_skein512_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_skein512_ctx(ctx.hash_size_val)
if c, ok := ctx.internal_ctx.(Skein512_Context); ok {
init_odin(&c)
}
}
@(private)
_update_skein512_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Skein512_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_skein512_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Skein512_Context); ok {
final_odin(&c, hash)
}
}
@(private)
_init_skein1024_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_skein1024_ctx(ctx.hash_size_val)
if c, ok := ctx.internal_ctx.(Skein1024_Context); ok {
init_odin(&c)
}
}
@(private)
_update_skein1024_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Skein1024_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_skein1024_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Skein1024_Context); ok {
final_odin(&c, hash)
}
}
/*
SKEIN implementation
*/
STATE_WORDS_256 :: 4
STATE_WORDS_512 :: 8
STATE_WORDS_1024 :: 16
STATE_BYTES_256 :: 32
STATE_BYTES_512 :: 64
STATE_BYTES_1024 :: 128
Skein_Header :: struct {
bit_length: u64,
bcnt: u64,
t: [2]u64,
}
Skein256_Context :: struct {
h: Skein_Header,
x: [STATE_WORDS_256]u64,
b: [STATE_BYTES_256]byte,
}
Skein512_Context :: struct {
h: Skein_Header,
x: [STATE_WORDS_512]u64,
b: [STATE_BYTES_512]byte,
}
Skein1024_Context :: struct {
h: Skein_Header,
x: [STATE_WORDS_1024]u64,
b: [STATE_BYTES_1024]byte,
}
// SKEIN is not yet implemented in Odin; these procedures are empty
// placeholders until the native implementation lands.
init_odin :: proc(ctx: ^$T) {
}
update_odin :: proc(ctx: ^$T, data: []byte) {
}
final_odin :: proc(ctx: ^$T, hash: []byte) {
}


@@ -6,7 +6,6 @@ package sm3
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the SM3 hashing algorithm, as defined in <https://datatracker.ietf.org/doc/html/draft-sca-cfrg-sm3-02>
*/
@@ -15,51 +14,6 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_32 = hash_bytes_odin
ctx.hash_file_32 = hash_file_odin
ctx.hash_stream_32 = hash_stream_odin
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_SM3)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
// hash_string will hash the given input and return the
// computed hash
@@ -70,22 +24,44 @@ hash_string :: proc(data: string) -> [32]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [32]byte {
_create_sm3_ctx()
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Sm3_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_sm3_ctx()
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Sm3_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_sm3_ctx()
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [32]byte{}, false
}
hash :: proc {
@@ -99,86 +75,64 @@ hash :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
	_hash_impl->init()
}
init :: proc(ctx: ^Sm3_Context) {
ctx.state[0] = IV[0]
ctx.state[1] = IV[1]
ctx.state[2] = IV[2]
ctx.state[3] = IV[3]
ctx.state[4] = IV[4]
ctx.state[5] = IV[5]
ctx.state[6] = IV[6]
ctx.state[7] = IV[7]
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
	_hash_impl->final(hash)
}
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Sm3_Context); ok {
		init_odin(&c)
		update_odin(&c, data)
		final_odin(&c, hash[:])
	}
	return hash
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
	hash: [32]byte
	if c, ok := ctx.internal_ctx.(Sm3_Context); ok {
		init_odin(&c)
		buf := make([]byte, 512)
		defer delete(buf)
		read := 1
		for read > 0 {
			read, _ = fs->impl_read(buf)
			if read > 0 {
				update_odin(&c, buf[:read])
			}
		}
		final_odin(&c, hash[:])
		return hash, true
	} else {
		return hash, false
	}
}
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
	if !load_at_once {
		return hash_stream_odin(ctx, os.stream_from_handle(hd))
	} else {
		if buf, ok := os.read_entire_file(hd); ok {
			return hash_bytes_odin(ctx, buf[:]), ok
		}
	}
	return [32]byte{}, false
}
@(private)
_create_sm3_ctx :: #force_inline proc() {
	ctx: Sm3_Context
	_hash_impl.internal_ctx = ctx
	_hash_impl.hash_size = ._32
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
	_create_sm3_ctx()
	if c, ok := ctx.internal_ctx.(Sm3_Context); ok {
		init_odin(&c)
	}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
	if c, ok := ctx.internal_ctx.(Sm3_Context); ok {
		update_odin(&c, data)
	}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
	if c, ok := ctx.internal_ctx.(Sm3_Context); ok {
		final_odin(&c, hash)
	}
}
update :: proc(ctx: ^Sm3_Context, data: []byte) {
	data := data
	ctx.length += u64(len(data))
	if ctx.bitlength > 0 {
		n := copy(ctx.x[ctx.bitlength:], data[:])
		ctx.bitlength += u64(n)
		if ctx.bitlength == 64 {
			block(ctx, ctx.x[:])
			ctx.bitlength = 0
		}
		data = data[n:]
	}
	if len(data) >= 64 {
		n := len(data) &~ (64 - 1)
		block(ctx, data[:n])
		data = data[n:]
	}
	if len(data) > 0 {
		ctx.bitlength = u64(copy(ctx.x[:], data[:]))
	}
}
final :: proc(ctx: ^Sm3_Context, hash: []byte) {
	length := ctx.length
	pad: [64]byte
	pad[0] = 0x80
	if length % 64 < 56 {
		update(ctx, pad[0: 56 - length % 64])
	} else {
		update(ctx, pad[0: 64 + 56 - length % 64])
	}
	length <<= 3
	util.PUT_U64_BE(pad[:], length)
	update(ctx, pad[0: 8])
	assert(ctx.bitlength == 0)
	util.PUT_U32_BE(hash[0:], ctx.state[0])
	util.PUT_U32_BE(hash[4:], ctx.state[1])
	util.PUT_U32_BE(hash[8:], ctx.state[2])
	util.PUT_U32_BE(hash[12:], ctx.state[3])
	util.PUT_U32_BE(hash[16:], ctx.state[4])
	util.PUT_U32_BE(hash[20:], ctx.state[5])
	util.PUT_U32_BE(hash[24:], ctx.state[6])
	util.PUT_U32_BE(hash[28:], ctx.state[7])
}
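SM3's finalization pads with a single `0x80` byte, zero bytes up to 56 mod 64, and the 64-bit big-endian message *bit* length, so the padded input is an exact multiple of the 64-byte block size. A sketch of that length arithmetic (Python; the function name is ours):

```python
def md_padding(msg_len: int) -> bytes:
    # SM3 uses the same length-padding scheme as SHA-2: 0x80, zeros,
    # then the message length in bits as a 64-bit big-endian integer.
    zeros = (56 - msg_len % 64 - 1) % 64
    return b"\x80" + b"\x00" * zeros + (msg_len * 8).to_bytes(8, "big")

for n in (0, 55, 56, 63, 64, 1000):
    padded = n + len(md_padding(n))
    assert padded % 64 == 0          # always a whole number of blocks
    assert len(md_padding(n)) >= 9   # at least 0x80 plus the length field
```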
/*
@@ -200,17 +154,6 @@ IV := [8]u32 {
0xa96f30bc, 0x163138aa, 0xe38dee4d, 0xb0fb0e4e,
}
init_odin :: proc(ctx: ^Sm3_Context) {
ctx.state[0] = IV[0]
ctx.state[1] = IV[1]
ctx.state[2] = IV[2]
ctx.state[3] = IV[3]
ctx.state[4] = IV[4]
ctx.state[5] = IV[5]
ctx.state[6] = IV[6]
ctx.state[7] = IV[7]
}
block :: proc "contextless" (ctx: ^Sm3_Context, buf: []byte) {
buf := buf
@@ -282,52 +225,3 @@ block :: proc "contextless" (ctx: ^Sm3_Context, buf: []byte) {
ctx.state[0], ctx.state[1], ctx.state[2], ctx.state[3] = state0, state1, state2, state3
ctx.state[4], ctx.state[5], ctx.state[6], ctx.state[7] = state4, state5, state6, state7
}
update_odin :: proc(ctx: ^Sm3_Context, data: []byte) {
data := data
ctx.length += u64(len(data))
if ctx.bitlength > 0 {
n := copy(ctx.x[ctx.bitlength:], data[:])
ctx.bitlength += u64(n)
if ctx.bitlength == 64 {
block(ctx, ctx.x[:])
ctx.bitlength = 0
}
data = data[n:]
}
if len(data) >= 64 {
n := len(data) &~ (64 - 1)
block(ctx, data[:n])
data = data[n:]
}
if len(data) > 0 {
ctx.bitlength = u64(copy(ctx.x[:], data[:]))
}
}
final_odin :: proc(ctx: ^Sm3_Context, hash: []byte) {
length := ctx.length
pad: [64]byte
pad[0] = 0x80
if length % 64 < 56 {
update_odin(ctx, pad[0: 56 - length % 64])
} else {
update_odin(ctx, pad[0: 64 + 56 - length % 64])
}
length <<= 3
util.PUT_U64_BE(pad[:], length)
update_odin(ctx, pad[0: 8])
assert(ctx.bitlength == 0)
util.PUT_U32_BE(hash[0:], ctx.state[0])
util.PUT_U32_BE(hash[4:], ctx.state[1])
util.PUT_U32_BE(hash[8:], ctx.state[2])
util.PUT_U32_BE(hash[12:], ctx.state[3])
util.PUT_U32_BE(hash[16:], ctx.state[4])
util.PUT_U32_BE(hash[20:], ctx.state[5])
util.PUT_U32_BE(hash[24:], ctx.state[6])
util.PUT_U32_BE(hash[28:], ctx.state[7])
}
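The padding rule in `final_odin` above (pad the message so its length is 56 mod 64, then append an 8-byte big-endian bit length) can be sanity-checked in isolation. This is an illustrative sketch of the same arithmetic in Python, not part of the library:

```python
def padded_length(message_len: int) -> int:
    # Mirror the SM3 finalization: one 0x80 byte plus zeros up to
    # 56 mod 64, then an 8-byte big-endian bit-length field.
    rem = message_len % 64
    if rem < 56:
        pad = 56 - rem
    else:
        pad = 64 + 56 - rem
    return message_len + pad + 8

# The padded message always fills complete 64-byte blocks.
for n in (0, 1, 55, 56, 63, 64, 1000):
    assert padded_length(n) % 64 == 0
```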


@@ -6,7 +6,6 @@ package streebog
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the Streebog hashing algorithm, standardized as GOST R 34.11-2012 in RFC 6986 <https://datatracker.ietf.org/doc/html/rfc6986>
*/
@@ -15,58 +14,6 @@ import "core:os"
import "core:io"
import "../util"
import "../botan"
import "../_ctx"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_32 = hash_bytes_odin_32
ctx.hash_file_32 = hash_file_odin_32
ctx.hash_stream_32 = hash_stream_odin_32
ctx.hash_bytes_64 = hash_bytes_odin_64
ctx.hash_file_64 = hash_file_odin_64
ctx.hash_stream_64 = hash_stream_odin_64
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_STREEBOG)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
@(private)
_create_streebog_ctx :: #force_inline proc(is256: bool) {
ctx: Streebog_Context
ctx.is256 = is256
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = is256 ? ._32 : ._64
}
/*
High level API
@@ -81,22 +28,46 @@ hash_string_256 :: proc(data: string) -> [32]byte {
// hash_bytes_256 will hash the given input and return the
// computed hash
hash_bytes_256 :: proc(data: []byte) -> [32]byte {
_create_streebog_ctx(true)
return _hash_impl->hash_bytes_32(data)
hash: [32]byte
ctx: Streebog_Context
ctx.is256 = true
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_256 will read the stream in chunks and compute a
// hash from its contents
hash_stream_256 :: proc(s: io.Stream) -> ([32]byte, bool) {
_create_streebog_ctx(true)
return _hash_impl->hash_stream_32(s)
hash: [32]byte
ctx: Streebog_Context
ctx.is256 = true
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_256 will read the file provided by the given handle
// and compute a hash
hash_file_256 :: proc(hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
_create_streebog_ctx(true)
return _hash_impl->hash_file_32(hd, load_at_once)
if !load_at_once {
return hash_stream_256(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_256(buf[:]), ok
}
}
return [32]byte{}, false
}
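`hash_stream_256` relies on the usual incremental-hashing property: feeding the input through `update` in arbitrary chunks produces the same digest as a single pass over the whole buffer. Streebog itself is not in Python's standard library, so this sketch illustrates the property with `hashlib.sha256` as a stand-in:

```python
import hashlib

def one_shot(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def chunked(data: bytes, chunk: int = 512) -> str:
    # Same read-loop shape as hash_stream_256: update per chunk, final once.
    h = hashlib.sha256()
    for i in range(0, len(data), chunk):
        h.update(data[i:i + chunk])
    return h.hexdigest()

data = bytes(range(256)) * 7  # 1792 bytes, not a multiple of 512
assert one_shot(data) == chunked(data)
```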
hash_256 :: proc {
@@ -115,22 +86,44 @@ hash_string_512 :: proc(data: string) -> [64]byte {
// hash_bytes_512 will hash the given input and return the
// computed hash
hash_bytes_512 :: proc(data: []byte) -> [64]byte {
_create_streebog_ctx(false)
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: Streebog_Context
init(&ctx)
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream_512 will read the stream in chunks and compute a
// hash from its contents
hash_stream_512 :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_streebog_ctx(false)
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: Streebog_Context
init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file_512 will read the file provided by the given handle
// and compute a hash
hash_file_512 :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_streebog_ctx(false)
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream_512(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_512(buf[:]), ok
}
}
return [64]byte{}, false
}
hash_512 :: proc {
@@ -144,120 +137,64 @@ hash_512 :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
}
init :: proc(ctx: ^Streebog_Context) {
if ctx.is256 {
ctx.hash_size = 256
for _, i in ctx.h {
ctx.h[i] = 0x01
}
} else {
ctx.hash_size = 512
}
ctx.v_512[1] = 0x02
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
}
update :: proc(ctx: ^Streebog_Context, data: []byte) {
length := u64(len(data))
chk_size: u64
data := data
for (length > 63) && (ctx.buf_size == 0) {
stage2(ctx, data)
data = data[64:]
length -= 64
}
for length != 0 {
chk_size = 64 - ctx.buf_size
if chk_size > length {
chk_size = length
}
copy(ctx.buffer[ctx.buf_size:], data[:chk_size])
ctx.buf_size += chk_size
length -= chk_size
data = data[chk_size:]
if ctx.buf_size == 64 {
stage2(ctx, ctx.buffer[:])
ctx.buf_size = 0
}
}
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
final :: proc(ctx: ^Streebog_Context, hash: []byte) {
t: [64]byte
t[1] = byte((ctx.buf_size * 8) >> 8) & 0xff
t[0] = byte((ctx.buf_size) * 8) & 0xff
hash_bytes_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [32]byte {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
padding(ctx)
hash_stream_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([32]byte, bool) {
hash: [32]byte
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
G(ctx.h[:], ctx.n[:], ctx.buffer[:])
hash_file_odin_32 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([32]byte, bool) {
if !load_at_once {
return hash_stream_odin_32(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_32(ctx, buf[:]), ok
}
}
return [32]byte{}, false
}
add_mod_512(ctx.n[:], t[:], ctx.n[:])
add_mod_512(ctx.sigma[:], ctx.buffer[:], ctx.sigma[:])
hash_bytes_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
init_odin(&c)
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
G(ctx.h[:], ctx.v_0[:], ctx.n[:])
G(ctx.h[:], ctx.v_0[:], ctx.sigma[:])
hash_stream_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_64 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin_64(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_64(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
_create_streebog_ctx(ctx.hash_size == ._32)
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
init_odin(&c)
}
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Streebog_Context); ok {
final_odin(&c, hash)
}
}
if ctx.is256 {
copy(hash[:], ctx.h[32:])
} else {
copy(hash[:], ctx.h[:])
}
}
/*
@@ -534,63 +471,3 @@ padding :: proc(ctx: ^Streebog_Context) {
copy(ctx.buffer[:], t[:])
}
}
init_odin :: proc(ctx: ^Streebog_Context) {
if ctx.is256 {
ctx.hash_size = 256
for _, i in ctx.h {
ctx.h[i] = 0x01
}
} else {
ctx.hash_size = 512
}
ctx.v_512[1] = 0x02
}
update_odin :: proc(ctx: ^Streebog_Context, data: []byte) {
length := u64(len(data))
chk_size: u64
data := data
for (length > 63) && (ctx.buf_size == 0) {
stage2(ctx, data)
data = data[64:]
length -= 64
}
for length != 0 {
chk_size = 64 - ctx.buf_size
if chk_size > length {
chk_size = length
}
copy(ctx.buffer[ctx.buf_size:], data[:chk_size])
ctx.buf_size += chk_size
length -= chk_size
data = data[chk_size:]
if ctx.buf_size == 64 {
stage2(ctx, ctx.buffer[:])
ctx.buf_size = 0
}
}
}
final_odin :: proc(ctx: ^Streebog_Context, hash: []byte) {
t: [64]byte
t[1] = byte((ctx.buf_size * 8) >> 8) & 0xff
t[0] = byte((ctx.buf_size) * 8) & 0xff
padding(ctx)
G(ctx.h[:], ctx.n[:], ctx.buffer[:])
add_mod_512(ctx.n[:], t[:], ctx.n[:])
add_mod_512(ctx.sigma[:], ctx.buffer[:], ctx.sigma[:])
G(ctx.h[:], ctx.v_0[:], ctx.n[:])
G(ctx.h[:], ctx.v_0[:], ctx.sigma[:])
if ctx.is256 {
copy(hash[:], ctx.h[32:])
} else {
copy(hash[:], ctx.h[:])
}
}
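The two-byte length field built at the top of `final_odin` encodes `buf_size * 8` with `t[0]` as the low byte and `t[1]` as the high byte; since `buf_size` is at most 63, the bit count fits in 9 bits. A quick check of the same arithmetic (Python used purely for illustration):

```python
def length_field(buf_size: int) -> tuple:
    # Mirrors: t[0] = byte(buf_size * 8) & 0xff
    #          t[1] = byte((buf_size * 8) >> 8) & 0xff
    bits = buf_size * 8
    return (bits & 0xFF, (bits >> 8) & 0xFF)

t0, t1 = length_field(63)          # largest possible partial block
assert t1 * 256 + t0 == 63 * 8
assert length_field(32) == (0, 1)  # 256 bits -> low byte 0, high byte 1
```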


@@ -6,7 +6,6 @@ package tiger
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the Tiger1 variant of the Tiger hashing algorithm as defined in <https://www.cs.technion.ac.il/~biham/Reports/Tiger/>
*/
@@ -14,55 +13,8 @@ package tiger
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
import "../_tiger"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin_16
ctx.hash_file_16 = hash_file_odin_16
ctx.hash_stream_16 = hash_stream_odin_16
ctx.hash_bytes_20 = hash_bytes_odin_20
ctx.hash_file_20 = hash_file_odin_20
ctx.hash_stream_20 = hash_stream_odin_20
ctx.hash_bytes_24 = hash_bytes_odin_24
ctx.hash_file_24 = hash_file_odin_24
ctx.hash_stream_24 = hash_stream_odin_24
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_TIGER)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -76,22 +28,46 @@ hash_string_128 :: proc(data: string) -> [16]byte {
// hash_bytes_128 will hash the given input and return the
// computed hash
hash_bytes_128 :: proc(data: []byte) -> [16]byte {
_create_tiger_ctx(16)
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: _tiger.Tiger_Context
ctx.ver = 1
_tiger.init(&ctx)
_tiger.update(&ctx, data)
_tiger.final(&ctx, hash[:])
return hash
}
// hash_stream_128 will read the stream in chunks and compute a
// hash from its contents
hash_stream_128 :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_tiger_ctx(16)
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: _tiger.Tiger_Context
ctx.ver = 1
_tiger.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_tiger.update(&ctx, buf[:read])
}
}
_tiger.final(&ctx, hash[:])
return hash, true
}
// hash_file_128 will read the file provided by the given handle
// and compute a hash
hash_file_128 :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_tiger_ctx(16)
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream_128(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_128(buf[:]), ok
}
}
return [16]byte{}, false
}
hash_128 :: proc {
@@ -110,22 +86,46 @@ hash_string_160 :: proc(data: string) -> [20]byte {
// hash_bytes_160 will hash the given input and return the
// computed hash
hash_bytes_160 :: proc(data: []byte) -> [20]byte {
_create_tiger_ctx(20)
return _hash_impl->hash_bytes_20(data)
hash: [20]byte
ctx: _tiger.Tiger_Context
ctx.ver = 1
_tiger.init(&ctx)
_tiger.update(&ctx, data)
_tiger.final(&ctx, hash[:])
return hash
}
// hash_stream_160 will read the stream in chunks and compute a
// hash from its contents
hash_stream_160 :: proc(s: io.Stream) -> ([20]byte, bool) {
_create_tiger_ctx(20)
return _hash_impl->hash_stream_20(s)
hash: [20]byte
ctx: _tiger.Tiger_Context
ctx.ver = 1
_tiger.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_tiger.update(&ctx, buf[:read])
}
}
_tiger.final(&ctx, hash[:])
return hash, true
}
// hash_file_160 will read the file provided by the given handle
// and compute a hash
hash_file_160 :: proc(hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
_create_tiger_ctx(20)
return _hash_impl->hash_file_20(hd, load_at_once)
if !load_at_once {
return hash_stream_160(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_160(buf[:]), ok
}
}
return [20]byte{}, false
}
hash_160 :: proc {
@@ -144,22 +144,46 @@ hash_string_192 :: proc(data: string) -> [24]byte {
// hash_bytes_192 will hash the given input and return the
// computed hash
hash_bytes_192 :: proc(data: []byte) -> [24]byte {
_create_tiger_ctx(24)
return _hash_impl->hash_bytes_24(data)
hash: [24]byte
ctx: _tiger.Tiger_Context
ctx.ver = 1
_tiger.init(&ctx)
_tiger.update(&ctx, data)
_tiger.final(&ctx, hash[:])
return hash
}
// hash_stream_192 will read the stream in chunks and compute a
// hash from its contents
hash_stream_192 :: proc(s: io.Stream) -> ([24]byte, bool) {
_create_tiger_ctx(24)
return _hash_impl->hash_stream_24(s)
hash: [24]byte
ctx: _tiger.Tiger_Context
ctx.ver = 1
_tiger.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_tiger.update(&ctx, buf[:read])
}
}
_tiger.final(&ctx, hash[:])
return hash, true
}
// hash_file_192 will read the file provided by the given handle
// and compute a hash
hash_file_192 :: proc(hd: os.Handle, load_at_once := false) -> ([24]byte, bool) {
_create_tiger_ctx(24)
return _hash_impl->hash_file_24(hd, load_at_once)
if !load_at_once {
return hash_stream_192(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_192(buf[:]), ok
}
}
return [24]byte{}, false
}
hash_192 :: proc {
@@ -169,163 +193,21 @@ hash_192 :: proc {
hash_string_192,
}
hash_bytes_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
_tiger.update_odin(&c, data)
_tiger.final_odin(&c, hash[:])
}
return hash
}
/*
Low level API
*/
hash_stream_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_tiger.update_odin(&c, buf[:read])
}
}
_tiger.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
Tiger_Context :: _tiger.Tiger_Context
hash_file_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_odin_16(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_16(ctx, buf[:]), ok
}
}
return [16]byte{}, false
}
hash_bytes_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [20]byte {
hash: [20]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
_tiger.update_odin(&c, data)
_tiger.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([20]byte, bool) {
hash: [20]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_tiger.update_odin(&c, buf[:read])
}
}
_tiger.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
if !load_at_once {
return hash_stream_odin_20(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_20(ctx, buf[:]), ok
}
}
return [20]byte{}, false
}
hash_bytes_odin_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [24]byte {
hash: [24]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
_tiger.update_odin(&c, data)
_tiger.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([24]byte, bool) {
hash: [24]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_tiger.update_odin(&c, buf[:read])
}
}
_tiger.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([24]byte, bool) {
if !load_at_once {
return hash_stream_odin_24(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_24(ctx, buf[:]), ok
}
}
return [24]byte{}, false
}
@(private)
_create_tiger_ctx :: #force_inline proc(hash_size: int) {
ctx: _tiger.Tiger_Context
ctx.ver = 1
_hash_impl.internal_ctx = ctx
switch hash_size {
case 16: _hash_impl.hash_size = ._16
case 20: _hash_impl.hash_size = ._20
case 24: _hash_impl.hash_size = ._24
}
}
init :: proc(ctx: ^_tiger.Tiger_Context) {
ctx.ver = 1
_tiger.init(ctx)
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
#partial switch ctx.hash_size {
case ._16: _create_tiger_ctx(16)
case ._20: _create_tiger_ctx(20)
case ._24: _create_tiger_ctx(24)
}
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
}
}
update :: proc(ctx: ^_tiger.Tiger_Context, data: []byte) {
_tiger.update(ctx, data)
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.final_odin(&c, hash)
}
}
final :: proc(ctx: ^_tiger.Tiger_Context, hash: []byte) {
_tiger.final(ctx, hash)
}


@@ -6,7 +6,6 @@ package tiger2
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Interface for the Tiger2 variant of the Tiger hashing algorithm as defined in <https://www.cs.technion.ac.il/~biham/Reports/Tiger/>
*/
@@ -14,55 +13,8 @@ package tiger2
import "core:os"
import "core:io"
import "../_ctx"
import "../_tiger"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_16 = hash_bytes_odin_16
ctx.hash_file_16 = hash_file_odin_16
ctx.hash_stream_16 = hash_stream_odin_16
ctx.hash_bytes_20 = hash_bytes_odin_20
ctx.hash_file_20 = hash_file_odin_20
ctx.hash_stream_20 = hash_stream_odin_20
ctx.hash_bytes_24 = hash_bytes_odin_24
ctx.hash_file_24 = hash_file_odin_24
ctx.hash_stream_24 = hash_stream_odin_24
ctx.init = _init_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan does nothing, since Tiger2 is not available in Botan
@(warning="Tiger2 is not provided by the Botan API. Odin implementation will be used")
use_botan :: #force_inline proc() {
use_odin()
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -76,22 +28,46 @@ hash_string_128 :: proc(data: string) -> [16]byte {
// hash_bytes_128 will hash the given input and return the
// computed hash
hash_bytes_128 :: proc(data: []byte) -> [16]byte {
_create_tiger2_ctx(16)
return _hash_impl->hash_bytes_16(data)
hash: [16]byte
ctx: _tiger.Tiger_Context
ctx.ver = 2
_tiger.init(&ctx)
_tiger.update(&ctx, data)
_tiger.final(&ctx, hash[:])
return hash
}
// hash_stream_128 will read the stream in chunks and compute a
// hash from its contents
hash_stream_128 :: proc(s: io.Stream) -> ([16]byte, bool) {
_create_tiger2_ctx(16)
return _hash_impl->hash_stream_16(s)
hash: [16]byte
ctx: _tiger.Tiger_Context
ctx.ver = 2
_tiger.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_tiger.update(&ctx, buf[:read])
}
}
_tiger.final(&ctx, hash[:])
return hash, true
}
// hash_file_128 will read the file provided by the given handle
// and compute a hash
hash_file_128 :: proc(hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
_create_tiger2_ctx(16)
return _hash_impl->hash_file_16(hd, load_at_once)
if !load_at_once {
return hash_stream_128(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_128(buf[:]), ok
}
}
return [16]byte{}, false
}
hash_128 :: proc {
@@ -110,22 +86,46 @@ hash_string_160 :: proc(data: string) -> [20]byte {
// hash_bytes_160 will hash the given input and return the
// computed hash
hash_bytes_160 :: proc(data: []byte) -> [20]byte {
_create_tiger2_ctx(20)
return _hash_impl->hash_bytes_20(data)
hash: [20]byte
ctx: _tiger.Tiger_Context
ctx.ver = 2
_tiger.init(&ctx)
_tiger.update(&ctx, data)
_tiger.final(&ctx, hash[:])
return hash
}
// hash_stream_160 will read the stream in chunks and compute a
// hash from its contents
hash_stream_160 :: proc(s: io.Stream) -> ([20]byte, bool) {
_create_tiger2_ctx(20)
return _hash_impl->hash_stream_20(s)
hash: [20]byte
ctx: _tiger.Tiger_Context
ctx.ver = 2
_tiger.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_tiger.update(&ctx, buf[:read])
}
}
_tiger.final(&ctx, hash[:])
return hash, true
}
// hash_file_160 will read the file provided by the given handle
// and compute a hash
hash_file_160 :: proc(hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
_create_tiger2_ctx(20)
return _hash_impl->hash_file_20(hd, load_at_once)
if !load_at_once {
return hash_stream_160(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_160(buf[:]), ok
}
}
return [20]byte{}, false
}
hash_160 :: proc {
@@ -144,22 +144,46 @@ hash_string_192 :: proc(data: string) -> [24]byte {
// hash_bytes_192 will hash the given input and return the
// computed hash
hash_bytes_192 :: proc(data: []byte) -> [24]byte {
_create_tiger2_ctx(24)
return _hash_impl->hash_bytes_24(data)
hash: [24]byte
ctx: _tiger.Tiger_Context
ctx.ver = 2
_tiger.init(&ctx)
_tiger.update(&ctx, data)
_tiger.final(&ctx, hash[:])
return hash
}
// hash_stream_192 will read the stream in chunks and compute a
// hash from its contents
hash_stream_192 :: proc(s: io.Stream) -> ([24]byte, bool) {
_create_tiger2_ctx(24)
return _hash_impl->hash_stream_24(s)
hash: [24]byte
ctx: _tiger.Tiger_Context
ctx.ver = 2
_tiger.init(&ctx)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
_tiger.update(&ctx, buf[:read])
}
}
_tiger.final(&ctx, hash[:])
return hash, true
}
// hash_file_192 will read the file provided by the given handle
// and compute a hash
hash_file_192 :: proc(hd: os.Handle, load_at_once := false) -> ([24]byte, bool) {
_create_tiger2_ctx(24)
return _hash_impl->hash_file_24(hd, load_at_once)
if !load_at_once {
return hash_stream_192(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_192(buf[:]), ok
}
}
return [24]byte{}, false
}
hash_192 :: proc {
@@ -169,163 +193,21 @@ hash_192 :: proc {
hash_string_192,
}
hash_bytes_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [16]byte {
hash: [16]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
_tiger.update_odin(&c, data)
_tiger.final_odin(&c, hash[:])
}
return hash
}
/*
Low level API
*/
hash_stream_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([16]byte, bool) {
hash: [16]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_tiger.update_odin(&c, buf[:read])
}
}
_tiger.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
Tiger_Context :: _tiger.Tiger_Context
hash_file_odin_16 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([16]byte, bool) {
if !load_at_once {
return hash_stream_odin_16(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_16(ctx, buf[:]), ok
}
}
return [16]byte{}, false
}
hash_bytes_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [20]byte {
hash: [20]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
_tiger.update_odin(&c, data)
_tiger.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([20]byte, bool) {
hash: [20]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_tiger.update_odin(&c, buf[:read])
}
}
_tiger.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_20 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([20]byte, bool) {
if !load_at_once {
return hash_stream_odin_20(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_20(ctx, buf[:]), ok
}
}
return [20]byte{}, false
}
hash_bytes_odin_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [24]byte {
hash: [24]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
_tiger.update_odin(&c, data)
_tiger.final_odin(&c, hash[:])
}
return hash
}
hash_stream_odin_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([24]byte, bool) {
hash: [24]byte
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
_tiger.update_odin(&c, buf[:read])
}
}
_tiger.final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
hash_file_odin_24 :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([24]byte, bool) {
if !load_at_once {
return hash_stream_odin_24(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin_24(ctx, buf[:]), ok
}
}
return [24]byte{}, false
}
@(private)
_create_tiger2_ctx :: #force_inline proc(hash_size: int) {
ctx: _tiger.Tiger_Context
ctx.ver = 2
_hash_impl.internal_ctx = ctx
switch hash_size {
case 16: _hash_impl.hash_size = ._16
case 20: _hash_impl.hash_size = ._20
case 24: _hash_impl.hash_size = ._24
}
}
init :: proc(ctx: ^_tiger.Tiger_Context) {
ctx.ver = 2
_tiger.init(ctx)
}
@(private)
_init_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
#partial switch ctx.hash_size {
case ._16: _create_tiger2_ctx(16)
case ._20: _create_tiger2_ctx(20)
case ._24: _create_tiger2_ctx(24)
}
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.init_odin(&c)
}
}
update :: proc(ctx: ^_tiger.Tiger_Context, data: []byte) {
_tiger.update(ctx, data)
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(_tiger.Tiger_Context); ok {
_tiger.final_odin(&c, hash)
}
}
final :: proc(ctx: ^_tiger.Tiger_Context, hash: []byte) {
_tiger.final(ctx, hash)
}


@@ -6,7 +6,6 @@ package whirlpool
List of contributors:
zhibog, dotbmp: Initial implementation.
Jeroen van Rijn: Context design to be able to change from Odin implementation to bindings.
Implementation of the Whirlpool hashing algorithm, as defined in <https://web.archive.org/web/20171129084214/http://www.larc.usp.br/~pbarreto/WhirlpoolPage.html>
*/
@@ -14,48 +13,8 @@ package whirlpool
import "core:os"
import "core:io"
import "../botan"
import "../_ctx"
import "../util"
/*
Context initialization and switching between the Odin implementation and the bindings
*/
USE_BOTAN_LIB :: bool(#config(USE_BOTAN_LIB, false))
@(private)
_init_vtable :: #force_inline proc() -> ^_ctx.Hash_Context {
ctx := _ctx._init_vtable()
when USE_BOTAN_LIB {
use_botan()
} else {
_assign_hash_vtable(ctx)
}
return ctx
}
@(private)
_assign_hash_vtable :: #force_inline proc(ctx: ^_ctx.Hash_Context) {
ctx.hash_bytes_64 = hash_bytes_odin
ctx.hash_file_64 = hash_file_odin
ctx.hash_stream_64 = hash_stream_odin
ctx.update = _update_odin
ctx.final = _final_odin
}
_hash_impl := _init_vtable()
// use_botan assigns the internal vtable of the hash context to use the Botan bindings
use_botan :: #force_inline proc() {
botan.assign_hash_vtable(_hash_impl, botan.HASH_WHIRLPOOL)
}
// use_odin assigns the internal vtable of the hash context to use the Odin implementation
use_odin :: #force_inline proc() {
_assign_hash_vtable(_hash_impl)
}
/*
High level API
*/
@@ -69,22 +28,44 @@ hash_string :: proc(data: string) -> [64]byte {
// hash_bytes will hash the given input and return the
// computed hash
hash_bytes :: proc(data: []byte) -> [64]byte {
_create_whirlpool_ctx()
return _hash_impl->hash_bytes_64(data)
hash: [64]byte
ctx: Whirlpool_Context
// init(&ctx) No-op
update(&ctx, data)
final(&ctx, hash[:])
return hash
}
// hash_stream will read the stream in chunks and compute a
// hash from its contents
hash_stream :: proc(s: io.Stream) -> ([64]byte, bool) {
_create_whirlpool_ctx()
return _hash_impl->hash_stream_64(s)
hash: [64]byte
ctx: Whirlpool_Context
// init(&ctx) No-op
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = s->impl_read(buf)
if read > 0 {
update(&ctx, buf[:read])
}
}
final(&ctx, hash[:])
return hash, true
}
// hash_file will read the file provided by the given handle
// and compute a hash
hash_file :: proc(hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
_create_whirlpool_ctx()
return _hash_impl->hash_file_64(hd, load_at_once)
if !load_at_once {
return hash_stream(os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes(buf[:]), ok
}
}
return [64]byte{}, false
}
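With the Botan vtable switching gone, the high-level API above reduces to plain procedure calls. A minimal usage sketch, assuming the package is imported as `core:crypto/whirlpool` (the import path used by the test suite) and that `hash_bytes` behaves as defined above:

```odin
package example

import "core:fmt"
import "core:crypto/whirlpool"

main :: proc() {
	// One-shot hashing of a byte slice; returns a [64]byte digest.
	// No context setup or backend selection is needed anymore.
	digest := whirlpool.hash_bytes(transmute([]byte)string("abc"))
	fmt.printf("%x\n", digest)
}
```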
hash :: proc {
@@ -98,76 +79,103 @@ hash :: proc {
Low level API
*/
init :: proc(ctx: ^_ctx.Hash_Context) {
_hash_impl->init()
@(warning="Init is a no-op for Whirlpool")
init :: proc(ctx: ^Whirlpool_Context) {
// No action needed here
}
update :: proc(ctx: ^_ctx.Hash_Context, data: []byte) {
_hash_impl->update(data)
update :: proc(ctx: ^Whirlpool_Context, source: []byte) {
source_pos: int
nn := len(source)
source_bits := u64(nn * 8)
source_gap := u32((8 - (int(source_bits & 7))) & 7)
buffer_rem := uint(ctx.buffer_bits & 7)
b: u32
for i, carry, value := 31, u32(0), u32(source_bits); i >= 0 && (carry != 0 || value != 0); i -= 1 {
carry += u32(ctx.bitlength[i]) + (u32(value & 0xff))
ctx.bitlength[i] = byte(carry)
carry >>= 8
value >>= 8
}
for source_bits > 8 {
b = u32(u32((source[source_pos] << source_gap) & 0xff) | u32((source[source_pos+1] & 0xff) >> (8 - source_gap)))
ctx.buffer[ctx.buffer_pos] |= u8(b >> buffer_rem)
ctx.buffer_pos += 1
ctx.buffer_bits += int(8 - buffer_rem)
if ctx.buffer_bits == 512 {
transform(ctx)
ctx.buffer_bits = 0
ctx.buffer_pos = 0
}
ctx.buffer[ctx.buffer_pos] = byte(b << (8 - buffer_rem))
ctx.buffer_bits += int(buffer_rem)
source_bits -= 8
source_pos += 1
}
if source_bits > 0 {
b = u32((source[source_pos] << source_gap) & 0xff)
ctx.buffer[ctx.buffer_pos] |= byte(b) >> buffer_rem
    } else {
        b = 0
    }
if u64(buffer_rem) + source_bits < 8 {
ctx.buffer_bits += int(source_bits)
} else {
ctx.buffer_pos += 1
ctx.buffer_bits += 8 - int(buffer_rem)
source_bits -= u64(8 - buffer_rem)
if ctx.buffer_bits == 512 {
transform(ctx)
ctx.buffer_bits = 0
ctx.buffer_pos = 0
}
ctx.buffer[ctx.buffer_pos] = byte(b << (8 - buffer_rem))
ctx.buffer_bits += int(source_bits)
}
}
final :: proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
_hash_impl->final(hash)
}
final :: proc(ctx: ^Whirlpool_Context, hash: []byte) {
n := ctx
n.buffer[n.buffer_pos] |= 0x80 >> (uint(n.buffer_bits) & 7)
n.buffer_pos += 1
hash_bytes_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) -> [64]byte {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Whirlpool_Context); ok {
update_odin(&c, data)
final_odin(&c, hash[:])
}
return hash
}
if n.buffer_pos > 64 - 32 {
if n.buffer_pos < 64 {
for i := 0; i < 64 - n.buffer_pos; i += 1 {
n.buffer[n.buffer_pos + i] = 0
}
}
transform(ctx)
n.buffer_pos = 0
}
hash_stream_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, fs: io.Stream) -> ([64]byte, bool) {
hash: [64]byte
if c, ok := ctx.internal_ctx.(Whirlpool_Context); ok {
buf := make([]byte, 512)
defer delete(buf)
read := 1
for read > 0 {
read, _ = fs->impl_read(buf)
if read > 0 {
update_odin(&c, buf[:read])
}
}
final_odin(&c, hash[:])
return hash, true
} else {
return hash, false
}
}
if n.buffer_pos < 64 - 32 {
for i := 0; i < (64 - 32) - n.buffer_pos; i += 1 {
n.buffer[n.buffer_pos + i] = 0
}
}
n.buffer_pos = 64 - 32
hash_file_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hd: os.Handle, load_at_once := false) -> ([64]byte, bool) {
if !load_at_once {
return hash_stream_odin(ctx, os.stream_from_handle(hd))
} else {
if buf, ok := os.read_entire_file(hd); ok {
return hash_bytes_odin(ctx, buf[:]), ok
}
}
return [64]byte{}, false
}
for i := 0; i < 32; i += 1 {
n.buffer[n.buffer_pos + i] = n.bitlength[i]
}
transform(ctx)
@(private)
_create_whirlpool_ctx :: #force_inline proc() {
ctx: Whirlpool_Context
_hash_impl.internal_ctx = ctx
_hash_impl.hash_size = ._64
}
@(private)
_update_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, data: []byte) {
if c, ok := ctx.internal_ctx.(Whirlpool_Context); ok {
update_odin(&c, data)
}
}
@(private)
_final_odin :: #force_inline proc(ctx: ^_ctx.Hash_Context, hash: []byte) {
if c, ok := ctx.internal_ctx.(Whirlpool_Context); ok {
final_odin(&c, hash)
}
for i := 0; i < 8; i += 1 {
hash[i * 8] = byte(n.hash[i] >> 56)
hash[i * 8 + 1] = byte(n.hash[i] >> 48)
hash[i * 8 + 2] = byte(n.hash[i] >> 40)
hash[i * 8 + 3] = byte(n.hash[i] >> 32)
hash[i * 8 + 4] = byte(n.hash[i] >> 24)
hash[i * 8 + 5] = byte(n.hash[i] >> 16)
hash[i * 8 + 6] = byte(n.hash[i] >> 8)
hash[i * 8 + 7] = byte(n.hash[i])
}
}
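The low-level API above supports incremental hashing: `update` may be called repeatedly before a single `final`. A sketch under the same assumptions as the high-level example (package imported as `core:crypto/whirlpool`); note that `init` is a no-op for Whirlpool, so a zero-initialized context is ready to use:

```odin
package example

import "core:fmt"
import "core:crypto/whirlpool"

main :: proc() {
	// Zero-value context is valid; init(&ctx) would be a no-op.
	ctx: whirlpool.Whirlpool_Context

	// Feed the input in pieces; equivalent to hashing the concatenation.
	whirlpool.update(&ctx, transmute([]byte)string("The quick brown fox "))
	whirlpool.update(&ctx, transmute([]byte)string("jumps over the lazy dog"))

	digest: [64]byte
	whirlpool.final(&ctx, digest[:])
	fmt.printf("%x\n", digest)
}
```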
/*
@@ -774,97 +782,3 @@ transform :: proc (ctx: ^Whirlpool_Context) {
}
for i := 0; i < 8; i += 1 {ctx.hash[i] ~= state[i] ~ block[i]}
}
update_odin :: proc(ctx: ^Whirlpool_Context, source: []byte) {
source_pos: int
nn := len(source)
source_bits := u64(nn * 8)
source_gap := u32((8 - (int(source_bits & 7))) & 7)
buffer_rem := uint(ctx.buffer_bits & 7)
b: u32
for i, carry, value := 31, u32(0), u32(source_bits); i >= 0 && (carry != 0 || value != 0); i -= 1 {
carry += u32(ctx.bitlength[i]) + (u32(value & 0xff))
ctx.bitlength[i] = byte(carry)
carry >>= 8
value >>= 8
}
for source_bits > 8 {
b = u32(u32((source[source_pos] << source_gap) & 0xff) | u32((source[source_pos+1] & 0xff) >> (8 - source_gap)))
ctx.buffer[ctx.buffer_pos] |= u8(b >> buffer_rem)
ctx.buffer_pos += 1
ctx.buffer_bits += int(8 - buffer_rem)
if ctx.buffer_bits == 512 {
transform(ctx)
ctx.buffer_bits = 0
ctx.buffer_pos = 0
}
ctx.buffer[ctx.buffer_pos] = byte(b << (8 - buffer_rem))
ctx.buffer_bits += int(buffer_rem)
source_bits -= 8
source_pos += 1
}
if source_bits > 0 {
b = u32((source[source_pos] << source_gap) & 0xff)
ctx.buffer[ctx.buffer_pos] |= byte(b) >> buffer_rem
    } else {
        b = 0
    }
if u64(buffer_rem) + source_bits < 8 {
ctx.buffer_bits += int(source_bits)
} else {
ctx.buffer_pos += 1
ctx.buffer_bits += 8 - int(buffer_rem)
source_bits -= u64(8 - buffer_rem)
if ctx.buffer_bits == 512 {
transform(ctx)
ctx.buffer_bits = 0
ctx.buffer_pos = 0
}
ctx.buffer[ctx.buffer_pos] = byte(b << (8 - buffer_rem))
ctx.buffer_bits += int(source_bits)
}
}
final_odin :: proc(ctx: ^Whirlpool_Context, hash: []byte) {
n := ctx
n.buffer[n.buffer_pos] |= 0x80 >> (uint(n.buffer_bits) & 7)
n.buffer_pos += 1
if n.buffer_pos > 64 - 32 {
if n.buffer_pos < 64 {
for i := 0; i < 64 - n.buffer_pos; i += 1 {
n.buffer[n.buffer_pos + i] = 0
}
}
transform(ctx)
n.buffer_pos = 0
}
if n.buffer_pos < 64 - 32 {
for i := 0; i < (64 - 32) - n.buffer_pos; i += 1 {
n.buffer[n.buffer_pos + i] = 0
}
}
n.buffer_pos = 64 - 32
for i := 0; i < 32; i += 1 {
n.buffer[n.buffer_pos + i] = n.bitlength[i]
}
transform(ctx)
for i := 0; i < 8; i += 1 {
hash[i * 8] = byte(n.hash[i] >> 56)
hash[i * 8 + 1] = byte(n.hash[i] >> 48)
hash[i * 8 + 2] = byte(n.hash[i] >> 40)
hash[i * 8 + 3] = byte(n.hash[i] >> 32)
hash[i * 8 + 4] = byte(n.hash[i] >> 24)
hash[i * 8 + 5] = byte(n.hash[i] >> 16)
hash[i * 8 + 6] = byte(n.hash[i] >> 8)
hash[i * 8 + 7] = byte(n.hash[i])
}
}


@@ -33,7 +33,6 @@ import "core:crypto/tiger2"
import "core:crypto/gost"
import "core:crypto/streebog"
import "core:crypto/sm3"
import "core:crypto/skein"
import "core:crypto/jh"
import "core:crypto/groestl"
import "core:crypto/haval"
@@ -102,7 +101,6 @@ main :: proc() {
test_tiger2_160(&t)
test_tiger2_192(&t)
test_sm3(&t)
test_skein512(&t)
test_jh_224(&t)
test_jh_256(&t)
test_jh_384(&t)
@@ -171,13 +169,6 @@ test_md4 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
md4.use_botan()
for v, _ in test_vectors {
computed := md4.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -197,12 +188,6 @@ test_md5 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
md5.use_botan()
for v, _ in test_vectors {
computed := md5.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -225,15 +210,8 @@ test_sha1 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha1.use_botan()
for v, _ in test_vectors {
computed := sha1.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
test_sha224 :: proc(t: ^testing.T) {
// Test vectors from
@@ -250,12 +228,6 @@ test_sha224 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha2.use_botan()
for v, _ in test_vectors {
computed := sha2.hash_224(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -274,13 +246,6 @@ test_sha256 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha2.use_botan()
for v, _ in test_vectors {
computed := sha2.hash_256(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -299,13 +264,6 @@ test_sha384 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha2.use_botan()
for v, _ in test_vectors {
computed := sha2.hash_384(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -324,15 +282,8 @@ test_sha512 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha2.use_botan()
for v, _ in test_vectors {
computed := sha2.hash_512(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
test_sha3_224 :: proc(t: ^testing.T) {
// Test vectors from
@@ -353,12 +304,6 @@ test_sha3_224 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha3.use_botan()
for v, _ in test_vectors {
computed := sha3.hash_224(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -381,12 +326,6 @@ test_sha3_256 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha3.use_botan()
for v, _ in test_vectors {
computed := sha3.hash_256(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -409,12 +348,6 @@ test_sha3_384 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha3.use_botan()
for v, _ in test_vectors {
computed := sha3.hash_384(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -437,12 +370,6 @@ test_sha3_512 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha3.use_botan()
for v, _ in test_vectors {
computed := sha3.hash_512(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -457,12 +384,6 @@ test_shake_128 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha3.use_botan()
for v, _ in test_vectors {
computed := shake.hash_128(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -477,12 +398,6 @@ test_shake_256 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sha3.use_botan()
for v, _ in test_vectors {
computed := shake.hash_256(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -499,12 +414,6 @@ test_keccak_224 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
keccak.use_botan()
for v, _ in test_vectors {
computed := keccak.hash_224(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -521,12 +430,6 @@ test_keccak_256 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
keccak.use_botan()
for v, _ in test_vectors {
computed := keccak.hash_256(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -543,12 +446,6 @@ test_keccak_384 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
keccak.use_botan()
for v, _ in test_vectors {
computed := keccak.hash_384(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -565,12 +462,6 @@ test_keccak_512 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
keccak.use_botan()
for v, _ in test_vectors {
computed := keccak.hash_512(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -597,12 +488,6 @@ test_whirlpool :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
whirlpool.use_botan()
for v, _ in test_vectors {
computed := whirlpool.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -623,12 +508,6 @@ test_gost :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
gost.use_botan()
for v, _ in test_vectors {
computed := gost.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -643,12 +522,6 @@ test_streebog_256 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
streebog.use_botan()
for v, _ in test_vectors {
computed := streebog.hash_256(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -663,12 +536,6 @@ test_streebog_512 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
streebog.use_botan()
for v, _ in test_vectors {
computed := streebog.hash_512(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -738,12 +605,6 @@ test_blake2b :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
blake2b.use_botan()
for v, _ in test_vectors {
computed := blake2b.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -797,12 +658,6 @@ test_ripemd_160 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
ripemd.use_botan()
for v, _ in test_vectors {
computed := ripemd.hash_160(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -863,12 +718,6 @@ test_tiger_128 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
tiger.use_botan()
for v, _ in test_vectors {
computed := tiger.hash_128(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -889,12 +738,6 @@ test_tiger_160 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
tiger.use_botan()
for v, _ in test_vectors {
computed := tiger.hash_160(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -915,12 +758,6 @@ test_tiger_192 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
tiger.use_botan()
for v, _ in test_vectors {
computed := tiger.hash_192(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
@@ -979,26 +816,6 @@ test_sm3 :: proc(t: ^testing.T) {
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
sm3.use_botan()
for v, _ in test_vectors {
computed := sm3.hash(v.str)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)
test_skein512 :: proc(t: ^testing.T) {
test_vectors := [?]TestHash {
TestHash{"bc5b4c50925519c290cc634277ae3d6257212395cba733bbad37a4af0fa06af41fca7903d06564fea7a2d3730dbdb80c1f85562dfcc070334ea4d1d9e72cba7a", ""},
TestHash{"94c2ae036dba8783d0b3f7d6cc111ff810702f5c77707999be7e1c9486ff238a7044de734293147359b4ac7e1d09cd247c351d69826b78dcddd951f0ef912713", "The quick brown fox jumps over the lazy dog"},
}
skein.use_botan()
for v, _ in test_vectors {
computed := skein.hash_skein512(v.str, 64)
computed_str := hex_string(computed[:])
expect(t, computed_str == v.hash, fmt.tprintf("Expected: %s for input of %s, but got %s instead", v.hash, v.str, computed_str))
}
}
@(test)