<m3t4synt4ct1c>
I found that if I move the .stream down a line, it works fine, but I'm having trouble understanding why
scientes has quit [Ping timeout: 276 seconds]
Ichorio has quit [Ping timeout: 250 seconds]
meheleventyone has joined #zig
Hejsil has joined #zig
m3t4synt4ct1c has quit [Ping timeout: 276 seconds]
<Hejsil>
m3t4synt4ct1c, You're copying the stream out of its implementation. You should take the pointer to `stream`: `var b = &std.io.SliceInStream.init(a).stream;`
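A minimal sketch of the pointer-vs-copy distinction Hejsil is describing, using the 0.4-era std.io API (the variable names and the test itself are illustrative assumptions):

```zig
const std = @import("std");

test "take a pointer to the embedded stream" {
    const data = "hello";
    // Keep the SliceInStream itself in a variable, then point at its
    // embedded `stream` field instead of copying the stream out of it.
    var slice_in_stream = std.io.SliceInStream.init(data);
    const in_stream = &slice_in_stream.stream; // points back into slice_in_stream

    var buf: [5]u8 = undefined;
    const n = try in_stream.read(buf[0..]);
    std.debug.assert(n == 5);
}
```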
ysgard has joined #zig
Ichorio has joined #zig
meheleventyone has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
wb3 has joined #zig
<wb3>
hi, I'm trying to write a CSV parser for zig, but I can't figure out how to correctly do mapping from a []u8 into a struct, specifically how to intermix comptime type info and runtime data -- WIP file here: https://github.com/waiteb3/zig-csv/blob/master/src/main.zig#L55 -- on zig 0.4.0+cc8cf94c OSX
<wb3>
if anyone has an example of some form of raw input -> struct, or knows of a better approach, I'd appreciate it, but so far I haven't found any examples or anything in the stdlib
scientes has joined #zig
<Sahnvour>
wb3, if I understand correctly, you want to load values from a CSV field that you know has the same name as a struct field?
<Sahnvour>
or the same index
Hejsil has quit [Ping timeout: 256 seconds]
<wb3>
sahnvour, for now I'm working on the assumption that fields 0->N are your columns 0->N
<wb3>
going for the simplest, most assumption-heavy solution, but I can't figure out how to get around needing a comptime mapping on variable/runtime data
<very-mediocre>
wb3 `parse_csv_line` should have Row as the return type
<very-mediocre>
and you'd return an instance of Row which zig forces you to populate with values
<very-mediocre>
syntax for instantiation is like `return Row {.age = 1, .name="something"};`
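Putting very-mediocre's two points together, a skeleton might look like this (the `Row` fields come from the snippets above; the parsing itself is elided, so the values are placeholders):

```zig
const Row = struct {
    age: i64,
    name: []const u8,
};

// Return type is the struct itself; Zig requires every field to be
// populated when the instance is created.
fn parse_csv_line(line: []const u8) Row {
    // Real parsing of `line` elided in this sketch.
    return Row{ .age = 1, .name = "something" };
}
```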
<wb3>
ya, the issue is I've not been able to figure out how to have a function that takes `comptime T: type` and parses a runtime-defined, non-const `[]u8` generically
<very-mediocre>
the type is comptime known but not the instance values
<very-mediocre>
sorry if this sounds condescending, not sure how much you already know
<Sahnvour>
wb3, I think you could do `inline for(info.Struct.fields) |i, field| {...}` and inside your for loop, compare `i` to the index of the csv value you're reading; if they are equal, you could use https://ziglang.org/documentation/master/#field with `field.name` to set its value
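A rough sketch of that suggestion in 0.4-era syntax; `setCell` and `parseCell` are made-up helper names, and only integer and string fields are handled:

```zig
const std = @import("std");

// For each CSV cell (runtime index), the inline for unrolls one comparison
// per struct field; when the runtime cell index matches the comptime field
// index, @field assigns into that field by name.
fn setCell(comptime T: type, row: *T, cell_index: usize, cell: []const u8) !void {
    inline for (@typeInfo(T).Struct.fields) |field, i| {
        if (i == cell_index) {
            @field(row, field.name) = try parseCell(field.field_type, cell);
        }
    }
}

// Convert one cell's text into the field's type; the branch is selected at
// comptime because T is comptime-known.
fn parseCell(comptime T: type, cell: []const u8) !T {
    if (T == i64) {
        return std.fmt.parseInt(i64, cell, 10);
    } else if (T == []const u8) {
        return cell;
    } else {
        @compileError("unsupported field type");
    }
}
```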
<wb3>
yeah I do all that already, but when I try to do any slice processing, all of the integers used to navigate the `[]u8` have to be comptime, so it will only process the first char of a slice for all fields
<Sahnvour>
this is off the top of my head, but I hope the logic works, though you might encounter trouble with fields of different types
jjido has quit [Quit: Connection closed for inactivity]
<wb3>
I can't figure out how to do `inline for` to loop the fields from the @typeInfo and iterate the slice at runtime
<mikdusan>
wb3: did i miss something; are you writing a comptime parser for csv or runtime?
<Sahnvour>
the first snippet I wrote should be pretty close to working
<wb3>
@mikdusan runtime []u8, comptime T: type (struct)
<wb3>
the JSON parser in the stdlib uses a custom JSON tree/type, vs any arbitrary T: type
<wb3>
I don't want to prematurely state that the comptime/type-reflection isn't capable of this, but it is beyond my current approach or understanding
<Sahnvour>
wb3, doing the inline for would basically generate code that does "if fieldIdx == i, then assign to field" for every field of the struct (and associated i counter)
<Sahnvour>
doing that at every CSV value would allow you to fill your struct
<wb3>
so I have code that constructs a struct, and using @typeInfo(T).Struct.fields, it fills out the values, but it fills it in with junk since the ints used to iterate the slice have to be comptime for the lib to compile
<wb3>
so if they're comptime, start/end just end up being equal to `fieldIdx` since they're processed at comptime, so it parses a 0-length slice
<wb3>
let me try your snippet
<very-mediocre>
wb3 `comptime` is basically the static context in other languages; what Sahnvour has suggested is code generation which, at comptime, tailors your function so the incoming values map to T's fields
<Sahnvour>
wb3, it actually won't work; I missed that fieldIdx was comptime. It should instead be incremented at each ',' you encounter, to count the number of fields read from the CSV
<Sahnvour>
and I think you can get `start` and `end` from the reading too, no need to have them comptime
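A sketch of that runtime scan, building on the hypothetical `setCell` above: `start` and the loop index are plain runtime integers, and `field_idx` is bumped at every ',' (no quoting or escaping handled):

```zig
fn parseLine(comptime T: type, line: []const u8) !T {
    var row: T = undefined;
    var field_idx: usize = 0;
    var start: usize = 0;
    var i: usize = 0;
    // Walk the line at runtime; every ',' (and the end of the line)
    // closes the current cell.
    while (i <= line.len) : (i += 1) {
        if (i == line.len or line[i] == ',') {
            try setCell(T, &row, field_idx, line[start..i]);
            field_idx += 1;
            start = i + 1;
        }
    }
    return row;
}
```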
<wb3>
so `fieldIdx` has to be comptime in this snippet, I think this is because Zig is erasing the info at runtime (it just codegens a function instance per type?)
<very-mediocre>
ok i get the problem now, could you invert the loop?
<very-mediocre>
loop through the fields with `inline for`
<very-mediocre>
actually that sucks, my bad
<wb3>
it's ok, it may just not be a flexible enough system yet
<wb3>
I'm going to try and write an actual, simpler `parse_line` and share that if I can't get past it again
<very-mediocre>
there may be potential in having the outer loop be `inline for` and getting rid of fieldIdx
<very-mediocre>
or even just use it to codegen the field contents in a manner that is later accessible at runtime
<wb3>
I was thinking of the latter, but I hit a few different snags around runtime vs comptime declarations
<wb3>
something like `parse_line(comptime T: type, [@typeInfo(T).Struct.fields.len][]u8 cells)` but I couldn't make the compiler happy on that path
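For reference, the signature wb3 describes can be written by deriving the array length from the struct at comptime while the cell contents stay runtime data. This sketch reuses the hypothetical `parseCell` from above and assumes the line has already been split into cells:

```zig
fn parseCells(comptime T: type, cells: [@typeInfo(T).Struct.fields.len][]const u8) !T {
    var row: T = undefined;
    // The cell count is comptime-known, so a plain inline for maps
    // cell i to field i directly.
    inline for (@typeInfo(T).Struct.fields) |field, i| {
        @field(row, field.name) = try parseCell(field.field_type, cells[i]);
    }
    return row;
}
```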
<very-mediocre>
i think this might be doable using break:
<Sahnvour>
wb3, why does fieldIdx have to be comptime?
<very-mediocre>
@typeInfo somehow needs that
<very-mediocre>
i ran into that earlier as well, not sure if bug or by design
<wb3>
compiler says so, I assume because it needs to be reified and is erased before runtime since Zig does codegen for its generics & reflection
Ichorio has quit [Read error: Connection reset by peer]
<Sahnvour>
wb3, in the first for loop, I inverted `char` and `i`. Writing them in the correct order should help for a start
<wb3>
didn't think that I'd have to loop on the fields on each iteration
<Sahnvour>
great! I was going to propose something like that :)
<Sahnvour>
yeah that's kind of a workaround
m3t4synt4ct1c has joined #zig
<wb3>
cool, I probably should check if there's an issue on this limitation, feels like it's unintended
<wb3>
gonna figure out floats and strings later
<wb3>
thanks @Sahnvour and @very-mediocre for the help
moo has quit [Quit: Leaving]
<m3t4synt4ct1c>
I'm not sure if I'm understanding how error merging works properly.. I am writing an InStream that takes ReadError as a comptime parameter just like InStream does, then I add a new error set and use || to merge the two error sets. When I try to create a new instance of InStream using that merged set of errors, the compiler errors seem to only see the new set of errors, not the original one
<Sahnvour>
wb3, to access a struct field by its index fully at runtime, zig would have to embed some metadata about the type
<m3t4synt4ct1c>
I'm trying to create a GzipInstream that takes in an InStream, parses and decompresses data from it and is itself an InStream, but it contains the error set of the original input stream plus parsing errors
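A rough sketch of that shape with the 0.4-era std.io generics; `GzipError`, the field names, and the pass-through `readFn` are assumptions, and the actual decompression is omitted:

```zig
const std = @import("std");

pub const GzipError = error{
    InvalidGzipHeader,
    CorruptData,
};

pub fn GzipInStream(comptime ReadError: type) type {
    return struct {
        const Self = @This();

        // Merge the wrapped stream's errors with the parsing errors and
        // derive the exposed stream type from that merged set.
        pub const Error = ReadError || GzipError;
        pub const Stream = std.io.InStream(Error);

        stream: Stream,
        source: *std.io.InStream(ReadError),

        pub fn init(source: *std.io.InStream(ReadError)) Self {
            return Self{
                .stream = Stream{ .readFn = readFn },
                .source = source,
            };
        }

        fn readFn(in_stream: *Stream, buffer: []u8) Error!usize {
            const self = @fieldParentPtr(Self, "stream", in_stream);
            // Decompression omitted; just forward the read in this sketch.
            return self.source.read(buffer);
        }
    };
}
```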
slugm has quit [Remote host closed the connection]
wb3 has quit [Ping timeout: 256 seconds]
very-mediocre has quit [Ping timeout: 256 seconds]
avoidr has joined #zig
<m3t4synt4ct1c>
wrestling with error types in stream-related code is pretty maddening
fengb_ has quit [Quit: Page closed]
<m3t4synt4ct1c>
what does this mean? error: expected type 'fn(*std.io.InStream(std.io.Error), []u8) std.io.Error!usize', found 'fn(*std.io.InStream(std.io.Error), []u8) @typeOf(GzipInStream(std.io.Error).readFn).ReturnType.ErrorSet!usize' on the line `.stream = Stream{ .readFn = readFn },`
m3t4synt4ct1c has quit [Ping timeout: 250 seconds]
Sahnvour has quit [Read error: Connection reset by peer]
halosghost has quit [Quit: WeeChat 2.4]
bbrittain is now known as bwb_
<emekankurumeh[m]>
i think `readFn` needs to return a std.io.Error!usize, but your return type is different