Hello all, question/remark regarding compound types.
If I create a compound type from h5py that looks like, say:

```
total size: 13 bytes
timestamp:  NATIVE_UINT64
market:     NATIVE_UINT8
price:      NATIVE_FLOAT
```
and create a dataset of this type and write a few values out (from a numpy recarray), I think/expect those will be saved to disk in the packed way described by the datatype, so each record uses 13 bytes.
If I then try to read this back into Go using go-hdf5, I can declare a struct that looks like:
```go
type tick struct {
	timestamp uint64
	market    uint8
	price     float32
}
```
which will generally not be 13 bytes but a few more, say 16, because of alignment of the members. So if I am going to read back into such a struct, I need to create a compound datatype with the same offsets/sizes as the struct, so HDF5 knows how to map the values from disk into memory.
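A quick way to see that layout is to print the struct's size and field offsets with the standard library's `unsafe` package. A minimal sketch; the exact numbers depend on the target architecture, and the comments assume amd64:

```go
package main

import (
	"fmt"
	"unsafe"
)

type tick struct {
	timestamp uint64
	market    uint8
	price     float32
}

func main() {
	// On amd64 the size is 16, not 13: the compiler inserts 3 bytes of
	// padding after market so that price lands on a 4-byte boundary.
	fmt.Println("size:     ", unsafe.Sizeof(tick{}))             // 16
	fmt.Println("timestamp:", unsafe.Offsetof(tick{}.timestamp)) // 0
	fmt.Println("market:   ", unsafe.Offsetof(tick{}.market))    // 8
	fmt.Println("price:    ", unsafe.Offsetof(tick{}.price))     // 12, but 9 in the packed file
}
```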
However, the API retrieves the datatype of the on-file dataset and passes that into the read call, along with the memory address of the beginning of my slice of structs. HDF5 then treats the packed 13-byte layout as if it were also the in-memory layout, so the values get mapped incorrectly.
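To make the mismatch concrete, here is a self-contained sketch with no HDF5 involved. It lays one record out the way the packed file stores it, then decodes the price field twice: once at the packed offset (9) and once at the struct offset (12), which is effectively what happens when the file datatype is reused as the memory datatype. The field values are made up for illustration:

```go
package main

import (
	"encoding/binary"
	"fmt"
	"math"
)

func main() {
	// One record in the packed on-file layout (13 bytes):
	// timestamp @ 0, market @ 8, price @ 9.
	buf := make([]byte, 16) // padded so the misread below stays in bounds
	binary.LittleEndian.PutUint64(buf[0:], 1234567890)
	buf[8] = 7
	binary.LittleEndian.PutUint32(buf[9:], math.Float32bits(99.5))

	// Decoding at the packed offsets recovers the record.
	fmt.Println("file offsets:  ",
		binary.LittleEndian.Uint64(buf[0:]), buf[8],
		math.Float32frombits(binary.LittleEndian.Uint32(buf[9:])))

	// Decoding price at the struct's offset reads the padding bytes
	// instead, and every following record shifts by 3 more bytes.
	fmt.Println("struct offsets:",
		binary.LittleEndian.Uint64(buf[0:]), buf[8],
		math.Float32frombits(binary.LittleEndian.Uint32(buf[12:])))
}
```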
I can modify the h5py dataset-creation side to use, say, all 4- or 8-byte datatypes, and with matching types on the Go side the read works again, but only by chance: with those field widths the packed file layout happens to coincide with Go's natural struct layout, so there is no padding to get wrong.
Is my understanding wrong, or does the API need some refining?
Thanks!!
Hello, thanks for coming back. Here is a repro: the Python script that creates the HDF5 file and the Go program that reads it. Obviously you need to adjust the file path if you are not on Windows.
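For reference, a minimal sketch of what the reading side looks like. It assumes the gonum.org/v1/hdf5 import path (adjust for older go-hdf5 installs) and uses placeholder file/dataset names ("ticks.h5", "ticks"), since the actual repro scripts are not reproduced here:

```go
package main

import (
	"fmt"

	"gonum.org/v1/hdf5"
)

type tick struct {
	timestamp uint64
	market    uint8
	price     float32
}

func main() {
	// "ticks.h5" and "ticks" are placeholder names.
	f, err := hdf5.OpenFile("ticks.h5", hdf5.F_ACC_RDONLY)
	if err != nil {
		panic(err)
	}
	defer f.Close()

	dset, err := f.OpenDataset("ticks")
	if err != nil {
		panic(err)
	}
	defer dset.Close()

	dims, _, err := dset.Space().SimpleExtentDims()
	if err != nil {
		panic(err)
	}

	// As described above, Read ends up describing the 16-byte in-memory
	// structs with the file's packed 13-byte datatype, so the values
	// come out scrambled.
	ticks := make([]tick, dims[0])
	if err := dset.Read(&ticks); err != nil {
		panic(err)
	}
	fmt.Println(ticks)
}
```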