Write Opaque Datatype #76

I need to add a byte[] as an OPAQUE dataset so it can be viewed as an image in h5web. Is there currently a way to write opaque datasets using PureHDF?

Comments
Hi, no, it is not yet possible, mainly because of a lack of use cases. Do you want to write a single opaque value (scalar) or an array of opaque values?
I'm currently writing a JPG image as a dataset and trying to visualize it with h5web, which only attempts to render a dataset as an image when it is opaque. I've also asked them whether they could implement a different way to visualize a dataset as an image.
Thanks, I'll have a look into it this evening.
Support for opaque has been added now in v1.0.0-beta.13:

var data = File.ReadAllBytes("/home/vincent/Downloads/img.jpg");

var opaqueInfo = new H5OpaqueInfo(
    TypeSize: (uint)data.Length,
    Tag: "My tag"
);

var file = new H5File
{
    ["opaque"] = new H5Dataset(data, opaqueInfo: opaqueInfo)
};

file.Write("/home/vincent/Downloads/testimg.h5");
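For anyone landing here later, one way to sanity-check the written file from C# itself is a read-back round trip. This is a minimal sketch: the write half mirrors the snippet above, while the read half assumes PureHDF's native read API (H5File.OpenRead, Dataset, Read<byte[]>), whose exact signatures may differ between beta versions — check the docs for your version.

using PureHDF;

// Write the image bytes as an opaque dataset (same calls as above).
var original = File.ReadAllBytes("/home/vincent/Downloads/img.jpg");

var writeFile = new H5File
{
    ["opaque"] = new H5Dataset(
        original,
        opaqueInfo: new H5OpaqueInfo(
            TypeSize: (uint)original.Length,
            Tag: "My tag"))
};

writeFile.Write("/home/vincent/Downloads/testimg.h5");

// Read back and verify the byte count survived the round trip.
// The read API used here is an assumption about PureHDF's native reader.
using var readFile = H5File.OpenRead("/home/vincent/Downloads/testimg.h5");
var roundTripped = readFile.Dataset("opaque").Read<byte[]>();

Console.WriteLine(roundTripped.Length == original.Length
    ? "Round trip OK"
    : $"Length mismatch: {roundTripped.Length} vs {original.Length}");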
Thanks a lot for the very quick turnaround on this :)
@Apollo3zehn So with larger opaque datasets I get the following error from h5web:

I'm not sure whether the issue comes from the way the HDF5 file is encoded on PureHDF's side or from h5web. I've tried 1 MB images here. h5py also isn't happy with it:
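The error text itself was not preserved in this thread, but the scenario is easy to reproduce without a real image: a synthetic payload of the same size written the same way should behave identically. A minimal sketch, reusing only the write API shown earlier; the file name and tag are placeholders:

using PureHDF;

// Reproduce the large-payload issue with ~1 MB of random bytes instead
// of a real JPG. Only the write API from the snippet above is used.
var payload = new byte[1024 * 1024];
new Random(42).NextBytes(payload); // deterministic stand-in for an image

var file = new H5File
{
    ["opaque"] = new H5Dataset(
        payload,
        opaqueInfo: new H5OpaqueInfo(
            TypeSize: (uint)payload.Length,
            Tag: "large payload"))
};

file.Write("large-opaque.h5"); // then open with h5web / h5py to check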
Two more problems I've encountered where I'm not sure whether it is a PureHDF or an h5web problem:

Writing this file (with a screenshot.png that is small enough that it doesn't directly break), one gets an error on opening the top group in h5web.

When leaving out the top opaque dataset, the dataset does not get written as opaque; instead it gets written as
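The file listing referenced above did not survive the scrape, so here is a hypothetical reconstruction of the layout being described, for anyone trying to reproduce: one opaque dataset at the root plus one inside a group. All names, paths, and the tag are assumptions, not the original code.

using PureHDF;

// Hypothetical reconstruction (original listing missing): an opaque
// dataset at the root plus another nested in a group. All names, paths,
// and the tag are assumptions.
var bytes = File.ReadAllBytes("screenshot.png");

var opaqueInfo = new H5OpaqueInfo(
    TypeSize: (uint)bytes.Length,
    Tag: "My tag");

var file = new H5File
{
    // Top-level opaque dataset: reported to break opening the root
    // group in h5web.
    ["top"] = new H5Dataset(bytes, opaqueInfo: opaqueInfo),

    // Nested opaque dataset: reported to lose its opaque type when the
    // top-level dataset is omitted.
    ["group"] = new H5Group
    {
        ["nested"] = new H5Dataset(bytes, opaqueInfo: opaqueInfo)
    }
};

file.Write("two-problems.h5");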