Error in reading NWB files #1131
We should raise an informative error message in this case, suggesting conversion to a later NWB format.
The error does not come from Neo, because the files are also not readable with PyNWB.
We received an answer from Thomas Braun: "If you open the file with a plain hdf5 reader, you see that the root group has a dataset nwb_version which says "NWB-1.0.5". So this is an NWBv1 file. pynwb can only read NWBv2 files. There is an open issue https://github.com/NeurodataWithoutBorders/pynwb/issues/1077 which wants to give a better error message in pynwb. But this looks stalled." As NWBv1 is not implemented in Neo, I propose adding an error message advising the user to convert to a later NWB format, as @JuliaSprenger suggests. See PR #1165
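For anyone who wants to run Thomas Braun's check themselves, a minimal sketch with h5py (the two storage layouts assumed here come from his description: NWBv1 stores `nwb_version` as a root dataset, NWBv2 as a root attribute):

```python
import h5py

def get_nwb_version(path):
    """Return the NWB format version stored in the file, or None.

    NWBv1 files store the version as a root-level dataset named
    'nwb_version'; NWBv2 files store it as a root attribute of the
    same name.
    """
    with h5py.File(path, "r") as f:
        if "nwb_version" in f:          # NWBv1 layout (dataset)
            version = f["nwb_version"][()]
        elif "nwb_version" in f.attrs:  # NWBv2 layout (attribute)
            version = f.attrs["nwb_version"]
        else:
            return None
        # h5py may return bytes for stored strings
        if isinstance(version, bytes):
            version = version.decode()
        return version
```

A file reporting "NWB-1.0.5" is an NWBv1 file and cannot be read by PyNWB or Neo.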
Thanks for the update @legouee.
Tried with two sample files:
Installed Neo using the master branch (June 22: commit) and tried the following using Python 3.8:
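The snippet itself did not survive the copy-paste; from the traceback it was presumably along these lines (a hypothetical reconstruction: the filename is taken from the warning below, and the call matches the `reader.read_all_blocks()` frame in the traceback):

```python
from neo.io import NWBIO

# Hypothetical reconstruction of the failing snippet; requires the
# sample file 525011725_ephys.nwb to be present locally.
reader = NWBIO("525011725_ephys.nwb", mode="r")
reader.read_all_blocks()
```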
Error:
/home/shailesh/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:192: UserWarning: No cached namespaces found in 525011725_ephys.nwb
warnings.warn(msg)
ValueError Traceback (most recent call last)
Input In [3], in <cell line: 1>()
----> 1 reader.read_all_blocks()
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/neo/io/nwbio.py:259, in NWBIO.read_all_blocks(self, lazy, **kwargs)
256 assert self.nwb_file_mode in ('r',)
257 io = pynwb.NWBHDF5IO(self.filename, mode=self.nwb_file_mode,
258 load_namespaces=True) # Open a file with NWBHDF5IO
--> 259 self._file = io.read()
261 self.global_block_metadata = {}
262 for annotation_name in GLOBAL_ANNOTATIONS:
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:447, in HDF5IO.read(self, **kwargs)
444 raise UnsupportedOperation("Cannot read from file %s in mode '%s'. Please use mode 'r', 'r+', or 'a'."
445 % (self.source, self.__mode))
446 try:
--> 447 return call_docval_func(super().read, kwargs)
448 except UnsupportedOperation as e:
449 if str(e) == 'Cannot build data. There are no values.': # pragma: no cover
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/utils.py:434, in call_docval_func(func, kwargs)
432 def call_docval_func(func, kwargs):
433 fargs, fkwargs = fmt_docval_args(func, kwargs)
--> 434 return func(*fargs, **fkwargs)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/utils.py:593, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
591 def func_call(*args, **kwargs):
592 pargs = _check_args(args, kwargs)
--> 593 return func(args[0], **pargs)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/io.py:38, in HDMFIO.read(self, **kwargs)
     35 @docval(returns='the Container object that was read in', rtype=Container)
36 def read(self, **kwargs):
37 """Read a container from the IO source."""
---> 38 f_builder = self.read_builder()
39 if all(len(v) == 0 for v in f_builder.values()):
40 # TODO also check that the keys are appropriate. print a better error message
41 raise UnsupportedOperation('Cannot build data. There are no values.')
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/utils.py:593, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
591 def func_call(*args, **kwargs):
592 pargs = _check_args(args, kwargs)
--> 593 return func(args[0], **pargs)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:471, in HDF5IO.read_builder(self)
469 ignore.add(self.__file[specloc].name)
470 if f_builder is None:
--> 471 f_builder = self.__read_group(self.__file, ROOT_NAME, ignore=ignore)
472 self.__read[self.__file] = f_builder
473 return f_builder
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:605, in HDF5IO.__read_group(self, h5obj, name, ignore)
603 obj_type = kwargs['groups']
604 if builder is None:
--> 605 builder = read_method(sub_h5obj)
606 self.__set_built(sub_h5obj.file.filename, sub_h5obj.id, builder)
607 obj_type[builder.name] = builder
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:605, in HDF5IO.__read_group(self, h5obj, name, ignore)
603 obj_type = kwargs['groups']
604 if builder is None:
--> 605 builder = read_method(sub_h5obj)
606 self.__set_built(sub_h5obj.file.filename, sub_h5obj.id, builder)
607 obj_type[builder.name] = builder
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:605, in HDF5IO.__read_group(self, h5obj, name, ignore)
603 obj_type = kwargs['groups']
604 if builder is None:
--> 605 builder = read_method(sub_h5obj)
606 self.__set_built(sub_h5obj.file.filename, sub_h5obj.id, builder)
607 obj_type[builder.name] = builder
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/backends/hdf5/h5tools.py:613, in HDF5IO.__read_group(self, h5obj, name, ignore)
611 continue
612 kwargs['source'] = h5obj.file.filename
--> 613 ret = GroupBuilder(name, **kwargs)
614 ret.location = os.path.dirname(h5obj.name)
615 self.__set_written(ret)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/utils.py:593, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
591 def func_call(*args, **kwargs):
592 pargs = _check_args(args, kwargs)
--> 593 return func(args[0], **pargs)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/build/builders.py:159, in GroupBuilder.__init__(self, **kwargs)
157 for dataset in datasets:
158 if dataset is not None:
--> 159 self.set_dataset(dataset)
160 for link in links:
161 self.set_link(link)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/utils.py:593, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
591 def func_call(*args, **kwargs):
592 pargs = _check_args(args, kwargs)
--> 593 return func(args[0], **pargs)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/build/builders.py:226, in GroupBuilder.set_dataset(self, **kwargs)
224 """Add a dataset to this group."""
225 builder = getargs('builder', kwargs)
--> 226 self.__set_builder(builder, GroupBuilder.__dataset)
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/build/builders.py:236, in GroupBuilder.__set_builder(self, builder, obj_type)
234 def __set_builder(self, builder, obj_type):
235 name = builder.name
--> 236 self.__check_obj_type(name, obj_type)
237 # if child builder already exists (e.g., read from file), do not reset it.
238 # resetting the child builder will change the python object ID / hash of the child builder
239 # and make the IO backend think that the child builder has not yet been written.
240 if self.get(name) == builder:
File ~/.virtualenvs/py3env/lib/python3.8/site-packages/hdmf/build/builders.py:213, in GroupBuilder.__check_obj_type(self, name, obj_type)
210 def __check_obj_type(self, name, obj_type):
211 # check that the name is not associated with a different object type in this group
212 if name in self.obj_type and self.obj_type[name] != obj_type:
--> 213 raise ValueError("'%s' already exists in %s.%s, cannot set in %s."
214 % (name, self.name, self.obj_type[name], obj_type))
ValueError: 'comments' already exists in Sweep_10.attributes, cannot set in datasets.
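The guard proposed above (raising an informative error instead of letting PyNWB fail deep inside hdmf) might look roughly like the sketch below. The function name and message wording are placeholders, not the actual code from PR #1165:

```python
import h5py

def check_nwb_version(filename):
    """Raise an informative error for NWBv1 files, which neither Neo
    nor PyNWB can read. Hypothetical sketch of the proposed fix.
    """
    with h5py.File(filename, "r") as f:
        # NWBv1 stores the version as a root dataset;
        # NWBv2 stores it as a root attribute.
        if "nwb_version" in f:
            version = f["nwb_version"][()]
            if isinstance(version, bytes):
                version = version.decode()
            if version.startswith("NWB-1"):
                raise ValueError(
                    f"{filename} is an NWBv1 file ({version}). "
                    "Neo supports only NWBv2; please convert the file "
                    "to a later NWB format."
                )
```

Called at the top of `NWBIO.read_all_blocks`, this would turn the opaque `ValueError` above into a message that tells the user what to do.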
@apdavison suggested creating a ticket for this issue.
@legouee : could you take a look at this? Thanks.