This came to my attention via the following error:
```
File /local/scratch/BrainStat/brainstat/stats/SLM.py:325, in SLM._surfstat_to_brainstat_rft(self)
    323 yeo_names, _ = fetch_yeo_networks_metadata(7)
    324 yeo_names.insert(0, "Undefined")
--> 325 yeo7_index = yeo7[self.P["peak"]["vertid"][i]]
    326 if "yeo7" not in self.P["peak"]:
    327     self.P["peak"]["yeo7"] = []
IndexError: index 64984 is out of bounds for axis 0 with size 64984
```
Full Error
```
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Cell In[14], line 36
     22 interact = {
     23     "model": term_ses * term_panss + term_sub,
     24     "contrast": (ses_post * df["panss-ratio"]) - (ses_pre * df["panss-ratio"]),
     25     "label": "intr",
     26 }
     27 slm = SLM(
     28     ses["model"],
     29     ses["contrast"] * -1,
    (...)
     34     two_tailed=False,
     35 )
---> 36 slm.fit(np.asanyarray(surface.sel(desc="thickness", smoothing=8)))

File /local/scratch/BrainStat/brainstat/stats/SLM.py:158, in SLM.fit(self, Y)
    156 self._unmask()
    157 if self.correction is not None:
--> 158     self.multiple_comparison_corrections(student_t_test)

File /local/scratch/BrainStat/brainstat/stats/SLM.py:254, in SLM.multiple_comparison_corrections(self, student_t_test)
    251     self.P[field][field2] = [self.P[field][field2]]
    252 self.Q = Q1
--> 254 self._surfstat_to_brainstat_rft()

File /local/scratch/BrainStat/brainstat/stats/SLM.py:325, in SLM._surfstat_to_brainstat_rft(self)
    323 yeo_names, _ = fetch_yeo_networks_metadata(7)
    324 yeo_names.insert(0, "Undefined")
--> 325 yeo7_index = yeo7[self.P["peak"]["vertid"][i]]
    326 if "yeo7" not in self.P["peak"]:
    327     self.P["peak"]["yeo7"] = []
IndexError: index 64984 is out of bounds for axis 0 with size 64984
```
From what I can tell, `"clustid"` and `"vertid"` are saved as 1-based indices in `brainstat.stats._multiple_comparisons::peak_clus`, but at `brainstat/stats/SLM.py:325` they are used directly to index a numpy array, which is 0-based. In my example, I happened to have a peak on the very last vertex of the mesh, which triggered the `IndexError`. But of course, this also means the wrong networks can be indexed even when the code runs without error.
I'm not familiar enough with the code to know whether this problem crops up elsewhere. For what it's worth, it's definitely surprising to me as a Python user that the `vertid`s are 1-based, although it's probably impossible to change this now. But I think it should at least be prominently documented.
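To illustrate the off-by-one, here is a minimal sketch with synthetic data (not BrainStat's actual arrays; `yeo7` here is just a stand-in label array with one entry per vertex):

```python
import numpy as np

# Stand-in for the Yeo-7 network label array: one label per mesh vertex.
n_vertices = 64984
yeo7 = np.zeros(n_vertices, dtype=int)

# peak_clus stores vertex ids 1-based, so a peak on the last vertex
# of the mesh is reported as 64984 ...
vertid = 64984

# ... which overflows a 0-based numpy array of size 64984:
try:
    yeo7[vertid]
except IndexError as e:
    print(e)  # index 64984 is out of bounds for axis 0 with size 64984

# Converting to 0-based before indexing avoids both the crash and the
# silent one-off network assignment on interior peaks:
label = yeo7[vertid - 1]
```

The same conversion would need to happen anywhere a 1-based `vertid`/`clustid` is used as a numpy index.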
Hopefully this is sufficient information; it's a bit tricky to make a minimal example, but I can try to pull one together if necessary.
As mentioned, I'm really not sure how to make a minimal example, since this bug only appears when there is a significant peak on the very last index of the brain mesh. I'd have to construct fake data that meets this criterion, and I truthfully don't have time to do that.
I do have a small patch I've been using that at least clears the error: pvandyken@dd9be62. Feel free to build off that.