I've been trying to run your code for several days now without success so I'd like to ask for some help.
You did not specify the data preprocessing steps, so there are no instructions on how to prepare the data for your scripts (learn_base.py, temp_exp_main.py, enhance_main.py).
After placing the raw files into the 'processed' folder, running learn_base.py was no problem.
I also managed to generate .h5 files for the explainer script (temp_exp_main.py) but then I got stuck.
Could you please provide instructions for preparing the processed/wikipedia_train_edge.npy and processed/wikipedia_test_edge.npy files that temp_exp_main.py takes as input? I looked through all of your source code, but it does not contain anything for generating or exporting these files.
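For reference, here is the kind of preparation I attempted myself. This is only a guess on my part: I assumed the edge files are NumPy arrays of (source, destination, timestamp) rows, split chronologically into train and test portions. The column layout, the 70/30 split ratio, and the dummy data below are all my assumptions, not anything from your code.

```python
import os
import numpy as np

def split_edges(edges, train_frac=0.7):
    """Chronologically split an edge array on its timestamp column (col 2).

    Assumes rows are (source, destination, timestamp); both the layout
    and the split ratio are guesses about what temp_exp_main.py expects.
    """
    order = np.argsort(edges[:, 2])            # sort rows by timestamp
    edges = edges[order]
    cut = int(len(edges) * train_frac)         # earliest 70% -> train
    return edges[:cut], edges[cut:]

# Stand-in for the real Wikipedia interaction data: 100 random edges.
rng = np.random.default_rng(0)
edges = np.column_stack([
    rng.integers(0, 50, 100).astype(float),    # source node ids
    rng.integers(0, 50, 100).astype(float),    # destination node ids
    rng.uniform(0, 1000, 100),                 # interaction timestamps
])

train, test = split_edges(edges)

os.makedirs("processed", exist_ok=True)
np.save("processed/wikipedia_train_edge.npy", train)
np.save("processed/wikipedia_test_edge.npy", test)
```

With this I can produce files named like the ones your script loads, but I have no way to verify the array layout matches what temp_exp_main.py actually expects.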
Your help would be much appreciated!
Thanks,
Ferenc
I am also struggling with this. In general, the code seems quite incomplete, which makes reproducing the study challenging. It would be great if the authors could supply a usable version of the codebase together with documentation.