
ReLU Threshold #47

Open
2minkyulee opened this issue Jan 9, 2023 · 1 comment

2minkyulee commented Jan 9, 2023

First of all, thank you for your great work.

In the paper (Section: Ablation on Simple-FFT-ReLU Stream), it is mentioned that different thresholds for ReLU can give better results. However, I cannot find its implementation in this repository.

I have two questions about the implementation.

  1. Does the ReLU affect the real and imaginary parts of the feature map independently?
  2. Does it zero out values under the threshold, or does it "clamp" the values?

e.g., given a value 101+99j where the ReLU threshold is 100(1+1j), which is the correct value after the ReLU layer?

  1. 0 + 0j (zero-out values, not independent)
  2. 0 + 99j (zero-out values, independent)
  3. 100 + 99j (clamp values, independent)

Thank you.

INVOKERer (Owner) commented

Thanks for your attention; we will organize the relevant code.
Answer:

  1. Yes, the ReLU affects the real and imaginary parts of the feature map independently.
  2. We use torch.clamp to implement this experiment. Given the value 101+99j, it becomes 101+100j (see the sketch below).
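
For clarity, here is a minimal sketch of the behaviour described above, assuming the threshold is applied to the real and imaginary parts independently via torch.clamp (values below the threshold are raised to it rather than zeroed). The function name and interface are hypothetical and not taken from this repository.

```python
import torch

def thresholded_relu_complex(x: torch.Tensor, threshold: float) -> torch.Tensor:
    # Apply the threshold to the real and imaginary parts independently,
    # clamping values below the threshold up to it (not zeroing them out).
    real = torch.clamp(x.real, min=threshold)
    imag = torch.clamp(x.imag, min=threshold)
    return torch.complex(real, imag)

# The example from this thread: 101+99j with a threshold of 100
x = torch.tensor([101.0 + 99.0j])
print(thresholded_relu_complex(x, threshold=100.0))  # tensor([101.+100.j])
```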
