
ELU activation function #17

Open
khedher1984 opened this issue Feb 11, 2020 · 1 comment

Comments

@khedher1984

Hello,
In research on the robustness of images to geometric transformations, we used your ERAN code, which enables verification via abstract interpretation. Here is our publication.
In the same application, we now need to evaluate another model that uses the ELU (Exponential Linear Unit) activation function; the currently available version of your code does not support it.
Could you please tell us whether your code can be used with an ELU activation function, or whether you plan to extend it to other activation functions?
Thank you in advance,
Yours sincerely.

@GgnDpSngh
Collaborator

Hi there,

Thanks for your interest in ERAN. We currently do not support the ELU activation function, as the networks we have tried did not require it. However, adding this support is straightforward, and we will let you know when it is available.

Cheers,
Gagandeep Singh
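
For reference, ELU follows the standard definition f(x) = x for x > 0 and f(x) = α(e^x − 1) otherwise. Below is a minimal NumPy sketch of the requested activation, not ERAN's implementation; the default α = 1.0 used here is the common convention, assumed for illustration:

    import numpy as np

    def elu(x, alpha=1.0):
        # Exponential Linear Unit: identity for positive inputs,
        # alpha * (exp(x) - 1) for non-positive inputs.
        x = np.asarray(x, dtype=float)
        # np.expm1 computes exp(x) - 1 with good precision near zero
        return np.where(x > 0, x, alpha * np.expm1(x))

    print(elu([-2.0, 0.0, 3.0]))  # approx. [-0.8647, 0.0, 3.0]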
