AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention' #7
Comments
Use TensorFlow version 1.0.0; the code will only work with that version.
I am also having the same issue. Do I need to downgrade the TF version (I am on 1.3.0), or is there some other solution?
Same issue.
You have to downgrade; downgrading worked for me.
pip could not find TensorFlow version 1.0:
C:\Users\eratsau>pip install tensorflow==1.0
Use Python version 3.5 (upgrade or downgrade your Python version accordingly).
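For reference, a minimal sketch of the downgrade path, assuming conda is available; the environment name `tf100` is just an example, and the Python version matters because the `tensorflow==1.0.0` wheels are only published for certain interpreters (Python 3.5 on Windows):

```sh
# Create an isolated environment with Python 3.5, since the
# tensorflow==1.0.0 wheels are not published for newer Python versions.
conda create -n tf100 python=3.5
conda activate tf100

# Install the old TensorFlow release this repository was written against.
pip install tensorflow==1.0.0
```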
What is the alternative for tf.contrib.seq2seq.prepare_attention() in the TensorFlow 1.8 API?
I am unable to downgrade TensorFlow 1.9 to 1.0 on Anaconda.
Is there any other way to resolve this issue without downgrading TensorFlow?
Is there any other workaround without having to downgrade my TensorFlow?
Is there any other workaround other than downgrading TensorFlow?
What is the alternative for tensorflow.contrib.seq2seq.prepare_attention() in TensorFlow 1.13 or 1.14?
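In TensorFlow 1.1 and later, prepare_attention() and the attention_decoder_fn_* helpers no longer exist in tf.contrib.seq2seq; the replacement pattern is to wrap the decoder cell in tf.contrib.seq2seq.AttentionWrapper around an attention mechanism such as BahdanauAttention or LuongAttention. Below is a minimal sketch of that pattern, not the code from this repository: the placeholder tensors, sizes, and variable names (rnn_size, encoder_outputs, etc.) are illustrative stand-ins.

```python
import tensorflow as tf

# Illustrative sizes only; the real model's hyperparameters live elsewhere.
rnn_size, embed_size = 512, 128

# Stand-ins for tensors the original model builds earlier.
encoder_outputs = tf.placeholder(tf.float32, [None, None, rnn_size])  # encoder hidden states
source_lengths = tf.placeholder(tf.int32, [None])                     # source sequence lengths
decoder_inputs = tf.placeholder(tf.float32, [None, None, embed_size]) # embedded target inputs
target_lengths = tf.placeholder(tf.int32, [None])                     # target sequence lengths
batch_size = tf.shape(encoder_outputs)[0]

# The attention mechanism replaces prepare_attention()'s keys/values/score setup.
attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
    num_units=rnn_size,
    memory=encoder_outputs,
    memory_sequence_length=source_lengths)

# Wrapping the decoder cell applies attention at each step, replacing the old
# attention_decoder_fn_train / attention_decoder_fn_inference helpers.
decoder_cell = tf.contrib.rnn.BasicLSTMCell(rnn_size)
attention_cell = tf.contrib.seq2seq.AttentionWrapper(
    decoder_cell, attention_mechanism, attention_layer_size=rnn_size)

# Standard training decoder on top of the attention-wrapped cell.
helper = tf.contrib.seq2seq.TrainingHelper(decoder_inputs, target_lengths)
decoder = tf.contrib.seq2seq.BasicDecoder(
    attention_cell, helper,
    initial_state=attention_cell.zero_state(batch_size, tf.float32))
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
```

Note that tf.contrib was removed entirely in TensorFlow 2.x, so for anything newer than 1.15 the equivalent classes live in the TensorFlow Addons package (tfa.seq2seq).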
This part of the code:
Fails with the following error. Has anyone gotten the code to work with TF 1.2.1 or 1.3?