The Shivam fork works for me on GPU, but trying to test on CPU gives me this:
...
File "/home/nerdy/github/diffusers_shivam/src/diffusers/models/attention.py", line 273, in forward
hidden_states = xformers.ops.memory_efficient_attention(query, key, value)
File "/home/nerdy/anaconda3/envs/diffusers/lib/python3.9/site-packages/xformers/ops.py", line 568, in memory_efficient_attention
op = AttentionOpDispatch.from_arguments(
File "/home/nerdy/anaconda3/envs/diffusers/lib/python3.9/site-packages/xformers/ops.py", line 531, in op
raise NotImplementedError(f"No operator found for this attention: {self}")
NotImplementedError: No operator found for this attention: AttentionOpDispatch(dtype=torch.float32, device=device(type='cpu'), k=40, has_dropout=False, attn_bias_type=<class 'NoneType'>, kv_len=4096, q_len=4096)
Do I need to compile xformers differently, maybe? I'm guessing this version has something different.
"I used this revision of the xformers library: pip install git+https://github.com/facebookresearch/xformers@1d31a3a#egg=xformers"