build_waitk_mask does not exist #1
Comments
Sorry, I just saw your issue. This function was deleted by mistake when standardizing the code. I've corrected it in the new commit.
@zhangshaolei1998 Thanks a lot for your reply. I noticed it soon after. That problem is solved and training works well. But there is some trouble with evaluation, which I'll comment on here.
=> Should it be commented out like this?
=> I don't know how to solve this. It seems the same error has been reported as an issue in the fairseq repo.
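For context, the dtype error discussed below typically means a float (0/1) tensor is being passed to `masked_fill`, which in recent PyTorch versions only accepts boolean masks. A minimal generic illustration (not DiSeg's actual code) of the failure mode and the usual cast-to-bool fix:

```python
import torch

scores = torch.zeros(2, 3)
# A 0/1 float mask, as older fairseq-style code often builds it.
float_mask = torch.tensor([[0., 1., 0.],
                           [1., 0., 0.]])

# scores.masked_fill(float_mask, ...) would raise:
#   masked_fill only supports boolean masks, but got dtype Float
# Casting the mask to bool resolves it:
masked = scores.masked_fill(float_mask.bool(), float("-inf"))
```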
Thank you so much for your work and for making it openly available.
I followed the Quick Start in the README. While training, the program fails at the line below, apparently because the `build_waitk_mask` function does not exist. When I commented out this `else` scope, training ran without stopping, but I cannot tell whether that is the correct fix.
DiSeg/fairseq/modules/waitseg_multihead_attention.py
Line 337 in feff06a
I searched ictnlp's repositories and found similar code in the MoE-Waitk repo:
https://github.com/ictnlp/MoE-Waitk/blob/6f8ca9834c2ab77785ebd93fd569f73c3819340b/fairseq/modules/moe_waitk_multihead_attention.py#L400
I copied that `build_waitk_mask` function, but I got the error `masked_fill only supports boolean masks, but got dtype Float` in a subsequent step. Please kindly tell me how to fix this.
Thank you.
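For anyone hitting the same error: since the original function was deleted, here is a hedged sketch of what a `build_waitk_mask` returning a boolean mask could look like for wait-k attention. The name matches the missing function, but the signature and semantics are assumptions based on the standard wait-k policy (target step i may attend to the first i + k source positions), not DiSeg's actual implementation:

```python
import torch

def build_waitk_mask(tgt_len: int, src_len: int, waitk: int) -> torch.Tensor:
    """Hypothetical wait-k attention mask of shape (tgt_len, src_len).

    True marks source positions that are NOT yet available to the
    decoder at that target step, so the result can be passed directly
    to masked_fill. Returning torch.bool avoids the
    'masked_fill only supports boolean masks' error.
    """
    src_idx = torch.arange(src_len).unsqueeze(0)             # (1, src_len)
    limit = (torch.arange(tgt_len) + waitk).unsqueeze(1)     # (tgt_len, 1)
    return src_idx >= limit                                   # bool mask
```

Because the mask is already boolean, `attn_weights.masked_fill(mask, float("-inf"))` works without any dtype cast.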