Replies: 4 comments
-
The binary attention bias component is implemented here: https://github.com/SalesforceAIResearch/uni2ts/blob/fac7a937cba55a318fba6897e27a748d2403d07d/src/uni2ts/module/position/attn_bias.py#L67 and is inserted into MoiraiModule here: https://github.com/SalesforceAIResearch/uni2ts/blob/fac7a937cba55a318fba6897e27a748d2403d07d/src/uni2ts/model/moirai/module.py#L101
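For readers of this thread, here is a minimal conceptual sketch of what a binary attention bias does: one learnable scalar per attention head is added to the attention scores when the query and key tokens belong to the same variate, and a different scalar when they do not. This is not the uni2ts implementation in `attn_bias.py`; the class name, tensor shapes, and variate-id arguments below are illustrative assumptions.

```python
# Sketch only: a per-head binary attention bias keyed on "same variate vs. different variate".
import torch
import torch.nn as nn


class BinaryAttentionBiasSketch(nn.Module):
    def __init__(self, num_heads: int):
        super().__init__()
        # Two learnable biases per head: index 0 = different variate, index 1 = same variate.
        self.emb = nn.Embedding(num_embeddings=2, embedding_dim=num_heads)

    def forward(self, query_var_id: torch.Tensor, kv_var_id: torch.Tensor) -> torch.Tensor:
        # query_var_id: (batch, q_len), kv_var_id: (batch, kv_len) — integer variate indices.
        same = query_var_id.unsqueeze(-1).eq(kv_var_id.unsqueeze(-2))  # (batch, q_len, kv_len)
        bias = self.emb(same.long())                                   # (batch, q_len, kv_len, heads)
        return bias.permute(0, 3, 1, 2)                                # (batch, heads, q_len, kv_len)


if __name__ == "__main__":
    bias_mod = BinaryAttentionBiasSketch(num_heads=4)
    var_id = torch.tensor([[0, 0, 1, 1]])   # two variates, two time steps each, flattened
    attn_bias = bias_mod(var_id, var_id)    # added to pre-softmax attention scores
    print(attn_bias.shape)                  # torch.Size([1, 4, 4, 4])
```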
-
Thank you very much for your reply. Just to make sure I understand: is the "binary attention bias component" mentioned in the email equivalent to the "any-variate attention mechanism" in the paper?
-
Hi @Jiaulo, yes, please refer to Section 3.1.2 of the paper to see how the binary attention bias relates to the any-variate attention mechanism. It is the key part of the mechanism that enables the any-variate property.
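To illustrate where such a bias enters the attention computation, here is a hedged sketch of scaled dot-product attention with an additive bias applied before the softmax. This is not the MoiraiModule or transformer code in uni2ts; the function name and shapes are assumptions for illustration.

```python
# Sketch only: additive attention bias applied to the pre-softmax scores.
import torch


def attention_with_bias(q, k, v, attn_bias):
    # q, k, v: (batch, heads, seq_len, head_dim); attn_bias: (batch, heads, seq_len, seq_len)
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    scores = scores + attn_bias              # variate-aware bias added before softmax
    weights = torch.softmax(scores, dim=-1)
    return weights @ v


q = k = v = torch.randn(1, 4, 4, 8)
bias = torch.zeros(1, 4, 4, 4)               # e.g. output of a binary attention bias module
out = attention_with_bias(q, k, v, bias)     # (1, 4, 4, 8)
```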
-
Thank you very much for your reply. Since I am a student currently studying uni2ts, I may reach out to you by email if I run into difficulties later. Thanks again!
-
In which file is the any-variate attention mechanism mentioned in the paper implemented?