Hello: if a single sample in my data can carry several different labels at the same time, how should I modify the code so that BERT learns this situation?
Thanks
For example, in the TaipeiQA data, "2018臺北藝術節FAQ" is labeled as belonging to both 臺北市政府文化局 and 臺北市政府產業發展局科技產業服務中心 at the same time. How should the model learn from and predict such a sample?
Sorry for taking so long to reply. I haven't actually worked on multiple-choice problems myself, so I may not be able to answer this. Still, you could take a look at https://huggingface.co/transformers/model_doc/bert.html#bertformultiplechoice and try it on a multiple-choice dataset.
This repo uses BertForSequenceClassification, which is better suited to single-label classification.
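For reference, the usual way to adapt a single-label setup like this to multi-label data is to replace softmax cross-entropy with independent per-label sigmoids trained with binary cross-entropy, so the target becomes a multi-hot vector. A minimal dependency-free sketch (the label order and logit values here are hypothetical, based on the TaipeiQA example above):

```python
import math

def sigmoid(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def multi_label_bce(logits, targets):
    """Binary cross-entropy averaged over independent per-label sigmoids.

    `targets` is multi-hot: 1.0 for every label that applies, so one
    sample may be positive for several labels at once.
    """
    loss = 0.0
    for z, t in zip(logits, targets):
        p = sigmoid(z)
        loss += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return loss / len(logits)

# "2018臺北藝術節FAQ" belongs to two agencies at once (multi-hot target).
# Label order (hypothetical): 文化局, 產業發展局科技產業服務中心, other.
targets = [1.0, 1.0, 0.0]
logits = [2.0, 1.5, -2.0]  # hypothetical model outputs
loss = multi_label_bce(logits, targets)
```

In PyTorch this corresponds to swapping `CrossEntropyLoss` for `BCEWithLogitsLoss` and feeding float multi-hot targets instead of class indices.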
In another, rather special case I used a threshold on the logits to achieve a similar effect. It seemed to work to some extent, but I never validated it more rigorously.
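The thresholding idea above can be sketched as follows: pass each logit through a sigmoid and keep every label whose score clears the threshold, which naturally allows zero, one, or several predicted labels per sample. The label names, logit values, and threshold below are made up for illustration:

```python
import math

def sigmoid(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, labels, threshold=0.5):
    """Return every label whose sigmoid score reaches the threshold."""
    return [name for name, z in zip(labels, logits)
            if sigmoid(z) >= threshold]

labels = ["文化局", "產業發展局", "教育局"]  # hypothetical label set
logits = [2.3, 1.1, -1.8]                    # hypothetical model outputs
predicted = predict_labels(logits, labels)
# Both labels with positive logits clear the 0.5 threshold.
```

The threshold itself is a tuning knob: it can be picked per label on a validation set rather than fixed globally at 0.5.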