# Transformer-based Pre-trained Models for Natural Language Processing

In this project, we provide a comprehensive review and evaluation of several transformer-based pre-trained models. Specifically, we first introduce the background needed for language representation learning and transfer learning, and discuss the structural differences and advantages of these pre-trained models. We then evaluate the models by applying them to different downstream tasks (question answering, summarization, and language modelling). This project aims to serve future learners and the community with an in-depth theoretical analysis and useful benchmark results.
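
As a minimal sketch of how such pre-trained models can be applied to downstream tasks, the snippet below uses the Hugging Face `transformers` library (an assumption; the report may use a different setup) to run question answering and summarization with library-default pre-trained models:

```python
from transformers import pipeline

# Question answering with a pre-trained transformer
# (the library downloads its default QA model if none is specified)
qa = pipeline("question-answering")
result = qa(
    question="What does the project evaluate?",
    context="The project evaluates transformer-based pre-trained models "
            "on downstream NLP tasks such as question answering and summarization.",
)
print(result["answer"])

# Abstractive summarization with a pre-trained sequence-to-sequence model
summarizer = pipeline("summarization")
summary = summarizer(
    "Transformer-based pre-trained models learn general-purpose language "
    "representations from large corpora and can be fine-tuned or applied "
    "directly to downstream tasks.",
    max_length=40,
    min_length=10,
)
print(summary[0]["summary_text"])
```

The exact models, datasets, and evaluation metrics used in the experiments are described in the report.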

See report.pdf for the project report.