Abstract
Chapter I: Overview of the topic; Chapter II: Theoretical foundations; Chapter III: Proposed text summarization model; Chapter IV: Experimental implementation and evaluation; Chapter V: Conclusions and future development directions.
Keywords
Summarization model, Single document, English, Extractive
Publisher
Trường Đại Học Kinh Tế Quốc Dân (National Economics University)
03.20.00019.pdf
Size: 2.13 MB
Format: Adobe PDF