Authors
Konstantinos Kyritsis, Isidoros Perikos, Michael Paraskevas
Identifier
DOI:10.1109/bcd57833.2023.10466289
Abstract
The BART model is an advanced transformer architecture introduced by Facebook. It combines elements of both the BERT and GPT transformers, enabling significant advances in language understanding and general language processing. Using both encoder and decoder components, BART is versatile across tasks including translation, text completion, automatic sentence generation, entity recognition, sentiment analysis, and more. In this study, we focus on pretrained models, BART and a distilled variant called DistilBART, in the context of Zero-Shot Text Classification. In the experimental study we examine the Zero-Shot technique applied to several pretrained transformers. Our analysis demonstrates that, depending on the model used, F1 scores of up to 88% can be achieved, showcasing the models' effectiveness in discerning classes for this sentiment analysis problem using the Zero-Shot Text Classification technique.
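The zero-shot technique the abstract refers to is commonly implemented by recasting classification as natural language inference: each candidate label is wrapped in a hypothesis template such as "This example is {label}.", and the NLI model's entailment score for the (text, hypothesis) pair becomes the label score. Below is a minimal sketch of that scoring step. The `entailment_logit` stub is a toy keyword heuristic assumed for illustration only; in practice the logit would come from an NLI-tuned model such as `facebook/bart-large-mnli` via the Hugging Face `zero-shot-classification` pipeline.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def entailment_logit(premise, hypothesis):
    # Stand-in for an NLI model's entailment logit (assumption: a real
    # implementation would run the pair through e.g. BART-large-MNLI).
    # This toy heuristic just counts sentiment keywords in the premise.
    keywords = {"positive": {"great", "love", "excellent"},
                "negative": {"bad", "hate", "terrible"}}
    label = hypothesis.rsplit(" ", 1)[-1].rstrip(".")
    words = premise.lower().split()
    return float(sum(w in words for w in keywords.get(label, set())))

def zero_shot_classify(text, labels, template="This example is {}."):
    # Score each candidate label via its hypothesis, then normalize
    # across labels so the scores form a distribution.
    logits = [entailment_logit(text, template.format(lb)) for lb in labels]
    return dict(zip(labels, softmax(logits)))

scores = zero_shot_classify("I love this movie it was great",
                            ["positive", "negative"])
print(max(scores, key=scores.get))  # prints "positive"
```

The key design point is that the label set is supplied only at inference time, which is what makes the approach "zero-shot": no task-specific fine-tuning on labeled sentiment data is required.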