Exploring Transformers in Emotion Recognition: A Comparison of BERT, DistilBERT, RoBERTa, XLNet and ELECTRA

Diogo Cortiz · CCRIS'22: 2022 3rd International Conference on Control, Robotics and Intelligent System · 2021

This paper investigates how Natural Language Understanding (NLU) can be applied to Emotion Recognition, a specific task in affective computing. We fine-tuned several transformer language models (BERT, DistilBERT, RoBERTa, XLNet, and ELECTRA) on a fine-grained emotion dataset and evaluated them in terms of performance (F1-score) and time to complete.
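The two evaluation axes mentioned above, F1-score and completion time, can be sketched in pure Python. This is a minimal illustration, not the authors' code: the emotion labels and predictions below are hypothetical stand-ins for a fine-tuned model's outputs.

```python
import time

def macro_f1(y_true, y_pred, labels):
    """Unweighted mean of per-class F1 scores (macro F1)."""
    f1s = []
    for label in labels:
        tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
        fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
        fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        f1s.append(f1)
    return sum(f1s) / len(f1s)

# Hypothetical fine-grained emotion labels (the paper's dataset has many more).
labels = ["joy", "sadness", "anger"]
y_true = ["joy", "joy", "sadness", "anger"]

start = time.perf_counter()                      # wall-clock timing, as one
y_pred = ["joy", "sadness", "sadness", "anger"]  # way to measure "time to
elapsed = time.perf_counter() - start            # complete"; model call elided

print(round(macro_f1(y_true, y_pred, labels), 4))  # → 0.7778
```

In practice one would use `sklearn.metrics.f1_score(..., average="macro")`, but the hand-rolled version makes the per-class averaging explicit.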
