A hybrid model for enhanced aspect-based sentiment analysis using large language models

International Journal of Artificial Intelligence

Abstract

Aspect-based sentiment analysis (ABSA) is a crucial task in natural language processing (NLP), enabling fine-grained opinion mining by identifying the sentiment associated with specific aspects of a product or service. While transformer-based models such as bidirectional encoder representations from transformers (BERT) have improved sentiment classification, they still struggle with limited contextual adaptability, especially in customer reviews containing complex expressions. Most existing approaches also rely heavily on benchmark datasets such as semantic evaluation (SemEval) and multi-aspect multi-sentiment (MAMS), which do not fully capture the diversity of real-world review scenarios. This research addresses these limitations by proposing a novel hybrid model, called hybrid-BERT (H-BERT), that integrates span-aware BERT (SpanBERT) with a bidirectional long short-term memory (BiLSTM) network, a conditional random field (CRF), and large language models (LLMs). The objective is to enhance aspect extraction and sentiment classification performance using both annotated and synthetic data. The methodology comprises preprocessing, hybrid model training, and evaluation on the SemEval 2014 dataset. Experimental results show that H-BERT achieved 90.58% accuracy and a 90.56% F-score in the laptop domain, and 91.21% accuracy and a 92.03% F-score in the restaurant domain, outperforming existing models and confirming H-BERT's robustness and effectiveness. In conclusion, H-BERT improves sentiment understanding in customer reviews.
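To make the aspect-extraction stage of such a pipeline concrete, the sketch below shows Viterbi decoding over a CRF with a BIO tag scheme, the standard way a CRF layer turns per-token scores into aspect spans. This is an illustrative reconstruction, not the paper's implementation: the tag set, the `viterbi_decode` helper, and the toy emission and transition scores are all assumptions; in H-BERT the emissions would come from the SpanBERT+BiLSTM encoder.

```python
# Illustrative Viterbi decoding for a CRF layer tagging aspect spans
# with the BIO scheme. Emission scores are toy values standing in for
# the SpanBERT+BiLSTM encoder outputs (hypothetical, for illustration).

TAGS = ["O", "B-ASP", "I-ASP"]

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions:   list of {tag: score} dicts, one per token.
    transitions: {(prev_tag, cur_tag): score}; missing pairs score 0.
    """
    # Best score reachable for each tag at the current position.
    score = {t: emissions[0][t] for t in TAGS}
    backpointers = []
    for emit in emissions[1:]:
        new_score, ptr = {}, {}
        for cur in TAGS:
            # Pick the previous tag maximizing score + transition.
            best = max(TAGS, key=lambda p: score[p] + transitions.get((p, cur), 0.0))
            ptr[cur] = best
            new_score[cur] = (score[best]
                              + transitions.get((best, cur), 0.0)
                              + emit[cur])
        score = new_score
        backpointers.append(ptr)
    # Backtrack from the best final tag.
    path = [max(TAGS, key=score.get)]
    for ptr in reversed(backpointers):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Toy example: "the battery life ..." -> "battery life" is the aspect.
emissions = [
    {"O": 2.0, "B-ASP": 0.0, "I-ASP": 0.0},  # "the"
    {"O": 0.0, "B-ASP": 2.0, "I-ASP": 1.0},  # "battery"
    {"O": 0.0, "B-ASP": 0.0, "I-ASP": 2.0},  # "life"
]
# Penalize the illegal O -> I-ASP transition so spans stay well-formed.
transitions = {("O", "I-ASP"): -5.0}

print(viterbi_decode(emissions, transitions))  # -> ['O', 'B-ASP', 'I-ASP']
```

The transition table is what distinguishes the CRF from independent per-token classification: even when a token's emission favors `I-ASP`, the decoder rejects it unless a `B-ASP` precedes it, which is why CRF layers are commonly paired with BiLSTM encoders for span extraction.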
