Unveiling Public Opinion: A BERT-based Approach to Sentiment Analysis of Movie Comments

  • Andi Aljabar Mudding, Universitas Nahdlatul Ulama Indonesia
Keywords: BERT, NLP, Transformer

Abstract

BERT (Bidirectional Encoder Representations from Transformers) is a revolutionary language model in natural language processing that relies on an encoder to produce contextual representations of input text. Through tokenization, embeddings, and attention mechanisms in every transformer layer, BERT enables a deep, bidirectional understanding of the relationships between words. BERT's distinctive strength lies in its ability to process context from both directions, yielding vector representations rich in meaning. The model pioneered transfer learning in NLP, allowing general-purpose representations learned during training to be reused for specialized downstream tasks. BERT has thus shifted the paradigm of natural language processing, opening the door to more sophisticated applications such as text classification, sentiment analysis, and more contextual language understanding.
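As an illustration (not code from the paper), the attention mechanism the abstract refers to can be sketched as scaled dot-product self-attention, the core operation applied in every BERT transformer layer. The toy dimensions (4 tokens, 8-dimensional embeddings) are assumptions for demonstration only; BERT-base uses 768-dimensional hidden states and multiple attention heads.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over all tokens
    return weights @ V, weights

# Toy example: 4 "tokens" with 8-dimensional embeddings.
# Using the same matrix for Q, K, and V (self-attention); each token
# attends to every other token, which is what makes the context bidirectional.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(x, x, x)
```

Because the softmax is taken over all token positions at once, each output row mixes information from the entire sequence, left and right context alike; stacking such layers is what produces the bidirectional representations described above.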

References

Alaparthi, S., & Mishra, M. (n.d.). Bidirectional Encoder Representations from Transformers (BERT): A sentiment analysis odyssey.

Boukabous, M., & Azizi, M. (2022). Crime prediction using a hybrid sentiment analysis approach based on the bidirectional encoder representations from transformers. Indonesian Journal of Electrical Engineering and Computer Science, 25(2), 1131–1139. https://doi.org/10.11591/ijeecs.v25.i2.pp1131-1139

Chiorrini, A., Diamantini, C., Mircoli, A., & Potena, D. (2021). Emotion and sentiment analysis of tweets using BERT.

Deepa, M. D., & Tamilarasi, A. (2021). Bidirectional Encoder Representations from Transformers (BERT) Language Model for Sentiment Analysis task: Review. In Turkish Journal of Computer and Mathematics Education (Vol. 12, Issue 7).

Delbrouck, J.-B., Tits, N., Brousmiche, M., & Dupont, S. (2020). A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis. https://doi.org/10.18653/v1/2020.challengehml-1.1

Geni, L., Yulianti, E., & Sensuse, D. I. (2023). Sentiment Analysis of Tweets Before the 2024 Elections in Indonesia Using IndoBERT Language Models. Jurnal Ilmiah Teknik Elektro Komputer Dan Informatika (JITEKI), 9(3), 746–757. https://doi.org/10.26555/jiteki.v9i3.26490

Hutama, L. B., & Suhartono, D. (2022). Indonesian Hoax News Classification with Multilingual Transformer Model and BERTopic. Informatica (Slovenia), 46(8), 81–90. https://doi.org/10.31449/inf.v46i8.4336

Karayiğit, H., Akdagli, A., & Acı, Ç. İ. (2022). BERT-based Transfer Learning Model for COVID-19 Sentiment Analysis on Turkish Instagram Comments. Information Technology and Control, 51(3), 409–428. https://doi.org/10.5755/j01.itc.51.3.30276

Khan, L., Amjad, A., Ashraf, N., & Chang, H. T. (2022). Multi-class sentiment analysis of urdu text using multilingual BERT. Scientific Reports, 12(1). https://doi.org/10.1038/s41598-022-09381-9

Palani, S., Rajagopal, P., & Pancholi, S. (2021). T-BERT -- Model for Sentiment Analysis of Micro-blogs Integrating Topic Model and BERT. http://arxiv.org/abs/2106.01097

Tabinda Kokab, S., Asghar, S., & Naz, S. (2022). Transformer-based deep learning models for the sentiment analysis of social media data. Array, 14. https://doi.org/10.1016/j.array.2022.100157

Tesfagergish, S. G., Kapočiūtė-Dzikienė, J., & Damaševičius, R. (2022). Zero-Shot Emotion Detection for Semi-Supervised Sentiment Analysis Using Sentence Transformers and Ensemble Learning. Applied Sciences (Switzerland), 12(17). https://doi.org/10.3390/app12178662

Wang, Z., Wan, Z., & Wan, X. (2020). TransModality: An End2End Fusion Method with Transformer for Multimodal Sentiment Analysis. The Web Conference 2020 - Proceedings of the World Wide Web Conference, WWW 2020, 2514–2520. https://doi.org/10.1145/3366423.3380000

Yuan, Z., Li, W., Xu, H., & Yu, W. (2021). Transformer-based Feature Reconstruction Network for Robust Multimodal Sentiment Analysis. MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia, 4400–4407. https://doi.org/10.1145/3474085.3475585

Yüksel, A. E., Türkmen, Y. A., Özgür, A., & Altınel, A. B. (2019). Turkish tweet classification with transformer encoder. International Conference Recent Advances in Natural Language Processing, RANLP, 2019-September, 1380–1387. https://doi.org/10.26615/978-954-452-056-4_158

Zhang, T., Gong, X., & Chen, C. L. P. (2022). BMT-Net: Broad Multitask Transformer Network for Sentiment Analysis. IEEE Transactions on Cybernetics, 52(7), 6232–6243. https://doi.org/10.1109/TCYB.2021.3050508

Published
2024-01-22