1. Araci, D., 2019. FinBERT: Financial Sentiment Analysis with Pre-trained Language Models. https://doi.org/10.48550/arXiv.1908.10063.
2. Brown, T.B., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A., Agarwal, S., Herbert-Voss, A., Krueger, G., Henighan, T., Child, R., Ramesh, A., Ziegler, D.M., Wu, J., Winter, C., Hesse, C., Chen, M., Sigler, E., Litwin, M., Gray, S., Chess, B., Clark, J., Berner, C., McCandlish, S., Radford, A., Sutskever, I., Amodei, D., 2020. Language Models are Few-Shot Learners. https://doi.org/10.48550/arXiv.2005.14165.
3. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K., 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. https://doi.org/10.48550/arXiv.1810.04805.
4. Dowling, M., Lucey, B., 2023. ChatGPT for (finance) research: The Bananarama conjecture. Finance Research Letters 53, 103662.
5. Garcia, D., Garas, A., Schweitzer, F., 2012. Positive words carry less information than negative words. EPJ Data Science 1, 3.