Affiliation:
1. Faculty of Islamic Technology, Universiti Islam Sultan Sharif Ali, Brunei
2. School of Computer Science, Taylor's University, Malaysia
Abstract
AI systems are integral to the development of smart cities, but their complexity can render decision-making processes opaque, raising concerns about accountability and transparency. Explainable AI (XAI) addresses this by designing algorithms whose decisions can be explained in terms humans can understand. In smart cities, XAI can increase transparency and accountability, foster trust between residents and officials, and improve the adoption and acceptance of smart city technologies. Challenges remain, however, and continued research is needed to fully realize the potential benefits of XAI in this domain.
Cited by: 1 article.
1. Future Trends and Challenges in Cybersecurity and Generative AI;Advances in Information Security, Privacy, and Ethics;2024-07-26