Preprint / Version 1

Qualitative Differences in Communication Between ChatGPT and Siri

Authors

  • Emily Su, The Harker School

DOI:

https://doi.org/10.58445/rars.498

Keywords:

ChatGPT, Siri, AI, Eliza Effect, Computers as Social Actors

Abstract

This research explores how different aspects of chatbot design, such as display, vernacular, and communication style, influence human impressions of AI consciousness. It also examines the Eliza Effect and the Computers as Social Actors theory, along with the reasons humans instinctively treat AI as human. The information collected helps provide insight into the future of chatbots.

The research methods include direct communication with the chatbots ChatGPT and Siri. Additionally, existing surveys and data on how humans perceive AI consciousness, as well as the underlying reasons behind those perceptions, are connected to chatbot design to understand how these bots can continue to be strengthened.

The results offer a broader understanding of communication between humans and AI and of the ways chatbot design influences human perception of AI consciousness. The findings also detail the potential benefits of and doubts about AI, along with the measures necessary to prevent its dangers.

The research aims to uncover the underlying reasons why humans are prone to believing in AI consciousness based on chatbot design, techniques for strengthening chatbot communication with humans, and the implications of these findings for the future of chatbots.

References

Adobe Digital Insights. (2019). State of Voice: Adobe Digital Insights 2019. https://www.slideshare.net/adobe/state-of-voice-assistants-2019?from_action=save

AppleInsider. (2023, September 8). Siri: Features, Shortcuts, Abilities. https://appleinsider.com/inside/siri#:~:text=Siri%20is%20Apple’s%20smart%20assistant,multiple%20voices%20across%20several%20languages

Buçinca, Z., Malaya, M. B., & Gajos, K. Z. (2021). To Trust or to Think. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1–21. https://doi.org/10.1145/3449287

Cai, Z. G., Haslett, D. A., Duan, X., Wang, S., & Pickering, M. J. (2023). Does ChatGPT resemble humans in language use? arXiv preprint arXiv:2303.08014.

Cosmo, L. D. (2022, July 12). Google Engineer Claims AI Chatbot is Sentient: Why That Matters. Scientific American. https://www.scientificamerican.com/article/google-engineer-claims-ai-chatbot-is-sentient-why-that-matters/

Cristea, I. A., Sucala, M., & David, D. (2013). Can you tell the difference? Comparing face-to-face versus computer-based interventions: The "Eliza" effect in psychotherapy. Journal of Cognitive & Behavioral Psychotherapies, 13(2).

Hasan, R., Koles, B., Zaman, M., & Paul, J. (2021). The potential of chatbots in travel and tourism services in the context of social distancing. International Journal of Technology Intelligence and Planning, 13(1), 63-83.

Hill, S., Wan, K., & Beaton, P. (2021, December 10). 81 Funny Things to Ask Siri: The Funniest Questions. Digital Trends. https://www.digitaltrends.com/mobile/funny-questions-to-ask-siri/

Hu, K. (2023, February 2). ChatGPT sets record for fastest-growing user base - analyst note. Reuters. https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/

Hu, T., Xu, A., Liu, Z., You, Q., Guo, Y., Sinha, V., ... & Akkiraju, R. (2018, April). Touch your heart: A tone-aware chatbot for customer care on social media. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-12).

Kim, S. Y., Schmitt, B. H., & Thalmann, N. M. (2019). Eliza in the uncanny valley: Anthropomorphizing consumer robots increases their perceived warmth but decreases liking. Marketing Letters, 30, 1-12.

Kuyucu, A. K. (2023, February 2). What People Ask ChatGPT the Most: Frequent Questions Asked to ChatGPT. Medium. https://medium.com/tech-talk-with-chatgpt/what-people-ask-chatgpt-the-most-fd500eba26e0

Lee, J. R. & Nass, C. I. (2010). Trust in Computers: The Computers-Are-Social-Actors (CASA) Paradigm and Trustworthiness Perception in Human-Computer Communication. In D. Latusek & A. Gerbasi (Eds.), Trust and Technology in a Ubiquitous Modern Environment: Theoretical and Methodological Perspectives (pp. 1-15). IGI Global. https://doi.org/10.4018/978-1-61520-901-9.ch001

McCarthy, L. (2023, June 8). A wellness chatbot is offline after its 'harmful' focus on weight loss. The New York Times.

Palasundram, K., Sharef, N. M., Nasharuddin, N., Kasmiran, K., & Azman, A. (2019). Sequence to sequence model performance for education chatbot. International Journal of Emerging Technologies in Learning (iJET), 14(24), 56-68.

Pradhan, A., & Lazar, A. (2021). Hey Google, do you have a personality? Designing personality and personas for conversational agents. In Proceedings of the 3rd Conference on Conversational User Interfaces (pp. 1-4).

Roose, K. (2023, February 16). A conversation with Bing's chatbot left me deeply unsettled. The New York Times.

Saeidnia, H. (2023). Using ChatGPT as a digital/smart reference robot: How may ChatGPT impact digital reference services? Information Matters, 2(5).

Shewale, R. (2023, August 14). 67 Voice Search Statistics for 2023 (Fresh data). DemandSage. https://www.demandsage.com/voice-search-statistics/

Silberg, J., & Manyika, J. (2019). Notes from the AI frontier: Tackling bias in AI (and in humans). McKinsey Global Institute, 1(6).

Strengers, Y., & Kennedy, J. (2021). The smart wife: Why Siri, Alexa, and other smart home devices need a feminist reboot. MIT Press.

Posted

2023-09-23