Review Article

AIsmosis and the pas de deux of human-AI interaction: Exploring the communicative dance between society and artificial intelligence

Ayse Asli Bozdag 1,2,*
1 Faculty of Business, Istanbul Bilgi University, Istanbul, TÜRKİYE
2 Faculty of Communication, Bahcesehir University, Istanbul, TÜRKİYE
* Corresponding Author
Online Journal of Communication and Media Technologies, 13(4), October 2023, e202340, https://doi.org/10.30935/ojcmt/13414
Published Online: 19 June 2023, Published: 01 October 2023

ABSTRACT

As the global influence of artificial intelligence (AI) in our daily lives and the looming advent of artificial general intelligence (AGI) become increasingly apparent, the need for a sophisticated interpretive framework intensifies. This paper introduces ‘AIsmosis’, a term, akin to the biological process of osmosis, that captures AI’s gradual, nuanced integration into society. AI’s integration dynamics are examined through the lens of three pivotal theories: social construction of technology, technological determinism, and diffusion of innovations. These theories collectively elucidate the sociocultural influences on AI, the potential repercussions of unchecked technological growth, and the factors driving the adoption of novel technologies. Building upon these explorations, the ‘controlled AIsmosis’ conceptual framework emerges, emphasizing ethically conscious development, active stakeholder communication, and democratic dialogue in the context of AI technology adoption. Rooted in communicative action theory, this framework illuminates AI’s transformative impact on society. It calls for a comprehensive evaluation of the systems that steer AI diffusion and their potential impacts, acknowledging the pervasive influence of AI and transcending traditional disciplinary boundaries. This work underscores the need for a multidisciplinary and interdisciplinary approach to investigating the complex AI-society interplay and understanding the ethical and societal consequences of AIsmosis.

CITATION (APA)

Bozdag, A. A. (2023). AIsmosis and the pas de deux of human-AI interaction: Exploring the communicative dance between society and artificial intelligence. Online Journal of Communication and Media Technologies, 13(4), e202340. https://doi.org/10.30935/ojcmt/13414
