Emotional Attitudes towards the Components of the Digital Environment (Based on the Text Analysis of Network Comments)

One of the psychological effects of digitalization is the establishment of specific relationships between a person and the cyber environment and its components. The paper presents the results of a study of the emotional component of attitudes towards components of the digital environment, carried out using emotive-predicate analysis, a new method of computer text processing implemented in TITANIS, an advanced social media text analysis tool. The method automatically extracts from texts descriptions of emotional situations in which components of the digital environment figure as the cause or the subject of any of 68 emotional states. The comments on 2048 online videos posted in the Russian-language segment of YouTube served as the material for the analysis. Judging by their frequency of occurrence, emotional situations involving various components of the digital environment are quite typical even of online discussions that are not devoted to digital topics. The components of the digital environment mentioned in such non-thematic discussions as participants in emotional situations fall into three groups: (1) general concepts of digital technologies; (2) digital devices; (3) activities mediated by digital technologies. Lexemes of the third group, denoting various aspects of network communication, appear in the vast majority of descriptions of emotional situations involving components of the digital environment, and six times more often as causes of emotions than as subjects of emotional states. Overall, the emotional attitude towards components of the cyber environment is balanced, with no noticeable predominance of negative or positive emotions; however, negative states are attributed to components of the cyber environment more often as subjects than as causes of emotions.
The practical significance of the described method of text analysis as a means of assessing the emotional component of attitudes towards components of the digital environment is determined by the influence that users' affective reactions have on the demand for technical innovations and on the direction of their development.
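The role distinction at the core of the method (a digital-environment component acting as the *subject* of an emotional state versus its *cause*) can be illustrated with a minimal sketch. This is not the TITANIS implementation: the mini-lexicons, the `classify_roles` function, and the pre-parsed (predicate, subject, cause) triples are all hypothetical stand-ins; the actual method covers 68 emotional states and works over full linguistic analyses of Russian-language text.

```python
# Illustrative sketch: counting how often digital-environment lexemes fill the
# subject (experiencer) role vs. the cause role of an emotion predicate.
from collections import Counter

# Hypothetical mini-lexicons (the study used 68 emotional states and a far
# richer vocabulary of digital-environment components).
EMOTION_PREDICATES = {"annoy", "delight", "fear", "enjoy"}
DIGITAL_LEXEMES = {"internet", "smartphone", "chat", "comment", "network"}

def classify_roles(triples):
    """For pre-parsed (predicate, subject, cause) triples, count how often a
    digital-environment lexeme appears as the subject vs. the cause of an
    emotion predicate."""
    counts = Counter()
    for predicate, subject, cause in triples:
        if predicate not in EMOTION_PREDICATES:
            continue  # not an emotional situation
        if subject in DIGITAL_LEXEMES:
            counts["subject"] += 1
        if cause in DIGITAL_LEXEMES:
            counts["cause"] += 1
    return counts

triples = [
    ("annoy", "user", "comment"),   # a comment causes annoyance
    ("enjoy", "user", "chat"),      # chatting causes enjoyment
    ("fear", "network", "outage"),  # the network is the subject of fear
]
print(classify_roles(triples))  # Counter({'cause': 2, 'subject': 1})
```

Aggregated over a corpus, such counts yield the kind of cause/subject ratios reported above (e.g., network-communication lexemes occurring six times more often as causes than as subjects).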

About the authors

Yulia M. Kuznetsova

Federal Research Center for Computer Science and Control of the Russian Academy of Sciences

Author for correspondence.
Email: kuzjum@yandex.ru
ORCID iD: 0000-0001-9380-4478

Ph.D. in Psychology, Senior Researcher

9 Prospekt 60-Letiya Oktyabrya, Moscow, 117312, Russian Federation



Copyright (c) 2022 Kuznetsova Y.M.

This work is licensed under a Creative Commons Attribution 4.0 International License.
