Mauricio E Reyes, Ivan V Meza and Luis A Pineda (2019). Robotic facial expression of anger in collaborative human-robot interaction. Int J Adv Robot Syst, January-February 2019:1-12. DOI: 10.1177/1729881418817972. ISSN: 1729-8806; eISSN: 1729-8814.

  1. Mikhail Gorkavyy, Yuri Ivanov, Sergey Sukhorukov, Sergey Zhiganov, Maksim Melnichenko, Alexander Gorkavyy and Daniil Grabar. Improving Collaborative Robotic Complex Efficiency: An Approach to the Intellectualization of the Control System. 15th International Conference “Intelligent Systems”, pp. 135-142. https://dlib.hust.edu.vn/bitstream/HUST/25279/1/OER000003291.pdf#page=136
  2. Ho, A.G. (2025). Cross-Cultural Differences in Emotional Response to Visual Information. Springer Series in Design and Innovation, 49, 65-75, https://doi.org/10.1007/978-3-031-73705-3_5
  3. Viviane Herdel, Yisrael Parmet, and Jessica R. Cauchard. 2025. Exploring the Effects of Emotion Appropriateness on User Perception: A Delivery Drone Case Study. In Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction (HRI '25). IEEE Press, 747-756.
  4. Shu L, Barradas VR, Qin Z and Koike Y (2025). Facial expression recognition through muscle synergies and estimation of facial keypoint displacements through a skin-musculoskeletal model using facial sEMG signals. Front. Bioeng. Biotechnol. 13:1490919. doi: 10.3389/fbioe.2025.1490919
  5. Kim, HN (2024). Recognizing facially expressed emotions in videos of people with visual impairments in online settings. Technology and Disability, https://doi.org/10.3233/TAD-230040
  6. Abawi, Fares, Multimodal Social Cue Integration for Attention Modeling and Robot Gaze Control, Doctoral Dissertation, Universität Hamburg, Faculty of Mathematics, Informatics and Natural Sciences, Department of Informatics, 2024.
  7. F. Vigni, The Unscripted Encounter: Social Cues for Spontaneous Robotics Interactions, Doctoral Thesis, Ph.D. Program in Information and Communication Technology for Health, Università degli Studi di Napoli Federico II, 2024. https://icth.dieti.unina.it/images/Thesis/XXXVII/Vigni_Francesco_THESIS.pdf
  8. Ottoni, L.T.C., Cerqueira, J.d.J.F. A Systematic Review of Human-Robot Interaction: The Use of Emotions and the Evaluation of Their Performance. Int J of Soc Robotics, 16(11), 2169-2188 (2024). https://doi.org/10.1007/s12369-024-01178-2
  9. Keisuke Nishiwaki, Dražen Brščić, Takayuki Kanda. Expressing Anger with Robot for Tackling the Onset of Robot Abuse. ACM Transactions on Human-Robot Interaction, 14(1), 2024. DOI: 10.1145/3696467
  10. Herdel, Viviane and Cauchard, Jessica R. (2024). Crafting for Emotion Appropriateness in Affective Robotics: Examining the Practicality of the OCC Model. Proceedings of the ACM on Human-Computer Interaction, Volume 8, Issue MHCI, Article No. 248, pp. 1-19. https://doi.org/10.1145/367649
  11. Agnese Salutari, Harmonizing Users and Systems Requirement in Complex and Resource Intensive Application Domains by Distributed Hybrid Approach, Doctoral thesis, Università degli Studi dell'Aquila, Dottorato di Ricerca in Information and Communication Technologies (ICT), 2024.
  12. Gorkavyy, M., Ivanov, Y., Sukhorukov, S., Zhiganov, S., Melnichenko, M., Gorkavyy, A., & Grabar, D. (2023). Improving Collaborative Robotic Complex Efficiency: An Approach to the Intellectualization of the Control System. Engineering Proceedings, 33(1), 18, https://doi.org/10.3390/engproc2023033018
  13. Huy Quyen Ngo (2024). Human Perception of Robot Failure and Explanation. MSc. Thesis. School of Computer Science, The Robotics Institute, Carnegie Mellon University. https://www.ri.cmu.edu/app/uploads/2024/05/HuyQuyenNgo_MSR_Thesis.pdf
  14. Malik Haris, Muhammad Shahid Mastoi (2024). Empowering Communication: Utilizing Facial Expressions to Classify Interruptibility in the Workplace, Research Square, https://doi.org/10.21203/rs.3.rs-4366311/v1.
  15. Liana Linvik (2024). Designing a social robot to evoke climate hope among youth via social media. M.Sc. Thesis, Tampere University, Master's Degree Programme in Human-Technology Interaction, May 2024. https://trepo.tuni.fi/bitstream/handle/10024/155876/LinvikLiana.pdf?sequence=2
  16. Martina Gassen (2024). Gait Embeddings. Humanoid Robotics Seminar, IAS, TU Darmstadt. https://www.ias.informatik.tu-darmstadt.de/uploads/Teaching/HumanoidRoboticsSeminar/hr_martina_gassen.pdf

  1. Ezra Tsur, E.; Elkana, O. Intelligent Robotics in Pediatric Cooperative Neurorehabilitation: A Review. Robotics 2024, 13(3), https://doi.org/10.3390/robotics13030049
  2. Herdel, V., Cauchard, J.R. Emotion Appropriateness in Human-Drone Interaction. Int J of Soc Robotics 16(3), 579-597 (2024). https://doi.org/10.1007/s12369-023-01094-x
  3. Dorofeev, N. V., Grecheneva, A. V., & Podmasteryev, K. V. (2023, November). Research of gait changes in a personalized control system. In AIP Conference Proceedings (Vol. 2948, No. 1). AIP Publishing.
  4. Sakamoto, Y., Herath, A., Vuradi, T., Sallam, S., Gomez, R., & Irani, P. (2023, December). How Should a Social Robot Deliver Negative Feedback Without Creating Distance Between the Robot and Child Users? In Proceedings of the 11th International Conference on Human-Agent Interaction (pp. 325-334). https://doi.org/10.1145/3623809.3623882
  5. Hyung Nam Kim (2024) Sighted People Recognizing Emotions in Facial Expression Images of People With Visual Disabilities Via Cyberspace, International Journal of Human-Computer Interaction, DOI: 10.1080/10447318.2023.2285639
  6. Wang, T., Liu, L., Yang, L., & Yue, W. (2023). Creating the optimal design approach of facial expression for the elderly intelligent service robot. Journal of Advanced Mechanical Design, Systems and Manufacturing, 17(5), https://doi.org/10.1299/jamdsm.2023jamdsm0061
  7. Koike, A, & Mutlu, B (2023). Exploring the Design Space of Extra-Linguistic Expression for Robots. Proceedings of the 2023 ACM Designing Interactive Systems Conference, p. 2689-2706. https://doi.org/10.1145/3563657.3595968
  8. Koike, Amy & Mutlu, Bilge. (2023). Exploring the Design Space of Extra-Linguistic Expression for Robots, arXiv:2306.15828v1.
  9. Tetsuya Matsui. (2023). Emotional gradients of characters in digital games and their impressions on users: the case of “THE IDOLM@STER SHINY COLORS”, Research Square, https://doi.org/10.21203/rs.3.rs-3070299/v1.
  10. Gorkavyy, M.; Ivanov, Y.; Sukhorukov, S.; Zhiganov, S.; Melnichenko, M.; Gorkavyy, A.; Grabar, D. Improving Collaborative Robotic Complex Efficiency: An Approach to the Intellectualization of the Control System. Eng. Proc. 2023, 33, 18. https://doi.org/10.3390/engproc2023033018
  11. Fu, D., Abawi, F., & Wermter, S. (2023). The Robot in the Room: Influence of Robot Facial Expressions and Gaze on Human-Human-Robot Collaboration. IEEE International Workshop on Robot and Human Communication, RO-MAN, 85-91, https://doi.org/10.1109/RO-MAN57019.2023.10309334
  12. Fu, D., Abawi, F., & Wermter, S. (2023). The Robot in the Room: Influence of Robot Facial Expressions and Gaze on Human-Human-Robot Collaboration. arXiv preprint arXiv:2303.14285.
  13. Vigni, F., Rossi, A., Miccio, L., & Rossi, S. (2023, February). On the Emotional Transparency of a Non-humanoid Social Robot. In Social Robotics: 14th International Conference, ICSR 2022, Florence, Italy, December 13-16, 2022, Proceedings, Part I (pp. 290-299). Cham: Springer Nature Switzerland.
  14. Vigni, F., Rossi, A., Miccio, L., & Rossi, S. (2022). On the Emotional Transparency of a Non-humanoid Social Robot. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 13817, 290-299, https://doi.org/10.1007/978-3-031-24667-8_26
  15. Oravec, J.A. (2022). Robo-Rage Against the Machine: Abuse, Sabotage, and Bullying of Robots and Autonomous Vehicles. In: Good Robot, Bad Robot. Social and Cultural Studies of Robots and AI. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-14013-6_8
  16. Maculotti, G., Ulrich, L., Olivetti, E.C., Genta, G., Marcolin, F., Vezzetti, E., & Galetto, M. (2022). A methodology for task-specific metrological characterization of low-cost 3D camera for face analysis. Measurement: Journal of the International Measurement Confederation, 200, https://doi.org/10.1016/j.measurement.2022.111643
  17. Ramadhan, A.D., Usman, K., Pratiwi, N.K.C. (2022). Comparative Analysis of Various Optimizers on Residual Network Architecture for Facial Expression Identification. In: Triwiyanto, T., Rizal, A., Caesarendra, W. (eds) Proceedings of the 2nd International Conference on Electronics, Biomedical Engineering, and Health Informatics. Lecture Notes in Electrical Engineering, vol 898. Springer, Singapore. https://doi.org/10.1007/978-981-19-1804-9_22
  18. Manuela Pollak, Andrea Salfinger and Karin Anna Hummel. Teaching Drones on the Fly: Can Emotional Feedback Serve as Learning Signal for Training Artificial Agents? arXiv:2202.09634, 2022.
  19. Gorkavyy, M.A., Gorkavyy, A.I., & ... (2022). Specifics of the digital twin architecture of a collaborative robotic process based on multi-agent systems [Специфика архитектуры цифрового двойника коллаборативного роботизированного процесса на базе мультиагентных систем]. Izvestiya Tulskogo Gosudarstvennogo Universiteta (Proceedings of Tula State University) ..., cyberleninka.ru, https://cyberleninka.ru/article/n/spetsifika-arhitektury-tsifrovogo-dvoynika-kollaborativnogo-robotizirovannogo-protsessa-na-baze-multiagentnyh-sistem
  20. Gorkavyy, M.A., Gorkavyy, A.I., Solovyev, V.A., & ... Izvestiya Tulskogo Gosudarstvennogo Universiteta. Tekhnicheskie Nauki (Proceedings of Tula State University. Technical Sciences), elibrary.ru, https://elibrary.ru/item.asp?id=48663420
  21. L. T. Cordeiro Ottoni and J. de Jesus Fiais Cerqueira, "A Review of Emotions in Human-Robot Interaction," 2021 Latin American Robotics Symposium (LARS), 2021 Brazilian Symposium on Robotics (SBR), and 2021 Workshop on Robotics in Education (WRE), 2021, pp. 7-12, doi: 10.1109/LARS/SBR/WRE54079.2021.9605479.
  22. V. Herdel, A. Kuzminykh, A. Hildebrandt, J. R. Cauchard. Drone in Love: Emotional Perception of Facial Expressions on Flying Robots. CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, May 2021, Article No. 716, pp. 1-20. https://doi.org/10.1145/3411764.3445495
  23. CA Ajibo, CT Ishi, H Ishiguro. Advocating attitudinal change through android robot's intention-based behaviors: Towards WHO COVID-19 guiding adherence. IEEE Robotics and Automation Letters, 2021.
  24. Stock-Homburg, R. Survey of Emotions in Human-Robot Interactions: Perspectives from Robotic Psychology on 20 Years of Research. Int J of Soc Robotics (2021). https://doi.org/10.1007/s12369-021-00778-6
  25. Philipp Wicke and Tony Veale (2021). Creative Action at a Distance: A Conceptual Framework for Embodied Performance With Robotic Actors. Frontiers in Robotics and AI, 8, pp. 1-15. https://www.frontiersin.org/article/10.3389/frobt.2021.662182, DOI: 10.3389/frobt.2021.662182, ISSN: 2296-9144
  26. Oidekivi, M., Nolte, A., Aabloo, A., & Kruusamäe, K. (2021). Identifying emotions from facial expression displays of robots - Results from a survey study. International Conference on Human System Interaction, HSI, 2021-July. https://doi.org/10.1109/HSI52170.2021.9538774
  27. Chesher, C., & Andreallo, F. (2021). Robotic Faciality: The Philosophy, Science and Art of Robot Faces. International Journal of Social Robotics, 13(1, SI), 83-96. https://doi.org/10.1007/s12369-020-00623-2
  28. Place, C. J. (2021). Sociolinguistic Impacts of Reactance in Law Enforcement Investigative Interviews: A Systematic Literature Review. University of Arizona Global Campus.
  29. Chinenye Augustine Ajibo, Carlos Toshinori Ishi, Ryusuke Mikata, Chaoran Liu & Hiroshi Ishiguro (2020) Analysis of body gestures in anger expression and evaluation in android robot, Advanced Robotics, 34:24, 1581-1590, DOI: 10.1080/01691864.2020.1855244
  30. Chesher, C., & Andreallo, F. (2020). Robotic Faciality: The Philosophy, Science and Art of Robot Faces. International Journal of Social Robotics. https://doi.org/10.1007/s12369-020-00623-2
  31. El Zaatari, S., Marei, M., Li, W., & Usman, Z. (2019). Cobot programming for collaborative industrial tasks: An overview. Robotics and Autonomous Systems, 116, 162-180. https://doi.org/10.1016/j.robot.2019.03.003
  32. C. A. Ajibo, C. T. Ishi, R. Mikata, C. Liu, and H. Ishiguro, “Analysis of body gestures in anger expression and evaluation in android robot,” Adv. Robot., 34(24), pp. 1581-1590, 2020. https://doi.org/10.1080/01691864.2020.1855244
  33. G. Palestra and O. Pino, “Detecting emotions during a memory training assisted by a social robot for individuals with Mild Cognitive Impairment (MCI),” Multimed. Tools Appl., vol. 79, no. 47-48, pp. 35829-35844, 2020.
  34. “Expressions of Emotion in Smart-doll Eyes using a Micro Display,” IEIE Transactions on Smart Processing & Computing, 9(3), 2020. https://doi.org/10.5573/IEIESPC.2020.9.3.193



Citations “B”


  1. Castillo, MER, Murillo, AJF, & ... (2022). Consideraciones en el diseño de robots para la atención médica en el mundo post COVID-19 [Considerations in the design of robots for healthcare in the post-COVID-19 world]. Cultura Científica y Tecnológica, erevistas.uacj.mx, http://erevistas.uacj.mx/ojs/index.php/culcyt/article/view/4571
  2. Fonseca, A, Mendoza, CRC, & ... (2022). Consideraciones en el diseño de robots para la atención médica en el mundo post COVID-19 [Considerations in the design of robots for healthcare in the post-COVID-19 world]. …: Cultura Científica y …, dialnet.unirioja.es, https://dialnet.unirioja.es/servlet/articulo?codigo=8426099