PMID- 36277304
OWN - NLM
STAT- PubMed-not-MEDLINE
LR  - 20221125
IS  - 1875-4791 (Print)
IS  - 1875-4805 (Electronic)
IS  - 1875-4791 (Linking)
VI  - 14
IP  - 9
DP  - 2022
TI  - MoTiS Parameters for Expressive Multi-Robot Systems: Relative Motion, Timing, and Spacing.
PG  - 1965-1993
LID - 10.1007/s12369-022-00936-4 [doi]
AB  - Multi-robot systems are moving into human spaces, such as working with people in factories (Bacula et al., in: Companion of the 2020 ACM/IEEE international conference on human-robot interaction, pp 119-121, 2020) or in emergency support (Wagner in Front Robot AI 8, 2021; Baxter et al., in: Autonomous robots and agents, Springer, pp 9-16, 2007) and it is crucial to consider how robots can communicate with the humans in the space. Our work evaluates a parameter framework to allow multi-robot groups of x, y, theta robots to effectively communicate using expressive motion. While expressive motion has been extensively studied in single robots (Knight et al., in: 2016 IEEE international conference on intelligent robots and systems (IROS), IEEE, 2016; Bacula and LaViers in Int J Soc Robot, 1-16, 2020; Dragan et al., in: 2013 8th ACM/IEEE international conference on human-robot interaction (HRI), IEEE, pp 301-308, 2013; Kirby et al., in: The 18th IEEE international symposium on robot and human interactive communication, 2009, RO-MAN 2009, IEEE, pp 607-612, 2009), moving to multi-robots creates new challenges as the state space expands and becomes more complex. We evaluate a hierarchical framework of six parameters to generate multi-robot expressive motion consisting of: (1) relative direction, (2) coherence, (3) relative speed, (4) relative start time, (5) proximity, and (6) geometry. We conducted six independent online studies to explore each parameter, finding that four out of six of the parameters had significant impact on people's perception of the multi-robot group. Additional takeaways of our studies clarify what humans interpret as a robot group, when the group is perceived positively versus negatively, and the critical role of architectural floor plan in interpreting robot intent.
CI  - (c) The Author(s), under exclusive licence to Springer Nature B.V. 2022, Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
FAU - Bacula, A
AU  - Bacula A
AUID- ORCID: 0000-0001-8135-9899
AD  - Collaborative Robotics and Intelligent Systems Institute, Oregon State University, Corvallis, OR 97331 USA. GRID: grid.4391.f. ISNI: 0000 0001 2112 1969
FAU - Knight, H
AU  - Knight H
AD  - Collaborative Robotics and Intelligent Systems Institute, Oregon State University, Corvallis, OR 97331 USA. GRID: grid.4391.f. ISNI: 0000 0001 2112 1969
LA  - eng
PT  - Journal Article
DEP - 20221017
PL  - Netherlands
TA  - Int J Soc Robot
JT  - International journal of social robotics
JID - 101622429
PMC - PMC9576134
OTO - NOTNLM
OT  - Expressivity
OT  - Human-robot interaction
OT  - Multi-robot systems
OT  - Social robotics
EDAT- 2022/10/25 06:00
MHDA- 2022/10/25 06:01
PMCR- 2022/10/17
CRDT- 2022/10/24 04:30
PHST- 2022/10/05 00:00 [accepted]
PHST- 2022/10/25 06:00 [pubmed]
PHST- 2022/10/25 06:01 [medline]
PHST- 2022/10/24 04:30 [entrez]
PHST- 2022/10/17 00:00 [pmc-release]
AID - 936 [pii]
AID - 10.1007/s12369-022-00936-4 [doi]
PST - ppublish
SO  - Int J Soc Robot. 2022;14(9):1965-1993. doi: 10.1007/s12369-022-00936-4. Epub 2022 Oct 17.