PhD Prelim Announcement: Improving Conversation Analysis Pipeline through Personal Situated Analytics in Extended Reality

May 23rd, 2023

Categories: Applications, Human Factors, MS / PhD Thesis, Software, User Groups, Visualization, VR, Augmented Reality, Human Computer Interaction (HCI), Data Science

PSA is an immersive MR/VR framework for experiencing and analyzing recorded conversations and events. We see two participants represented as virtual avatars engaged in a conversation.

About

Ph.D. Student: Ashwini G. Naik

Committee Members:
Dr. Nikita Soni
Dr. Debaleena Chattopadhyay
Dr. Steve Jones (External Member)
Dr. Robert Kenyon (Co-Advisor)
Dr. Andrew Johnson (Advisor and Chair)

Date and Time: May 23, 2023, 9-11 AM Chicago time
Location: ERF 2068

Zoom Link: https://uic.zoom.us/j/7898336493

Abstract:
Strategizing and sensemaking in immersive environments have predominantly been explored for training, learning, and recreational tasks. Embedding these concepts into the data analysis pipeline is only beginning to emerge and has seen limited research. In our work, we present Personal Situated Analytics (PSA), which lets individuals embed themselves in recorded events at multiple degrees of immersion along the Reality-Virtuality spectrum; we evaluate its benefits and study users' exploration patterns. With the increasing availability of sensors and capture devices, we can now actively record and leverage data from our daily activities to gain valuable insights and make informed decisions. At its core, PSA combines embodied cognition, situated analytics, and conversation analysis while exploiting the benefits of strategic immersion and sensemaking in immersive environments. To that end, we first developed a framework for embedding individuals in the environment where a conversation originally occurred. Second, we conducted a pilot user study (n=12, protocol #STUDY2023-0018) comparing user experiences while exploring recorded conversations in Mixed Reality on the HoloLens 2 and in Virtual Reality on the Quest 2. Our proposed framework encompasses several stages: tracking, data capture, data cleaning, data synchronization, prototype building, and deployment of the final product to end-user hardware. In the first part of the experiment, we used datasets of seated participants (protocol #2022-0354). In the future, we will record and analyze datasets with standing and non-stationary participants (protocol #STUDY2023-0509) in two settings that employ various collaboration profiles among the participants.
Drawing on our experience from the preliminary user study and the feedback we received from participants, we catalog lessons learned and insights that can drive advancements in embodied situated analytics and thereby enhance the conversation analysis pipeline.