Michelle Bebber

Assistant Professor, CAS, Anthropology

Michelle's project, Visualizing the Past via Bodies in Motion: Reconstructing Paleolithic Art Through Immersive Technology, reconstructs the experience of creating Paleolithic cave art using immersive technology, motion capture, and sensory-rich simulations. By combining archaeology, biomechanics, digital media, and the arts, it enables participants to engage physically with the spatial and gestural realities of ancient artistic practices. Using Kent State’s Blank_Lab and XR_Collaboratory, the project advances research into human creativity and evolution, generates new insights into the origins of aesthetic expression, and connects modern audiences with ancient human experiences.

Deepshikha Bhati

Lecturer, Stark Campus, CAS, Computer Science

Deepshikha's project, Interpretable Generative AI for Education and Creative Industries, explores how explainable AI (XAI) can enhance transparency, trust, and ethical accountability in generative AI systems used across creative and educational fields. By integrating Layer-wise Relevance Propagation (LRP) into image generation pipelines like Stable Diffusion, she aims to demystify the “black box” of AI. Through interactive, web-based tools, users—including educators, students, and designers—will be able to visualize how input prompts shape AI outputs. 

Kayon Hall

Assistant Professor, EHHS, Higher Education Administration and Student Affairs

Kayon's interdisciplinary, multimedia project reimagines Black immigrant life beyond narratives of spectacle and survival. Drawing on theories of quiet and wake work (Quashie, Sharpe), it centers joy, rest, and interiority through a sensory archive built with photography, sound design, and community storytelling. Using the DI Hub’s Blank_Lab and immersive technologies, the project will create a traveling art exhibit and collaborative archive that honors the lived experiences of Black immigrants and challenges extractive, trauma-centered representations of Blackness.

Raiful Hasan

Assistant Professor, CAS, Computer Science

Raiful Hasan and Hadi Rahmati's collaborative project aims to develop a reciprocal, adaptive external Human–Machine Interface (eHMI) for autonomous vehicles, tailored to the diverse needs of pedestrians, including those who are distracted, disabled, or sensory-impaired. Drawing on Roman Jakobson’s communication theory, the system enables two-way, multi-modal interactions—responding to pedestrian feedback and adapting cues based on individual context. The team plans to simulate pedestrian–AV encounters using immersive environments while leveraging volumetric capture and virtual vehicle models to evaluate trust, inclusivity, and adaptability. The ultimate goal is to create an equitable, human-centered eHMI that enhances urban mobility and fosters social trust in autonomous systems. 

Hadi Rahmati

Assistant Professor, CCI, Visual Communication Design

Hadi is collaborating with Raiful Hasan on the reciprocal, adaptive external Human–Machine Interface (eHMI) project for autonomous vehicles described above.

David Silva

Assistant Professor, CCI, Communication Studies & EMAT

David's project aims to transform Kent State’s DI Hub into a regional leader in scalable AR/VR training solutions by standardizing processes for developing immersive training materials across industries. While the university already possesses advanced 3D scanning and modeling technologies, current AR/VR initiatives remain isolated and industry-specific. By leveraging the XR_Collaboratory, DepthKit equipment, and faculty expertise in data design and user experience, this project will create a replicable pipeline for building cost-effective, high-impact training tools. A pilot collaboration with Crystal Diagnostics will demonstrate how immersive simulations can reduce training time and eliminate costly logistical barriers for biotech start-ups, while also enhancing workforce readiness for students. 
