Research / AR, VR & Haptic Experiences - Exploring wearable technology, spatial computing, and human-computer interaction

Role: Researcher & Developer
Timeline: 2021-2023
Throughout my graduate studies and early career, I explored the intersection of wearable technology, spatial computing, and human-computer interaction. My work ranges from research prototypes to published studies, focusing on how emerging technologies can enhance accessibility and create more immersive user experiences.
I collaborated with cross-functional teams including researchers, engineers, and domain experts to design, prototype, and evaluate novel interaction paradigms for VR and AR platforms.
My extended reality work centers on three key areas: accessibility through multimodal feedback systems, spatial interaction design for immersive environments, and rapid prototyping methodologies for emerging platforms. Each project below represents a different facet of this research journey.
Tools & Platforms: Unity, C#, Meta Quest, Haptics Studio, ARCore, Mixed Reality Toolkit
The following showcases key projects from my wearables and extended reality portfolio.
Virtual Reality experiences rely heavily on spatial audio for immersion and navigation. However, users who are deaf or hard of hearing (DHH) miss critical audio cues that indicate direction, distance, and environmental context. SoundHapticVR explores how head-mounted haptic feedback can provide an alternative sensory channel for spatial sound information.
The research demonstrated that head-based haptic feedback can effectively convey spatial audio information, with participants successfully identifying sound direction and urgency through haptic patterns alone. The findings contribute to making VR more accessible for the DHH community.
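As a rough illustration of the approach (not the actual SoundHapticVR implementation), the Unity C# sketch below pans a sound source's azimuth and distance across a ring of head-mounted vibrotactile actuators; the actuator count, layout, and the SetActuatorAmplitude driver hook are assumptions made for this example.

```csharp
using UnityEngine;

// Minimal sketch: map a sound source's direction relative to the head onto a
// ring of vibrotactile actuators worn around the head. Actuators closest to
// the source direction vibrate strongest; amplitude fades with distance.
public class HeadHapticPanner : MonoBehaviour
{
    public Transform head;          // listener's head (e.g., the VR camera)
    public Transform soundSource;   // the virtual sound emitter
    public int actuatorCount = 6;   // actuators spaced evenly around the head
    public float maxRange = 10f;    // distance at which vibration fades to zero

    void Update()
    {
        // Direction to the source in the head's local frame.
        Vector3 local = head.InverseTransformPoint(soundSource.position);
        float azimuthDeg = Mathf.Atan2(local.x, local.z) * Mathf.Rad2Deg; // 0 = straight ahead
        float distance = local.magnitude;
        float loudness = Mathf.Clamp01(1f - distance / maxRange);        // crude distance rolloff

        for (int i = 0; i < actuatorCount; i++)
        {
            float actuatorDeg = i * 360f / actuatorCount;                 // this actuator's angle
            float deltaRad = Mathf.DeltaAngle(azimuthDeg, actuatorDeg) * Mathf.Deg2Rad;
            float alignment = Mathf.Max(0f, Mathf.Cos(deltaRad));         // cosine falloff panning
            SetActuatorAmplitude(i, alignment * loudness);
        }
    }

    // Hypothetical driver hook; the real haptic device API would go here.
    void SetActuatorAmplitude(int index, float amplitude01)
    {
        Debug.Log($"Actuator {index}: amplitude {amplitude01:F2}");
    }
}
```

In practice the amplitude channel could also encode urgency, for example by pulsing faster for high-priority cues, which is the kind of pattern participants were asked to distinguish in the study.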
Location-based AR experiences require precise positioning and contextual awareness to create meaningful interactions with the physical world. This exploratory project investigated how geospatial APIs and visual positioning systems (VPS) can enable persistent, world-anchored AR content for educational and cultural heritage applications.
The project produced functional prototypes demonstrating how AR can surface historical and contextual information tied to specific physical locations, along with insights into the technical constraints and design considerations for building reliable geospatial AR experiences.
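To illustrate the basic proximity logic behind surfacing location-tied content (the actual prototypes relied on geospatial APIs and VPS anchoring), the sketch below checks the device's GPS fix against a list of points of interest using a haversine distance; the PointOfInterest data and the ShowContent hook are hypothetical.

```csharp
using UnityEngine;

// Illustrative sketch only: a plain "is the user near this site?" check that
// triggers location-tied AR content once the device is within a set radius.
public class HeritagePoiTrigger : MonoBehaviour
{
    [System.Serializable]
    public struct PointOfInterest
    {
        public string title;
        public double latitude;
        public double longitude;
        public float triggerRadiusMeters;
    }

    public PointOfInterest[] pointsOfInterest;

    // Great-circle (haversine) distance in meters between two lat/lon pairs.
    static double DistanceMeters(double lat1, double lon1, double lat2, double lon2)
    {
        const double R = 6371000.0;                      // mean Earth radius, meters
        const double degToRad = System.Math.PI / 180.0;
        double dLat = (lat2 - lat1) * degToRad;
        double dLon = (lon2 - lon1) * degToRad;
        double a = System.Math.Sin(dLat / 2) * System.Math.Sin(dLat / 2)
                 + System.Math.Cos(lat1 * degToRad) * System.Math.Cos(lat2 * degToRad)
                 * System.Math.Sin(dLon / 2) * System.Math.Sin(dLon / 2);
        return 2.0 * R * System.Math.Asin(System.Math.Sqrt(a));
    }

    // Called with the device's current GPS fix (e.g., from Unity's location service).
    public void OnLocationUpdate(double deviceLat, double deviceLon)
    {
        foreach (var poi in pointsOfInterest)
        {
            double d = DistanceMeters(deviceLat, deviceLon, poi.latitude, poi.longitude);
            if (d <= poi.triggerRadiusMeters)
                ShowContent(poi.title);                  // surface the overlay for this site
        }
    }

    void ShowContent(string title) => Debug.Log($"Showing AR content for: {title}");
}
```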
As part of my Spring 2023 Co-Op at Bose Corporation R&D Labs, I was tasked with understanding the magnitude of perceptual errors in audio externalization algorithms and the importance of personalized HRTFs (Head-Related Transfer Functions) for spatial audio rendering.
Tools: MATLAB, C#, Unity, Python, OptiTrack Motion Capture, Bose QC35, Meta Quest 2
As a Technical Intern Co-Op, I was responsible for proposing an experimental design, building the prototype, and conducting user studies. I also showcased the experiment design setup at Bose's Annual Internal Tech Event 'Bose Lab Expo' in 2023.
We designed a localization accuracy experiment to evaluate how well the center image of a sound source sits in virtual auditory 3D space, comparing generic HRTFs against personalized measurements. The study involved eight 'Critical Listeners' whose personalized HRTFs had been measured, with Bose QC35 headphones for audio playback and a Meta Quest 2 for the visual environment.
Our findings showed negligible elevation error overall—while participants occasionally perceived the sound slightly higher, the offset remained close to center. More importantly, personalized HRTFs consistently improved localization accuracy for the majority of subjects, validating the value of personalization in spatial audio rendering.
Note: Data analysis and detailed processes are not disclosed due to NDA; the full methodology can be shared upon request.
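For illustration only, and separate from the NDA'd analysis, the sketch below shows one generic way a trial's azimuth and elevation error could be computed from the target source direction and the direction a listener reports, both expressed in a head-centered frame.

```csharp
using UnityEngine;

// Generic illustration, not the study's analysis code: per-trial azimuth and
// elevation error between the rendered (target) direction and the listener's
// reported direction.
public static class LocalizationError
{
    // Decompose a direction into azimuth (left/right) and elevation (up/down), in degrees.
    public static void AzimuthElevation(Vector3 dir, out float azimuthDeg, out float elevationDeg)
    {
        dir = dir.normalized;
        azimuthDeg = Mathf.Atan2(dir.x, dir.z) * Mathf.Rad2Deg;
        elevationDeg = Mathf.Asin(Mathf.Clamp(dir.y, -1f, 1f)) * Mathf.Rad2Deg;
    }

    // Signed errors: positive elevation error means the response was higher than the target.
    public static void Errors(Vector3 target, Vector3 response,
                              out float azimuthErrorDeg, out float elevationErrorDeg)
    {
        AzimuthElevation(target, out float tAz, out float tEl);
        AzimuthElevation(response, out float rAz, out float rEl);
        azimuthErrorDeg = Mathf.DeltaAngle(tAz, rAz);   // wraps correctly at +/-180 degrees
        elevationErrorDeg = rEl - tEl;
    }
}
```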
I prototyped an experience to make users aware when eye tracking is in use. This included designing concepts for obtaining user consent to track eye data and for clearly indicating when tracking is active. The prototype was later used in an academic study, which resulted in a full paper published at an international conference.
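As a conceptual sketch of that consent-and-awareness flow (not the study's actual implementation), the snippet below records eye data only after explicit opt-in and keeps a visible indicator on while tracking is active; the UI objects and recording hooks are placeholders.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Conceptual sketch: eye data is recorded only after explicit opt-in, and a
// visible indicator stays on whenever tracking is active.
public class EyeTrackingConsent : MonoBehaviour
{
    public GameObject consentPanel;   // shown before any tracking begins
    public Image indicatorIcon;       // persistent "eye tracking is on" indicator

    void Start()
    {
        consentPanel.SetActive(true);   // ask first
        indicatorIcon.enabled = false;  // nothing is tracked yet
    }

    // Wired to the consent panel's "Agree" button.
    public void OnConsentGiven()
    {
        consentPanel.SetActive(false);
        indicatorIcon.enabled = true;   // make the active state unmistakable
        StartRecording();
    }

    // Wired to a "Stop tracking" control that is available at any time.
    public void OnTrackingStopped()
    {
        indicatorIcon.enabled = false;
        StopRecording();
    }

    void StartRecording() { /* hook into the eye-tracking SDK here */ }
    void StopRecording()  { /* stop and flush any recorded gaze data */ }
}
```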
My first project in Unity was a VR interface exploring the relationship between visual memory and saccadic eye movements. I built a simple Visual Memory Game in VR, integrated Pupil Labs' eye-tracking framework, and captured users' eye-movement data for analysis.
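A minimal sketch of the gaze-logging side of that prototype is shown below, assuming a generic gaze provider rather than the specific Pupil Labs API: each frame, a timestamped gaze direction is appended to a CSV file for later analysis.

```csharp
using System.IO;
using UnityEngine;

// Minimal sketch of per-frame gaze logging. GetGazeDirection() stands in for
// whatever the eye-tracking framework provides; the CSV format and file
// location are illustrative.
public class GazeLogger : MonoBehaviour
{
    StreamWriter writer;

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, "gaze_log.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("time_s,dir_x,dir_y,dir_z");
    }

    void Update()
    {
        // One row per frame: timestamp plus the current gaze direction.
        Vector3 gaze = GetGazeDirection();
        writer.WriteLine($"{Time.time:F4},{gaze.x:F4},{gaze.y:F4},{gaze.z:F4}");
    }

    void OnDestroy() => writer?.Dispose();

    // Placeholder: in the actual project this came from Pupil Labs' framework.
    Vector3 GetGazeDirection() => Camera.main.transform.forward;
}
```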