Process & Approach
I (product designer) collaborated closely with a Unity developer, backend engineer, and project owner in a bi-weekly Agile workflow. This allowed for rapid iteration, tight feedback loops, and a hands-on approach to problem-solving throughout the development process.
The goal of the project was to develop a modular simulation of a hazardous work environment. Users would learn and test work protocols in VR, safely practicing procedures that could prove dangerous or even deadly in a real-world setting. This would in turn ensure engagement, knowledge retention, and assessment without risk.
Research & Insights:
When I joined the ARGO project, the VR training module had already been in use for about two years. While the user base was small, we had some valuable usage data and insights to build on.
The need for an update was driven primarily by two factors: a shift from PC-tethered VR systems to standalone headsets, and a broader update in how Unity handled VR development. Speaking with clients, I learned that they appreciated the VR tool, but that it was mostly used as a showcase. Typically, a dedicated, VR-proficient staff member operated the system, and regular employees never used it independently.
At the same time, I was experimenting with the new hand tracking capabilities of the Meta Quest 2. This prompted a key question: how do our users actually prefer to navigate in VR? My hypothesis was that most of our target users weren’t particularly tech-savvy and found the setup process intimidating.
To validate this, I conducted both market and user research. First, I explored how leading companies and labs were implementing hand tracking and mapping real-world gestures to virtual interactions. This also helped me become familiar with the technology and understand its limitations. Then, I spoke with both existing and new clients to better understand their expectations and comfort levels.
What I found was a broad range of tech familiarity among clients—from very tech-oriented to almost completely analog. However, a common theme emerged: if learning a new system wasn't absolutely necessary, they avoided it—especially in a work context. They also universally preferred simpler, lighter VR headsets with fewer setup steps.
Based on these insights, I proposed we create a test to reduce friction in using the VR system. I suspected that using controllers was a major barrier, since none of the users had gaming experience or familiarity with controller-based input. However, past research also suggested that users appreciated physical feedback and that tool-based interactions (like using a hammer or stick) translated well in VR.
To explore this further, I designed a short onboarding experience disguised as a game. It included key actions like moving, typing, opening/closing menus and doors, interacting with small objects (like dice), and using a virtual tool (a drill).
The tasks included in the demo were carefully selected to provide insights that would guide future feature development.
We first built and tested the demo using controllers. A month later, we reworked the same experience for hand tracking and tested it on the same group of users, allowing us to directly compare the two input methods.
From testing, I found that:
1. Users generally preferred hand tracking over controllers—except when it came to using virtual tools like the drill or moving around the space. In fact, users completed the demo roughly 50% faster with hand tracking.
2. The virtual keyboard proved to be a major friction point with hand tracking, consuming the most time during the experience.
3. Users enjoyed the feel of interacting with virtual objects—some even chose to repeat the demo just to play with elements like picking up and throwing items.
4. The key benefit of controllers was the tactile feeling of using objects (e.g., vibration feedback while using the drill).
5. One key discovery was that users didn’t understand the teleportation gesture for navigation. Instead, they preferred physically walking within the play area to reach different locations in the VR environment.
Design Decisions:
The research led us to redesign the system with hand tracking as the primary input method. We minimized the use of the virtual keyboard and focused on mimicking real-world interactions.
Design #1:
To support clear communication within the team, I created a simplified user journey map in Figma. We used team meetings to define the functionality of each step in the journey, which helped us prioritize features based on the project’s time constraints and set clear, achievable goals.
Given the application’s minimal UI, I skipped traditional wireframing and instead prepared rough building blocks for each specific step/function. As such, the UI is specific to the current state of the user (for example: user is in the process of working on the protocol, user is looking at the main menu, user has opened one information panel...).
The Unity developer built a simulated scenario based on the documentation our clients provided, and we worked closely together during the UI implementation phase to precisely define how graphical elements should behave in 3D space—including positioning, interactions, and animations.
I also designed new hand gestures for key actions like opening the main navigation menu and teleportation. These gestures were inspired by real-world behaviors—for instance, raising your hand as if checking the time would trigger the menu to appear on the user’s wrist. To streamline usability, the menu options would dynamically change based on the user’s location within the scene.
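To make the gesture idea concrete, here is a minimal sketch of how the wrist-check pose might be detected. It assumes the tracking SDK exposes a palm-normal vector and the headset's forward vector in world space; all names, vectors, and thresholds are illustrative, not the actual Meta SDK API.

```python
# Illustrative sketch only: vector names and thresholds are assumptions,
# not the real hand-tracking API used in the project.

def dot(a, b):
    """Dot product of two 3D vectors given as tuples."""
    return sum(x * y for x, y in zip(a, b))

def wrist_menu_triggered(palm_normal, head_forward, threshold=0.8):
    """Detect the 'checking the time' pose.

    When the palm faces the user, its normal points roughly opposite
    the headset's forward vector, so the dot product approaches -1.
    Both vectors are assumed to be unit-length and in world space.
    """
    return dot(palm_normal, head_forward) < -threshold
```

A per-frame check like this is cheap, but as we later found in testing, a raw threshold alone fires too easily on incidental hand movement.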
Special attention was given to feedback: sound and animation were used to reinforce key actions and make interactions feel more natural and engaging.
User validation:
After developing this improved version, we conducted another round of testing with new users on part of the final scenario (based on a real company protocol). Feedback was generally positive; however, testing revealed some additional hiccups.
One such issue was the need for a gesture to open context panels (UI elements displaying object info) from a distance—something previously handled by controllers. We also learned that the wrist-based menu gesture was too sensitive and often triggered accidentally.
Design #2:
In response to insights gathered during user testing, we implemented several important changes to improve usability and overall user comfort within the application:
1. Gesture Redesign for Menu Access: The original gesture for opening the main navigation menu—raising the wrist as if checking the time—was replaced with an open palm gesture. This change made the interaction more intuitive, resembling the action of holding a tablet. The updated gesture aligned well with the redesigned menu layout, which now resembles a digital interface that users can view and interact with as if it were resting in their hand.
2. Personal or Environmental UI Panels: In some scenarios, the menu now supports additional panels—such as object information, call assist, file access, and protocol references—that follow the user's hand but can also be freely placed in 3D space for convenience (previously, panels could only be placed in 3D space).
3. Scene Design and Navigation Adjustments: We discovered that some users experienced discomfort with the teleportation feature, including symptoms similar to motion or sea sickness. To address this, we shifted our design approach to create smaller, more compact scenes. This layout enables users to move naturally within the VR environment by walking short distances, eliminating the need for artificial movement in most scenarios. Additionally, the VR headset’s built-in safety features—such as guardian boundaries—help prevent collisions with real-world obstacles, making physical navigation both safe and comfortable.
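One common way to address accidental triggers like those we saw with the original wrist gesture is a dwell-time debounce: the pose must be held continuously before the action fires. The sketch below shows the idea; the class, timing value, and update loop are illustrative assumptions, not the project's actual implementation.

```python
class GestureDebouncer:
    """Fire a gesture only after the pose is held for `dwell` seconds.

    This guards against incidental hand movement briefly matching the
    pose. The 0.5 s default is an illustrative value, not a tested one.
    """

    def __init__(self, dwell=0.5):
        self.dwell = dwell
        self.held_since = None  # time the pose was first seen, or None
        self.fired = False      # ensures one trigger per continuous hold

    def update(self, pose_detected, now):
        """Call once per frame; returns True on the frame the gesture fires."""
        if not pose_detected:
            self.held_since = None
            self.fired = False
            return False
        if self.held_since is None:
            self.held_since = now
        if not self.fired and now - self.held_since >= self.dwell:
            self.fired = True
            return True
        return False
```

The same pattern works for any pose-based trigger (open palm, pinch, point), with the dwell time tuned per gesture.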
These changes collectively contributed to a more accessible, intuitive, and user-friendly VR experience, especially for users new to the technology or sensitive to virtual motion.