Designing an Intuitive VR Training Experience for Non-Tech Users in High-Risk Industries

February 2023
How do you teach employees a specific work protocol in a high-risk environment without exposing them to danger?
My Contributions
I led the product design for ARGO's VR training module, crafting intuitive interactions for a non-tech-savvy workforce in high-risk environments. ARGO is a digital twin platform used for storing and managing operational procedures, with an immersive VR training module at its core.

Art Rebel 9 offers ARGO: a digital twin-based solution designed for creating, storing, and managing complex information systems.
The platform consists of four interconnected components:
1. ARGO Web Editor – for managing and editing data such as models, protocols, documentation, and staff records.
2. ARGO MR Viewer – for spatially viewing instructions, blueprints, and work orders.
3. ARGO Mobile Viewer – a cost-effective alternative to the MR viewer.
4. ARGO VR – a training module for teaching staff operational protocols.

By this point, we had already been offering ARGO VR for over two years, and as part of a contract we wanted to update it to newer technology. I was assigned as the product designer for the entire ARGO platform, overseeing the user experience and design strategy across all modules. For the purposes of this case study, however, I will focus primarily on the work I contributed to the updated ARGO VR training module.

ARGO ecosystem
The Problem
ARGO VR was originally designed two years earlier as a practical training tool to help companies teach employees operational protocols in a safe and immersive environment. While initial feedback from early clients was generally positive, we soon noticed a recurring pattern: the VR module was primarily being used as a showcase piece rather than as a functional training solution.

This raised important questions—why wasn’t the tool being used as intended, and what barriers were preventing its adoption as a core part of employee training? I set out to investigate the underlying issues, aiming to understand user behavior, identify pain points, and find actionable ways to improve the system for real-world training use.
Process & Approach
I (product designer) collaborated closely with a Unity developer, backend engineer, and project owner in a bi-weekly Agile workflow. This allowed for rapid iteration, tight feedback loops, and a hands-on approach to problem-solving throughout the development process.

The goal of the project was to develop a modular simulation of a hazardous work environment. Users would learn and practice work protocols in VR that could prove dangerous or even deadly in a real-world setting. This would in turn ensure engagement, knowledge retention, and safe assessment.
Research & Insights:
When I joined the ARGO project, the VR training module had already been in use for about two years. While the user base was small, we had some valuable usage data and insights to build on.

The need for an update was driven primarily by two factors: a shift from PC-tethered VR systems to standalone headsets, and a broader update in how Unity handled VR development. Speaking with clients, I learned that they appreciated the VR tool, but that it was mostly used as a showcase. Typically, a dedicated, VR-proficient staff member operated the system, and regular employees never used it independently.

At the same time, I was experimenting with the new hand tracking capabilities of the Meta Quest 2. This prompted a key question: how do our users actually prefer to navigate in VR? My hypothesis was that most of our target users weren’t particularly tech-savvy and found the setup process intimidating.

To validate this, I conducted both market and user research. First, I explored how leading companies and labs were implementing hand tracking and mapping real-world gestures to virtual interactions. This also helped me become familiar with the technology and understand its limitations. Then, I spoke with both existing and new clients to better understand their expectations and comfort levels.

What I found was a broad range of tech familiarity among clients—from very tech-oriented to almost completely analog. However, a common theme emerged: if learning a new system wasn't absolutely necessary, they avoided it—especially in a work context. They also universally preferred simpler, lighter VR headsets with fewer setup steps.

Based on these insights, I proposed a test to explore how we could reduce friction in using the VR system. I suspected that using controllers was a major barrier, since none of the users had gaming experience or familiarity with controller-based input. However, past research also suggested that users appreciated physical feedback and that tool-based interactions (like using a hammer or stick) translated well to VR.
One of the one-page User Tests
To explore this further, I designed a short onboarding experience disguised as a game. It included key actions like moving, typing, opening/closing menus and doors, interacting with small objects (like dice), and using a virtual tool (a drill).
The tasks included in the demo were carefully selected to provide insights that would guide future feature development.

We first built and tested the demo using controllers. A month later, we reworked the same experience for hand tracking and tested it on the same group of users, allowing us to directly compare the two input methods.

From testing, I found that:
1. Users generally preferred hand tracking over controllers—except when it came to using virtual tools like the drill or moving around the space. In fact, users completed the demo roughly 50% faster with hand tracking.
2. The virtual keyboard proved to be a major friction point with hand tracking, consuming the most time during the experience.
3. Users enjoyed the feel of interacting with virtual objects—some even chose to repeat the demo just to play with elements like picking up and throwing items.
4. The key benefit of controllers was the tactile feeling of using objects (vibration feedback while using the drill).
5. One key discovery was that users didn’t understand the teleportation gesture for navigation. Instead, they preferred physically walking within the play area to reach different locations in the VR environment.
ARGO VR demo with controllers
ARGO VR demo with hand tracking
Design Decisions:
The research led us to redesign the system with hand tracking as the primary navigation method. We minimized the use of the virtual keyboard and focused on mimicking real-world interactions.
Design #1:
To support clear communication within the team, I created a simplified user journey map in Figma. We used team meetings to define the functionality of each step in the journey, which helped us prioritize features based on the project’s time constraints and set clear, achievable goals.

Given the application’s minimal UI, I skipped traditional wireframing and instead prepared rough building blocks for each specific step/function. As such, the UI is specific to the current state of the user (for example: user is in the process of working on the protocol, user is looking at the main menu, user has opened one information panel...).

The Unity developer built a simulated scenario based on the documentation our clients provided, and we worked closely together during the UI implementation phase to precisely define how graphical elements should behave in 3D space—including positioning, interactions, and animations.

I also designed new hand gestures for key actions like opening the main navigation menu and teleportation. These gestures were inspired by real-world behaviors—for instance, raising your hand as if checking the time would trigger the menu to appear on the user’s wrist. To streamline usability, the menu options would dynamically change based on the user’s location within the scene.
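The location-aware menu described above boils down to a lookup from the user’s current zone to the relevant menu entries. A minimal sketch of that idea follows; the zone names and menu entries here are hypothetical, and the production logic was implemented in Unity/C# and driven by scene data:

```python
# Hypothetical zone names and menu entries, for illustration only;
# the real menu content comes from the ARGO scene configuration.
MENU_BY_ZONE = {
    "control_room": ["Protocol steps", "Blueprints", "Call assist"],
    "pump_station": ["Protocol steps", "Tool info", "Safety checklist"],
}
DEFAULT_MENU = ["Protocol steps", "Call assist"]

def menu_options(zone):
    """Return the menu entries for the zone the user is standing in,
    falling back to a small default set elsewhere in the scene."""
    return MENU_BY_ZONE.get(zone, DEFAULT_MENU)
```

This keeps the hand menu short: users only see actions relevant to where they are standing, rather than the full option list at all times.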

Special attention was given to feedback, using sound and animation to reinforce key actions and make interactions feel more natural and engaging.
ARGO VR UI elements with the updated hand menu gesture
User validation:
After developing this improved version, we conducted another round of testing with new users, using part of the final scenario (based on a real company protocol). Feedback was generally positive; however, testing revealed some additional hiccups.

One such issue was the need for a gesture to open context panels (UI elements displaying object info) from a distance—something previously handled by controllers. We also learned that the wrist-based menu gesture was too sensitive and often triggered accidentally.
Design #2:
In response to insights gathered during user testing, we implemented several important changes to improve usability and overall user comfort within the application:
1. Gesture Redesign for Menu Access: The original gesture for opening the main navigation menu—raising the wrist as if checking the time—was replaced with an open palm gesture. This change made the interaction more intuitive, resembling the action of holding a tablet. The updated gesture aligned well with the redesigned menu layout, which now resembles a digital interface that users can view and interact with as if it were resting in their hand.
2. Personal or Environmental UI Panels: In some scenarios, the menu now supports additional panels—such as object information, call assist, file access, and protocol references—that follow the user’s hand but can also be freely placed in 3D space for convenience (previously, only placement in 3D space was supported).
3. Scene Design and Navigation Adjustments: We discovered that some users experienced discomfort with the teleportation feature, including symptoms similar to motion or sea sickness. To address this, we shifted our design approach to create smaller, more compact scenes. This layout enables users to move naturally within the VR environment by walking short distances, eliminating the need for artificial movement in most scenarios. Additionally, the VR headset’s built-in safety features—such as guardian boundaries—help prevent collisions with real-world obstacles, making physical navigation both safe and comfortable.
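The open palm trigger from point 1 can be reduced to a simple geometric check: show the menu only when the palm normal points toward the headset and the hand is open. The sketch below is an illustrative Python version (the actual implementation was in Unity/C#, and the threshold values are assumptions, not the shipped parameters):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def is_open_palm_toward_head(palm_normal, palm_to_head,
                             fingers_extended, threshold=0.7):
    """Decide whether the hand-menu gesture is active.

    palm_normal:      unit vector pointing out of the palm (tracking data)
    palm_to_head:     vector from the palm toward the headset position
    fingers_extended: fraction of fingers the tracker reports as extended
    threshold:        cosine of the widest allowed angle (~45 degrees here)
    """
    facing = dot(normalize(palm_normal), normalize(palm_to_head)) > threshold
    open_hand = fingers_extended >= 0.8  # at least 4 of 5 fingers extended
    return facing and open_hand
```

Requiring both conditions is what reduced accidental triggers compared with the original wrist-raise gesture: a palm that merely passes in front of the face, or a closed fist facing the headset, no longer opens the menu.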

These changes collectively contributed to a more accessible, intuitive, and user-friendly VR experience, especially for users new to the technology or sensitive to virtual motion.
ARGO VR demo with controllers
Result
VR training sessions are now 4x faster than traditional field training.
As a result of our development efforts, the ARGO VR module has evolved from being primarily a showcase tool to a practical and impactful solution for onboarding and training new employees. It now enables users to learn and practice proper work protocols in a safe, immersive environment—reducing real-world risks and the need for costly physical setups.

Our partners have reported that training in the VR environment is up to four times faster compared to traditional field training methods. This significant increase in efficiency has made the module not only more useful but also highly valuable to organizations looking to streamline their training processes.

Consequently, ARGO VR has become a fully integrated, validated, and in-demand component of the ARGO platform, with growing interest from both existing and prospective clients.
Learnings & next steps
What I Learned:
Working on this project taught me how to design spatial experiences specifically tailored for hand tracking. I gained a deeper understanding of how to create intuitive interactions in immersive environments, particularly for users with little to no prior experience in VR. I also learned about the advantages and limitations of using specialized controllers in applications designed for the general public—where ease of use, familiarity, and low entry barriers are critical.

Testing the application in real-world, often unpredictable client environments—where I didn't always have full control over what users would see or how they would behave—helped me become more comfortable in spontaneous situations. These experiences pushed me to think on my feet, adapt quickly, and trust my design instincts when immediate problem-solving was required.
Future Vision:
Looking ahead, we plan to expand the ARGO VR module by introducing more modular design features. The goal is to empower clients to independently create and customize their own training protocols with minimal input from our team. This will involve designing a library of modular environments, tools, and interactive assets—each built with consistent feedback systems, including animations and sound cues.

Additionally, an integrated Learning Management System (LMS) is currently in development. This will more tightly connect the ARGO VR module with the broader ARGO platform, enabling managers to assign VR-based lessons, track progress, and evaluate employee performance within a unified system.

We are also exploring the use of biometric data—such as heart rate, gaze tracking, and engagement metrics—to gain deeper insights into user behavior and improve the learning process in VR.

There was also an attempt to merge hand tracking and controller-based navigation into a single, flexible system. However, due to limitations in the current Meta Quest software, we were unable to implement this approach. It remains an area of interest for future development, pending platform advancements.