
Designing a Modular, Gamified Training System for Complex Decision-Making
Summary
WIT FORCE is an interactive training system designed to reinforce and extend the U.S. Army’s Weapons Intelligence Course.
I designed an interaction-driven learning system under real-world constraints, expanding an existing video-based platform into a modular, gamified experience.
The challenge required balancing realism, usability, cost, and technical limitations while translating complex, high-stakes training into something engaging, scalable, and effective.
This project demonstrates my approach to designing interaction-driven systems using gamification, feedback loops, and behavioral mechanics to guide user decision-making in complex environments.
Context
The Weapons Intelligence School trains soldiers to investigate incidents, analyze evidence, and identify threats in operational environments—essentially “CSI on the battlefield.”
The existing curriculum was dense and multi-disciplinary, heavily instructor-led, and difficult to practice outside structured classroom training.
The project goal was to improve engagement and retention, enable self-directed practice, reduce reliance on in-person instruction, and introduce scenario-based learning.
Core Challenge
The training required learners to conduct interviews, analyze physical evidence, and reconstruct complex systems. They also had to operate specialized tools and make decisions under pressure.
All of this had to remain accurate to real-world procedures, fit within budget and timeline constraints, and support users with varying technical familiarity.
Constraints & Environment
This project operated under layered constraints:
- Technical: Required to work within an existing video-based platform
- Compliance: Deployment within restricted military environments
- User variability: Wide range of computer literacy
- Production: Coordination with external film production timelines
- Team size: Small, cross-functional team
- Delivery: Fixed timeline aligned to curriculum cycles
These constraints shaped nearly every design decision.
My Role
I led the design of the system end-to-end, including:
- interaction and system design
- learning and decision frameworks
- module architecture and flow
- documentation and implementation guidance
- collaboration with subject matter experts and stakeholders
My role focused on translating complex domain knowledge into structured, interactive systems that could be implemented within real-world constraints.
I operated at the system level—defining how learning, interaction, and feedback worked together across the entire experience.
Approach
To address the complexity of the training, I focused on three core design strategies:
1. Modular System Design
I designed a system composed of reusable modules that could function in two modes:
- Narrative-driven scenarios (Story Mode)
- Standalone training exercises (Drill Mode)
Each module could operate independently or be embedded within a larger experience, enabling scalable content creation and flexible learning paths.
The system was designed with modular, lightweight interactions that could translate effectively to mobile and constrained environments if needed.
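The two-mode idea above can be sketched as a shared module contract. This is a hypothetical illustration, not the actual implementation: the interface names, the result shape, and the scoring values are all assumptions.

```typescript
// Hypothetical sketch: a training module that runs standalone (Drill Mode)
// or embedded in a larger scenario (Story Mode). All names are illustrative.
type ModuleMode = "story" | "drill";

interface ModuleResult {
  moduleId: string;
  passed: boolean;
  score: number;
}

interface TrainingModule {
  id: string;
  run(mode: ModuleMode): ModuleResult;
}

// A simple identification module usable in both modes.
const identificationModule: TrainingModule = {
  id: "evidence-identification",
  run(mode) {
    // In Story Mode the module would inherit scenario context; in Drill Mode
    // it seeds its own practice content. Both return the same result shape,
    // which is what makes modules composable into larger experiences.
    const score = mode === "drill" ? 80 : 95; // placeholder scoring
    return { moduleId: this.id, passed: score >= 70, score };
  },
};

// Story Mode: embed the module in a scenario sequence.
const storyResults = [identificationModule].map((m) => m.run("story"));
// Drill Mode: run the same module on its own.
const drillResult = identificationModule.run("drill");
```

Because every module returns the same result shape regardless of mode, new content can be dropped into either learning path without special-casing.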
2. Interaction-Driven Learning
Instead of passive content, the system centered on:
- decision-making
- cause-and-effect feedback
- iterative learning through failure
Key design principles:
- immediate feedback within interactions
- delayed feedback through after-action review
- safe environments for failure and retry
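The two feedback channels above can be modeled as one system with an immediate path and a deferred log. This is a minimal sketch under assumed names; the real platform's feedback mechanics are not shown here.

```typescript
// Illustrative sketch of the two feedback channels: immediate feedback
// fires inside the interaction, while an after-action log accumulates
// events for delayed review. Class and field names are assumptions.
interface FeedbackEvent {
  action: string;
  correct: boolean;
  note: string;
}

class FeedbackSystem {
  private log: FeedbackEvent[] = [];

  // Immediate channel: return a cue the UI can surface right away.
  recordAction(action: string, correct: boolean): string {
    const note = correct ? "Correct procedure" : "Review this step";
    this.log.push({ action, correct, note });
    return note;
  }

  // Delayed channel: surface only the missteps for after-action review.
  afterActionReview(): FeedbackEvent[] {
    return this.log.filter((e) => !e.correct);
  }
}

const feedback = new FeedbackSystem();
feedback.recordAction("photographed evidence before moving it", true);
feedback.recordAction("handled component without gloves", false);
const review = feedback.afterActionReview();
// review lists only the steps the learner should revisit.
```

Separating the two channels lets the same event stream drive both in-the-moment cues and the retrospective review, which supports safe failure and retry.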
3. Selective Fidelity & Abstraction
A critical part of the design was determining what to simulate versus what to simplify.
Examples:
- Complex tools were abstracted into usable interaction models
- Low-impact tasks were embedded into narrative instead of fully simulated
- High-value behaviors were developed into interactive modules
This ensured:
- reduced complexity
- faster development
- focus on meaningful learning outcomes
Key Design Decisions
Choosing Video + Interaction over Full 3D
We deliberately avoided building a fully 3D experience.
Reasons:
- high cost and complexity
- risk of unrealistic avatars breaking immersion
- mismatch with user familiarity
Instead, we:
- leveraged high-quality filmed video
- layered interactive systems on top
Result: A more believable, accessible, and cost-effective experience.
Designing for Mixed User Experience Levels
Users ranged from non-technical to experienced.
Solutions:
- simple click-based interactions as default
- optional advanced controls (e.g., 3D navigation)
- guided flows to reduce friction
Data-Driven Architecture
We implemented a data-driven system that allowed:
- rapid content updates
- modular expansion
- reduced reliance on engineering
This enabled scalable development and long-term maintainability.
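One way to picture the data-driven approach: scenario content lives in plain data that a generic runner interprets, so authors add content without code changes. The schema below is hypothetical, chosen only to illustrate the idea.

```typescript
// Sketch of data-driven content: a scenario is pure data, and a generic
// runner scores any scenario matching the schema. Schema is an assumption.
interface QuestionData {
  prompt: string;
  options: string[];
  answerIndex: number;
}

interface ScenarioData {
  title: string;
  questions: QuestionData[];
}

// Adding a module means authoring data, not engineering work.
const scenario: ScenarioData = {
  title: "Post-blast site survey",
  questions: [
    {
      prompt: "What is the first priority on arrival?",
      options: ["Collect fragments", "Secure the scene", "Interview witnesses"],
      answerIndex: 1,
    },
  ],
};

// Generic runner: scores learner answers against any conforming scenario.
function scoreScenario(data: ScenarioData, answers: number[]): number {
  const correct = data.questions.filter(
    (q, i) => answers[i] === q.answerIndex
  ).length;
  return Math.round((correct / data.questions.length) * 100);
}
```

Because the runner never changes, subject matter experts can expand the curriculum by editing data files, which is what reduces reliance on engineering.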
Prioritizing Learning Impact
Not all features were built to the same depth.
Examples:
- A specialized camera system was reduced to key learning moments rather than fully simulated
- Reconstruction tasks were simplified into structured workflows rather than open-ended systems
Focus: maximize learning impact while minimizing unnecessary complexity.
Interaction Design & Core Loop
Each training scenario followed a structured loop:
- Narrative setup (video)
- Contextual decision or interaction
- Investigation or task-based modules
- Evidence processing and analysis
- Outcome + feedback
- After-action review
This loop:
- reinforced learning through repetition
- connected abstract concepts to real scenarios
- created a sense of progression and consequence
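The loop above can be expressed as a simple linear state machine. This is a sketch only; stage names mirror the list, and the strictly linear transition order is an assumption.

```typescript
// Minimal state-machine sketch of the scenario loop. Stage names follow
// the core loop described in the text; linear ordering is an assumption.
const stages = [
  "narrative",        // narrative setup (video)
  "decision",         // contextual decision or interaction
  "investigation",    // investigation or task-based modules
  "analysis",         // evidence processing and analysis
  "outcome",          // outcome + feedback
  "afterActionReview" // after-action review
] as const;

type Stage = (typeof stages)[number];

// Advance to the next stage, or null when the loop is complete.
function nextStage(current: Stage): Stage | null {
  const i = stages.indexOf(current);
  return i < stages.length - 1 ? stages[i + 1] : null;
}
```

Modeling the loop explicitly makes repetition cheap: replaying a scenario is just restarting the machine at the first stage.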
Example Systems
Interview System (Behavioral Interaction)
- branching dialogue
- stress-based failure system
- visible emotional feedback
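The interview mechanics can be sketched as dialogue nodes whose choices branch and adjust a stress value; crossing a threshold ends the interview as a recoverable failure. All names, values, and the threshold mechanic's exact form are assumptions.

```typescript
// Hypothetical sketch of branching dialogue with a stress-based failure
// condition. Node IDs, stress values, and the limit are illustrative.
interface DialogueChoice {
  text: string;
  stressDelta: number;        // how much this choice raises or lowers stress
  nextNodeId: string | null;  // null terminates the branch
}

interface DialogueNode {
  id: string;
  choices: DialogueChoice[];
}

const STRESS_LIMIT = 100;

// Walk the dialogue graph following the learner's picks; fail if stress
// ever crosses the limit, otherwise finish the interview.
function runInterview(
  nodes: Map<string, DialogueNode>,
  startId: string,
  picks: number[]
): { completed: boolean; stress: number } {
  let stress = 0;
  let nodeId: string | null = startId;
  for (const pick of picks) {
    if (nodeId === null) break;
    const node = nodes.get(nodeId)!;
    const choice = node.choices[pick];
    stress += choice.stressDelta;
    if (stress >= STRESS_LIMIT) return { completed: false, stress };
    nodeId = choice.nextNodeId;
  }
  return { completed: true, stress };
}

const nodes = new Map<string, DialogueNode>([
  ["start", { id: "start", choices: [
    { text: "Accusatory opening", stressDelta: 60, nextNodeId: "followup" },
    { text: "Open-ended question", stressDelta: 10, nextNodeId: "followup" },
  ]}],
  ["followup", { id: "followup", choices: [
    { text: "Press harder", stressDelta: 50, nextNodeId: null },
    { text: "De-escalate", stressDelta: -10, nextNodeId: null },
  ]}],
]);
```

Keeping failure as an explicit, inspectable outcome is what lets the after-action review show the learner which choices drove the interviewee past the threshold.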
Reconstruction System (Analytical Thinking)
- identification and classification
- object combination
- structured schematic selection
Search & Discovery (Spatial Awareness)
- environmental scanning
- tool selection
- evidence collection
Recognition & Recall (Pattern Matching)
- rapid identification tasks
- immediate validation feedback
Each system targeted a different cognitive skill.
Tradeoffs & Design Tensions
Key tensions included:
- realism vs usability
- simulation vs abstraction
- engagement vs accuracy
- scope vs delivery timeline
Examples:
- limited branching to control complexity
- hybrid interaction models to support varied users
- selective use of 3D only where spatial interaction mattered
While full narrative branching would have increased realism, I constrained branching to key decision points to maintain scope and ensure learners could quickly iterate and learn from failure.
Outcomes
The final system enabled scalable, reusable training modules and introduced self-directed learning outside of formal instruction.
By combining immediate feedback with after-action review, learners could identify gaps, iterate quickly, and build confidence in decision-making.
The platform supported both structured curriculum delivery and independent practice—expanding how the program could be taught and experienced.
Hundreds of soldiers have trained on the system, in both independent study and classroom settings.
Why This Approach Worked
- modular design allowed scalable content
- decision-driven interactions improved engagement
- feedback loops reinforced learning